Go-Lab Deliverable D8.3 First trial report
Abstract
This deliverable presents the results of a first set of evaluations in Go-Lab. The deliverable follows the research questions that were presented in Go-Lab deliverable D8.1. These research questions were divided into three clusters: questions aimed at students, teachers, and organisations. In each of these clusters, the questions focus on a specific Go-Lab intervention (e.g., ILSs or elements of ILSs, Golabz, and/or the Go-Lab authoring facilities) and measure different types of outcomes (mainly knowledge and inquiry skills). The three clusters of research questions are presented in separate parts of this deliverable.
The studies that evaluated students covered various tools from the Go-Lab set of tools, different configurations of ILSs, different age groups, different labs and domains, and different assessment methods. All studies were conducted "in vivo", that is, in real classes under realistic circumstances. This sometimes posed specific challenges, such as unreliable internet connections, and meant that research conditions, in terms of time allowed for the study or number of subjects, were not always optimal. We should also consider that in all of the studies these were most probably the students' (and teachers') first encounters with inquiry learning. Despite this, some general conclusions can be drawn.
First, in all of the studies in which knowledge was measured (and where no internet issues appeared), we have seen a significant increase in scores on knowledge tests. There have been no comparisons with other, more traditional, approaches (this will be done in Y4 of the project), but in any case, offering online labs enables students to learn about the domain. For inquiry skills, such an increase was not always measured, which can be explained by the fact that these skills need more time and prolonged training to develop properly.
For a specific set of tools we could find direct effects on students' acquisition of knowledge and inquiry skills. The conclusion tool, the hypothesis scratchpad, and the experiment design tool all showed specific effects in some of the studies. In some cases a comparison was made to a condition in which the tool was not offered; in other cases a comparison was made between a fully specified tool and a tool that was rather "empty" (such as a hypothesis scratchpad without pre-defined terms). Studies on the hypothesis scratchpad showed that offering pre-defined concepts in the tool was most probably beneficial for learning compared to letting students configure these terms themselves. In the study on the concept map, no differences were found between offering and not offering the concept map, but in this case the concept map did not have any pre-defined terms in its pull-down menu. If we extend the results of the studies with the hypothesis scratchpad to the concept map, we might expect better effects when such terms are offered in the concept map's pull-down menu. So, a second conclusion might be that tools often support students, but that they might need to be filled with domain terms in order to create an effect.
A third conclusion might be that there is no "one size fits all" solution. Several of our studies (especially the ones on the experiment design tool (EDT)) show that tools are particularly effective (differentially for knowledge and inquiry skills) for students with lower prior knowledge or for younger students. The studies with the EDT also show that in these cases there is an interaction with the difficulty of the domain involved: the effectiveness of a tool for the students who need it may become more pronounced as the domain becomes more difficult.
Finally, the specific configuration of a tool might also matter. The studies with the data viewer, for example, showed that when the data viewer automatically incorporated data, students' inquiry skills improved, whereas students who imported the data themselves gained better conceptual knowledge.
When deciding which tools to include in an ILS and how these tools should be configured, it is important to consider students' age, level of education, and prior knowledge, as well as the difficulty of the domain and whether the goal is for students to gain conceptual knowledge, to acquire inquiry skills, or both.
The (Phase B) teacher evaluation was based on the analysis of pre- and post-data from 130 teachers. A close look at these teachers' teaching and technical skills reveals that our sample was composed of advanced teachers who were not only very interested in the use of online laboratories but also had quite well-developed pedagogical and technological skills. Their background knowledge, in combination with their interests and the support mechanisms offered by Go-Lab, had an impact on these teachers' knowledge and motivation. What is particularly interesting is that although the majority of these teachers initially intended to mostly use ready-made ILSs and online laboratories, by the end of Pilot Phase B most of them had started creating their own ILSs.
When it comes to the evaluation of organisations, the analysis of interviews with teachers, headmasters, and policy makers reveals that the applications and impact of Go-Lab extend to multiple levels. Within schools, STEM teachers are teaming up with colleagues from other disciplines and using Go-Lab to develop interdisciplinary activities (e.g., combining STEM topics with language learning, school collaboration, and special needs education). Headmasters are beginning to realise how the use of online laboratories can contribute to both their teachers' and students' development in an easy and cost-effective way. At the same time, policy makers' understanding of the use of online laboratories is also strengthened, which opens up possibilities for more support for Go-Lab implementations.