The messiness of learning about physics

As a teacher, I am very interested in the process of learning. Schools can sometimes present the process as a straightforward, linear transition from being a novice to being an expert. Representations of students’ expected progress tend to show learning growing as straight lines without any steps backwards. In this blog post, I would like to present a different model of learning.

Figure: Example of a basic flight path – a common linear representation of student progression.

Learning is a process of change. Over time, we hope, physics students become less likely to use alternative ideas and more likely to use scientific concepts. For example, a student might move from the belief that an object in motion must experience a net force, even when it is moving in a straight line at constant speed, to an acceptance of Newton’s first law. This change can take place over different timescales. In my experience, on rare but powerful occasions, learning can happen over short timescales: so-called ‘Aha!’ moments. I believe I experienced an ‘Aha!’ moment in school when I realised that seemingly very different cases of simple harmonic motion (a pendulum, a mass on a spring, a ball oscillating in a bowl) all arose from the same relationship between force and motion. Typically, however, learning happens more gradually and can take weeks, months, or years to become stable.
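For readers who want the physics behind that moment, the common thread can be written down in a few lines. This is a standard textbook sketch rather than anything from the original post: the symbols (m, k, g, L, R) are the usual ones, k_eff is my own shorthand, and small-angle or small-displacement approximations are assumed throughout.

```latex
% Each system exerts a restoring force proportional to displacement,
% so each obeys the same equation of motion and oscillates sinusoidally.
%
% Mass on a spring (spring constant k):        F = -kx               =>  k_eff = k
% Simple pendulum (length L, small angle):     F \approx -(mg/L)\,x  =>  k_eff = mg/L
% Ball in a bowl (radius R, small displacement,
% rolling inertia ignored):                    F \approx -(mg/R)\,x  =>  k_eff = mg/R

\[
  m\,\ddot{x} = -k_{\mathrm{eff}}\,x
  \quad\Longrightarrow\quad
  x(t) = A\cos(\omega t + \phi),
  \qquad
  \omega = \sqrt{k_{\mathrm{eff}}/m}
\]
```

Seen this way, the pendulum, the spring and the bowl differ only in what plays the role of the effective spring constant; the motion itself follows from one relationship between force and displacement.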

When I started my doctorate, I was interested in the fine-grained details of how learners move towards expertise in physics. My supervisor, Keith Taber, introduced me to the microgenetic approach: a research strategy in which a changing phenomenon, for example a student’s understanding of force, is sampled at a high rate compared to its rate of change. The assumption behind the microgenetic approach is that, if the sampling rate is too low, the change may not be validly represented. In physics education research, many studies of learning have, for sensible reasons, used a small number of widely spaced tests – for example, a pre-test and a post-test several months apart. Such sampling can be useful, for example when evaluating a novel intervention, but it can fail to fully represent the process of change.
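To make the sampling argument concrete, here is a small illustrative simulation (my own sketch, not part of the study) in which a hypothetical learner’s probability of activating the scientific conception fluctuates from week to week. All numbers and names (true_trajectory, probe) are invented for illustration.

```python
import random

random.seed(0)

# Hypothetical weekly trajectory: the probability that the learner activates the
# scientific conception in a given session. Deliberately non-monotonic:
# progress, regression, and gradual stabilisation.
true_trajectory = [0.2, 0.3, 0.5, 0.4, 0.7, 0.5, 0.6, 0.8, 0.6, 0.7, 0.8,
                   0.7, 0.9, 0.8, 0.9, 0.9, 0.85, 0.9, 0.95, 0.9, 0.95, 0.95]

def probe(p, n_questions=5):
    """Simulate one interview: the fraction of questions answered with the scientific idea."""
    return sum(random.random() < p for _ in range(n_questions)) / n_questions

# Pre-test / post-test design: only two widely spaced samples.
pre, post = probe(true_trajectory[0]), probe(true_trajectory[-1])
print(f"pre = {pre:.2f}, post = {post:.2f}  -> looks like a single, clean gain")

# Microgenetic design: sample every session, at a high rate relative to the change.
weekly = [probe(p) for p in true_trajectory]
print("weekly:", " ".join(f"{w:.2f}" for w in weekly))
# The weekly series makes the regressions and week-to-week variability visible.
```

Sampled only twice, the trajectory looks like a single clean improvement; sampled weekly, the regressions and variability that the microgenetic approach is designed to capture become visible.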

I sampled my participants’ understanding over 22 sessions, at roughly weekly intervals over a period of six months. This ensured a detailed representation of learning in my research and captured movement towards stability. I interviewed five students, aged 16 and 17, about their understanding of forces using a range of probes, repeating some to assess change. These included asking students to explain the behaviour of pieces of apparatus (eg explain the motion of this mass on a spring) and conceptual questions about force (eg an object is dropped from a plane moving at constant velocity – what shape will its trajectory take?). Recordings of the interviews were analysed to represent the students’ learning.

The data showed the students’ progression was messy and far from a linear transition from novice to expert:

Figure: A representation of one participant’s (Ben’s) learning. The circles represent his use of a conception of force, either the scientific model (net force linked to acceleration, on the upper line) or an alternative conception (net force linked to motion, on the x-axis). Note that the x-axis is discontinuous as only time in the sessions is represented.

Rather than considering learning as something that happens either linearly or suddenly, it might be more useful to think about the process as a gradual decrease in the activation of alternative ideas alongside an increase in the activation of scientific concepts. Significantly, teachers should expect periods of regression as well as progression. Alternative ideas are not removed after teaching, but often remain part of a student’s repertoire and can be activated under the right conditions (for example, an exam question set in a particular context can trigger an alternative idea that had not been activated for an extended period of time). It is worth bearing in mind how contextually sensitive learning can be – it is normal for students to be able to answer questions correctly in some contexts but revert to alternative models in others. Such variability suggests that linear models of progression are unhelpful and create unrealistic expectations for both learners and teachers.

The messy nature of learning suggests teachers (and researchers) should be cautious about the conclusions they draw from assessment data. For example, an initial use of an alternative idea, followed later by the activation of the scientific concept, might be taken as evidence of successful learning. Teachers, researchers, and students might then assume that no further teaching related to that concept is required. However, the data above suggest that this apparent pattern of learning may be part of several quite different progressions. For example, the learner might be in a phase where they use both alternative and scientific ideas depending on the context; the two data points would then reflect a stable mixture of ideas rather than a genuine change. All of us who work in education and make judgements about students’ learning should be careful about the inferences we draw about change from a limited number of data points.

The messy progression of student learning can be frustrating. If all learners progressed in a linear manner, teaching and designing interventions would be easier. However, that picture does not represent the reality of learning. Rather than ignoring this complexity, it is important that assessments and teaching interventions acknowledge the messiness of the process.

Submitted by Dr Richard Brock
