In 2018, CERG funded five Education Research Fellows to support their classroom-based action research. Over the next few months, we will hear updates from these projects. In the first of the series, Robert Campbell reports on his flipped activities project. More details on the scheme are available here. Calls for 2019 Fellowships will open in the autumn.
Preparing year 11 students for answering questions on required practicals – a reflection.
With the recent changes to GCSE and A-level specifications, it is perhaps no surprise that three of this year's CERG Fellows focussed their research on methodologies to enable students to better access laboratory work. In this reflection I focus on my project, “evaluating flipped learning approaches to GCSE practical work”.
The justification for the project
Over recent years the flipped classroom has received significant attention as a tool for improving student performance. Work by Schultz et al. (2014) and Fautch (2015) has highlighted the positive impact the flipped classroom can have on both student performance and attitudes towards studying chemistry. In my own experience I have seen the positive impact flipped teaching can have on A-level students’ understanding of organic mechanisms (Campbell, 2016). This study sought to ascertain whether the same gains would be seen when using flipped material to prepare students for questions on required practicals in GCSE separate science examinations. Before commencing the study, students were asked to report their perceptions of practical chemistry by completing Likert-style questions and a short questionnaire. Figure 1 shows how students ranked their confidence in answering exam questions on required practicals.
Figure 1: student perceptions of answering exam questions on required practicals (n = 15).
Most students felt less confident about answering practical questions than other exam questions, so I felt justified in introducing this pedagogical technique and reviewing its impact.
Setting up the flipped material and initial thoughts
The research was conducted with my year 11 class, a top-set separate science class of 21 students with target grades of 6–9. Students had a timetabled plan of when required practicals would be completed, and those practicals studied in year 11 had a video “pre-lab” to help prepare students to understand the requirements of the practical. These pre-labs were AQA required practical videos hosted on the Edpuzzle website (www.edpuzzle.com) with embedded questions in the video. Students had to answer these questions in advance of the lesson. As a teacher, I was able to review student responses and address any misconceptions or areas of misunderstanding before the practical was completed. An example of a video, an embedded question and a student response are shown below in figures 2a–c.
Figure 2a: an example AQA required practical video on the Edpuzzle platform with embedded questions (shown in green).
Figure 2b: an example “exam style” question embedded into the video.
Figure 2c: an example student response to the question shown in figure 2b.
In figure 2c we can see that the student has correctly identified the test for chlorine, namely that it would turn blue litmus red and then bleach it white. However, the compound to be identified is a chloride: the student has struggled to distinguish between chlorine (Cl2) and a chloride ion (Cl–). Potential pitfalls such as these can be exposed using the flipped model but may otherwise be missed.
Student reflections part 1
After two required practicals, students were asked to give feedback on the flipped material and justify their thinking. Students gave the Edpuzzle website a mixed review. Figure 3 shows how students perceived using flipped videos on Edpuzzle.
Figure 3: a graph showing how students perceived flipped videos on Edpuzzle (categories: green = really useful, I would like to see these used in theory lessons; yellow = useful, but I would like to keep teacher demonstrations; aqua = no more useful than teacher demonstrations in regular lessons).
Students identified clear examples as a positive feature of the videos, but also highlighted a frustration that the practicals reviewed in the flipped material were not consistent with questions in the sample assessment materials.
Students were also asked if they would like more exemplar videos to use for revision, rather than specifically flipped material. Surprisingly, 66% of students requested more videos, despite 50% of students suggesting they did not find the videos useful in a flipped pre-lab format.
These results initially appeared slightly contradictory, so it was decided to review student performance in the upcoming mock examinations and compare performance on non-practical and practical style questions.
Student performance in mock exams
Mock exams taken in April 2018 were analysed on a question-by-question basis. As this is a new specification with the removal of coursework, past papers from previous specifications could not be used. AQA sample materials were used instead, so a level of caution must be taken in analysing this data, as access to these papers may have been possible in advance of the assessments. Furthermore, evaluation of changes to A-level exam papers has shown that sample assessment materials do not necessarily give an accurate reflection of exam weighting for particular topics. With these limitations in mind, a question-by-question analysis was conducted, which did appear to show a statistically significant difference between average student performance in particular theory topics and practical style questions. Results are shown in figure 4.
Figure 4: question-by-question analysis of average score as a percentage.
One can see that performance in practical chemistry remains lower than in all topics except mole calculations. Caution must be taken in analysing this data. Firstly, the example practical question in this case asked students to evaluate a conclusion rather than simply regurgitate a studied required practical. Secondly, the first question, on testing for ions, was built up of lower-scoring parts, such as completing tables with observations, where it can be less challenging to gain marks. A six-mark question on crude oil asked students to explain the stages of fractional distillation, a topic that had been covered in some detail. It is therefore perhaps not surprising that students scored significantly higher on some of these questions than on practical questions. However, the final question, on analysing data, was a challenging AO3-style question, so it is not true to say that the difficulty of the question entirely accounts for the difference in scores achieved. Moreover, this data does not account for discrepancies between students of different target grades, with students of lower target grades achieving disproportionately lower scores on practical questions. Irrespective of the limitations of analysing data from exam questions, by this stage of the project students’ ability to answer questions on analysing experiments, and to apply their knowledge of required practicals to new scenarios, remained an issue.
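For readers wishing to run a similar significance check on their own mark books, a minimal sketch of a two-sample (Welch's) t-test is shown below. The per-student percentage scores are invented for illustration; the real mark data from these mock exams is not reproduced in this article.

```python
from statistics import mean, variance

# Hypothetical per-student percentage scores on two question categories.
# These numbers are made up for illustration only.
practical = [35, 42, 50, 28, 55, 40, 33, 47, 38, 44]
non_practical = [62, 70, 55, 66, 58, 73, 60, 68, 64, 71]

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom.

    Welch's version does not assume equal variances, which suits exam
    categories with different numbers and styles of question.
    """
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)  # sample variances (n - 1 denominator)
    se2 = va / na + vb / nb            # squared standard error of the difference
    t = (mean(a) - mean(b)) / se2 ** 0.5
    # Welch–Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

t, df = welch_t(practical, non_practical)
print(f"t = {t:.2f}, df = {df:.1f}")
```

The resulting t statistic is compared against the critical value for the computed degrees of freedom (roughly ±2.1 at the 5% level for samples of this size); a larger magnitude indicates a significant difference between the category means.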
Student survey part 2.
Before students left on study leave, they were asked to complete an additional questionnaire on their perceptions of flipped material. In the interim, students had continued to watch flipped videos as appropriate when studying required practicals. Underachieving students were also invited to intervention sessions, and students were given required practical packs including methods and example extension questions suggested by AQA. The survey therefore asked students to review how confident they felt about answering practical questions compared with survey one, and to rank the range of materials used as teaching tools in order of decreasing value. These results are shown in figure 5.
Figure 5: how students felt about answering practical style questions and non-practical examined questions, May 2018.
It is worth noting that significantly fewer students (n = 7) completed the second survey, so any conclusions drawn will be limited. What is pleasing to see is that students felt more confident than at the start of the project; however, there does not appear to be any marked improvement in confidence in answering practical questions compared with non-practical questions. This may be due to the number and type of students who answered the question, though I must accept that the lower participation in the second survey may itself be reflective of student perception.
Those students who completed the second survey tended to rank the Edpuzzle flipped material as less useful. This may be because of the time of year and the videos’ intended use as pre-labs; nonetheless, students did not revisit the videos in advance of their GCSE examinations.
Because of the large discrepancy between the numbers of students answering the first and second surveys, any conclusions must be tentative. The data does, however, seem to suggest that flipped videos have not significantly improved student confidence in approaching practical chemistry style questions.
It may be that the timing of this project did not fully account for the impact of the forced changes to GCSE assessment and the lack of past paper revision materials available, and that this adversely affected student perception of flipped classroom interventions.
I would warmly recommend applying for the CERG fellow programme in the coming academic year.
Campbell, R. (2016), “Does using Zaption for flipped classroom improve student attainment in AS chemistry?”, poster exhibited at the Association for Science Education annual conference, University of Birmingham, 3 January 2016.
Fautch, J. M. (2015), “The flipped classroom for teaching organic chemistry in small classes: is it effective?”, Chemistry Education Research and Practice, 16 (1), 179–186.
Schultz, D., Duffield, S., Rasmussen, S. and Wageman, J. (2014), “Effects of the flipped classroom model on student performance for advanced placement high school chemistry students”, Journal of Chemical Education, 91 (9), 1334–1339.