AfL in Primary Science
Student-generated Questioning by De La Salle School
Getting students to ask questions is not the same as getting them to apply their conceptual knowledge. Student-generated questioning is a means to an end rather than the end itself. The greatest discovery of this study is the goldmine of students’ questions. Although questioning has a long history in the teaching of science, it has most often been used to elicit answers from students. This study found that it is not the answers but students’ questions that are vital for teachers to understand students’ knowledge and thinking processes.
Assessment in primary science includes a section that requires students to answer open-ended questions based on a given problem context. For the past three years, analysis of school-based and national examination results has shown that students could not attain good scores in this section. One reason was that they were unable to apply their conceptual knowledge to the context of the questions: while they could recall textbook concepts, they could not translate them to different contexts. As science teachers, the team found that this was a gap that needed to be addressed.
In attempting to address the gap, the team considered student questioning as a strategy. This was also in alignment with the school’s key learning direction, "Socratic Questioning". Science education literature has focused considerably on questioning, and some of the key studies are described in the table below.
The action research was conducted over six months and involved two Primary 5 science classes. The 72 students in the study were of mixed ability, ranging from middle to lower-middle progress. The team designed two lesson packages to administer the intervention and facilitate data collection.
The intervention strategy was mainly to get students to generate questions as well as good answers. A blended approach was used in the lesson planning. Inquiry-based learning was the main pedagogy, and questioning fitted neatly into the "Engage" stage of the BSCS 5E Instructional Model used by teachers in the school. Each lesson included an authentic problem or case, a question-generation stage, an explanation stage and an ICT tool for collaborative learning. The case hooked the students into the topic and provided a purpose for generating questions.
Lesson package (LP) 1 was carried out at the end of a topic (Plant Transport) while LP 2 was carried out as a pre-activity to a topic (Photosynthesis).
At the end of the first lesson, three types of data were collected: students’ questions, students’ explanations and teachers’ reflections. The data was analyzed and fed forward to inform the team’s planning of the next research lesson.
In LP 1, students used simple 5W 1H question starters to generate questions. The questions generated showed that students were able to ask questions based on the problem given. They also used concept words, showing that they were linking what they had learnt to the problem presented. As many as 10 questions were collected per group, and the two teachers then grouped the questions into categories according to the concepts that students had linked to. Students were then asked to rank the questions in order of relevance, which challenged them to evaluate their own questions and prioritise them.
The team found that students were able to generate questions using the 5W 1H starters. However, the questions were mostly factual or procedural, for example, "What type of plant was it?". Some groups produced largely irrelevant questions. The students were very ICT-savvy and could use the tool (Linoit) to put up their questions efficiently.
LP 2 was planned after analyzing the data collected in LP 1. LP 2 was similar to LP 1 in that students were provided with an authentic context to anchor them. However, the question-generation stage was shifted later. In LP 1, students were required to generate questions as they attempted to provide a solution to the problem; in LP 2, question generation took place after students had written their explanations for the problem. The questions had a purpose, as they were posed to peer groups to critique their explanations. Additional scaffolds were added in LP 2, including Socratic question starters and instructions on PowerPoint slides. Another ICT tool (Padlet) was used to enable downloading of students’ responses into Excel spreadsheets for analysis.
With LP 2, the Socratic question starters seemed to improve the depth of students’ questions. Peer-group questions also helped groups refine their explanations, a clear sign of collaborative knowledge construction. Using Padlet as the ICT tool was much quicker for the students as well as for the teachers, who downloaded students’ responses as PDFs or Excel spreadsheets. The tool also enabled students to view each other’s group responses. While LP 1 and LP 2 showed improvements in student learning, teachers benefited as well. Students’ questions and explanations provided rich data for teachers to assess students’ understanding and thinking processes. The data also provided timely information for AfL, as teachers were able to identify misconceptions as well as preconceived ideas about the concepts. With this data, teachers could plan forward for subsequent lessons.
Summary of Findings
At the end of the two LPs, the following are the key findings of this study:
- Providing structures to scaffold student questioning moves students towards asking higher-level questions.
- Students’ questions provide a rich database that helps science teachers understand the thinking process of the students.
- The teachers’ capacity to access and assess for learning and to provide forward feedback is greatly enhanced through the use of ICT. For example, teachers could easily export student responses into Excel spreadsheets for further analysis, and the data could be archived for future use.
- Students are able to access each other’s understandings in real time through the use of online collaborative platforms.
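As an illustration of the kind of follow-up analysis the findings describe, the sketch below tallies student-generated questions per group from an exported spreadsheet. It is a minimal example only: the CSV layout, with hypothetical `group` and `question` columns, is an assumption and not taken from the study, and the sample rows are invented for illustration.

```python
import csv
import io
from collections import Counter

def questions_per_group(csv_text):
    """Count student-generated questions per group from an exported
    spreadsheet (hypothetical columns: 'group', 'question')."""
    reader = csv.DictReader(io.StringIO(csv_text))
    counts = Counter()
    for row in reader:
        if row["question"].strip():  # skip blank entries
            counts[row["group"]] += 1
    return dict(counts)

# Illustrative data only -- not drawn from the study's actual exports.
sample = """group,question
A,Why did the leaves turn yellow?
A,What type of plant was it?
B,How does water travel up the stem?
"""
print(questions_per_group(sample))  # {'A': 2, 'B': 1}
```

A tally like this could sit alongside the teachers’ qualitative grouping of questions by concept, giving a quick quantitative view of each group’s output before the deeper categorisation.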
Linking back to AfL, though the lesson was planned to elicit students’ understanding, it evidently activated students as resources for one another through questioning. The lesson design enabled students to learn from each other’s questions and refine their understanding. Students engaged in their own learning as they sourced information to form their explanations. They also evaluated their peers’ explanations and constructed new knowledge. For example, students posed questions to clarify a peer group’s explanation, and this provided the peer group with scaffolds to refine its explanation and form new knowledge. This exemplifies the highest level of understanding in the revised Bloom’s Taxonomy (Anderson & Krathwohl, 2001).
The data collected in this study is of value for future lesson design, as it provides teachers with feed-forward information for planning lessons. Teachers could consider using question starters to expose students to "questioning language".
In moving forward, the team sees possibilities for further research on student-generated questioning. These could be in relation to the use of peer questioning and evaluation as an AfL strategy in the teaching of science. Lesson studies could also be a good way of enhancing teachers’ pedagogical approaches to support student-generated questioning.
Anderson, L. W., Krathwohl, D. R., & Bloom, B. S. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's Taxonomy of educational objectives (Complete ed.). New York: Longman.
Chin, C., & Osborne, J. (2008). Students’ questions: A potential resource for teaching and learning science. Studies in Science Education, 44(1), 1–39.
Chin, C., & Osborne, J. (2010). Students’ questions and discursive interaction: Their impact on argumentation during collaborative group discussions in science. Journal of Research in Science Teaching, 47(7), 883–908.
Dillon, J. T. (1988). The remedial status of student questioning. Journal of Curriculum Studies, 20(3), 197–210.
Harper, K. A., Etkina, E., & Lin, Y. (2003). Encouraging and analyzing student questions in a large physics course: Meaningful patterns for instructors. Journal of Research in Science Teaching, 40(8), 776–791.
King, A. (1994). Guiding knowledge construction in the classroom: Effects of teaching children how to question and how to explain. American Educational Research Journal, 31(2), 338–368.
King, A. (2002). Structuring peer interaction to promote high-level cognitive processing. Theory into Practice, 41(1), 33–39.
Paul, R., & Elder, L. (2006). The thinker’s guide to the art of Socratic questioning. Dillon Beach, CA: Foundation for Critical Thinking.