In an attempt to not run from the need to improve my pedagogy, to prepare students for multiple choice questions, and to use multiple choice as more of a formative tool than a summative one, I am experimenting with something called a daily “5-in-5”. To do this, I create a daily set of multiple choice questions and allow 5 minutes for the students to solve the problems. Students are not graded on the quizzes; instead, upon completion, we immediately grade them together, and students enter their score (1-5) into the same form each time using a shortened URL and a QR code located at the top of the quiz. See screenshot below:
When students complete the form, they choose the topic and indicate the “block” (period) of class they are in. This data is then filtered using a pivot table so it is organized by topic and block with respect to number of correct answers. See screenshots below:
To create a “scoreboard” across blocks, I copied the above pivot table to another sheet in the same spreadsheet, and altered the output so that only block and average score across all “5-in-5’s” is displayed. Using the chart gadget, I created a bar graph from this pivot table that will dynamically update once scores are added by students. See screenshot below:
This was a lot of work upfront, but now I simply copy the same URL and QR code to the top of each “5-in-5” and it feeds this system. I made the sub-sheet that displays the chart public via our class website for students to monitor, established some cheesy goals like “Block with the highest score at the end of the week gets…”, and send out a daily update about the scores. Without entering a single grade, it’s been amazing to see the way this simple, yet visual, rewards/accountability system has motivated students to engage in multiple choice items. Students want to do the “5-in-5”s not because they want to improve their grade, but because they want to see if they can improve their score from the last one, and if they can contribute to the overall class score, two things I feel create a positive culture in my class, from block to block.
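For anyone curious what the pivot table is doing under the hood, the aggregation is nothing fancy. Here is a minimal sketch in Python; the blocks, topics, and scores below are made up for illustration and are not the actual form data:

```python
from collections import defaultdict

# Hypothetical form responses: (block, topic, score out of 5).
responses = [
    ("Block 1", "Equilibrium", 4),
    ("Block 1", "Equilibrium", 5),
    ("Block 2", "Equilibrium", 3),
    ("Block 2", "Kinetics", 5),
]

def average_by_block(rows):
    """Mimic the scoreboard pivot table: mean 5-in-5 score per block."""
    scores = defaultdict(list)
    for block, _topic, score in rows:
        scores[block].append(score)
    return {block: sum(s) / len(s) for block, s in scores.items()}

scoreboard = average_by_block(responses)
print(scoreboard)  # {'Block 1': 4.5, 'Block 2': 4.0}
```

The chart gadget simply redraws this same block-level average every time a new row lands in the sheet.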
Moreover, by setting aside time for multiple choice testing in a fashion that is free of the stress of getting “tricked” or misreading a problem, we as a class have intentional discussions about test construction and item analysis. Conversations like “Which one is the best distractor?” emerge, and help students think about the process through the eyes of a test writer rather than a stressed-out student. Next week I am going to experiment with having the students choose two answers, one for the correct response and one for the most obvious distractor, and report the “5-in-5” score accordingly. Either way, I think it is 5 minutes well spent, and I am excited to see how my students embrace the multiple choice portion of the AP test after preparing this way, rather than via the 30 questions that would normally precede the short answer section of a boring unit exam.
However, “drill and kill” activities that promote assimilation of content do have their place in the classroom and can often provide a great opportunity for entertaining review activities that help build community. When these assimilation activities are placed after the inquiry and accommodation process, their purpose is more evident and their application more relevant. This week I introduced a game called “Lower Blooms Basketball” as a way to promote assimilation. We are in the process of studying chemical equilibrium, and the ability to predict which direction an equilibrium will shift when a stress is applied (Le Chatelier’s Principle) is a skill that can be executed quickly, but one that requires a lot of practice.
To do this, I created a sheet of ~25 different, quick problems, and then made ~100 copies. I then cut the problems into little slips, placed them in bins, and distributed them throughout the classroom. These were the “basketballs”. Below is a screenshot of one sheet:
Using duct tape, I placed a “three-point line” from one end of my classroom to the other, and placed a garbage bin at the front of the classroom as the “hoop”. The rules of the game are as follows:
- Grab a problem, write your name on it, and solve the problem.
- Crumple it up and shoot it in the hoop.
- For every correct problem you make in the hoop, you get 1/10 of an extra credit point.
- You can shoot as many times as you want, but each shot must be a new problem.
- I will be roaming and shot blocking :)
By the end of the 30 minutes, each student, on average, solved ~50 problems. One of my weakest, most disinterested students said he solved 150 problems. The next day, I sprung a pop quiz on my students; however, the quiz comprised the toughest Le Chatelier problems I could find (most having to do with changes in pressure and temperature). Student performance was incredible. Although this task promoted rote, “drill and kill” learning, assimilation of this concept, AFTER students battled their misconceptions in the laboratory, proved to be a meaningful use of class time. Below are a few pictures:
Albeit only perception data was obtained, this strategy appealed to a wide spectrum of learners, and although implemented over a decade ago, the emergence of new learning technologies, in particular growing student access to lecture material online, empowered the process for a majority of the participants. Results suggested that students preferred the inverted approach to the traditional lecture, and would enroll in future courses that employed the same strategy (Lage et al., 2000). Given the fast growth of learning technologies over the past 12 years, specifically the combination of tablet and screencasting devices, momentum around the Inverted Classroom has increased exponentially. This momentum is evidenced by, among many things, the best-selling Bergmann and Sams (2012) ISTE publication Flip Your Classroom: Reach Every Student in Every Class Every Day.
Moving from an objective lens to a personal one, my own exploration into the Inverted Classroom has been tumultuous. After a few years of employing an inverted Advanced Placement (AP) Chemistry curriculum, lack of student interest, low motivation, and stagnant test scores made it evident to me that a pedagogical shift was needed. This realization was, admittedly, hard to swallow. I had spent the past three years blindly dedicated to the inversion of lecture and homework, something I was confident was the most innovative flip I, or any educator, could ever make. It was at this moment that I had a huge, but very simple aha moment: no pedagogical shift was taking place!
To more thoroughly explain this distinction, I must first outline my understanding of the term pedagogy. Defined by the Cambridge Dictionary (2010) as the “science and art of education” (para. 1), I like to refer to pedagogy simply as: the things a teacher does to help students learn. What was I doing to help students learn? Yes, I was providing more in-class opportunities for problem-solving and self-paced lectures, but was I truly living up to the promise I made to myself at the beginning of my career not to recreate the de-motivating lecture environment my high school and college chemistry instructors left me with? Or was I dressing up the same lecture-driven approach I found so ineffective with pretty technology?
Clearly I was leveraging more tools than I ever had. The use of wireless Wacom tablets, expensive screencasting software, and integration of various Google tools gave me the feeling I was peaking as a teacher. Back to the aha moment. Just because lecture happens in a different space doesn't make it, in today's information leviathan, a meaningful pedagogy. Yes, the self-paced medium video provided was better than in-class lectures, and with more class time available for one-on-one assistance my students were solving harder problems more frequently and with greater accuracy. But, when I was honest with myself, I realized I was just employing a “high tech” version of the same didactic approach.
I failed to ask myself the largest pedagogical question of all: How is the information constructed? Is it organized and applied by the student, and facilitated by the instructor? Or is it created by the teacher, and delivered to the student? In the days before the printing press, and even before the internet, the teacher’s role naturally fell into the realm of information transfer. Harvard physics professor Eric Mazur corroborates this observation, noting that teaching is a two-part phenomenon: first, transfer of information; second, assimilation of information. Mazur goes on to suggest in his now-famous talk Confessions of a Converted Lecturer that the ubiquity of information for today’s student naturally changes the role of the modern teacher from a medium of information transfer to a facilitator of information assimilation (Mazur, 2009). The simplicity of a Google search alone validates Mazur’s point.
In my opinion, what Mazur, Lage et al., and the plethora of popularized blogs and infographics about the flipped classroom rarely address is the real problem: When information transfer happens, not where. Although I believe passionately in Mazur’s assertion that assimilation is the role of today’s teacher, it is important that not only the location of assimilation be flipped, but also the timing. Rather than view transfer and assimilation as a “one-two punch”, I propose we ask students to engage in the process of assimilation initially, or as a colleague of mine says, “the mess of discovery”, allowing subsequent transfer events to be directed, tailored, and most importantly, driven by student misconceptions, not teacher choice.
During the summer following this aha moment, the chemistry teacher in me naturally began by reflecting on the scientific method. The scientific method does not begin with the dissemination of information, but with questions, problems, dilemmas and issues. The scientist must negotiate these issues, gather information, consult with colleagues and eventually construct conclusions. If the scientific method was the essence of my discipline, why was I relegating it to a section in a textbook? How dare I gloss over it like an isolated piece of content? The “mess of discovery” my colleague spoke of, the “assimilation” Mazur refers to, and the pedagogical shift I was waiting for was the scientific method. From that summer on, I have dedicated myself to creating an environment that embodies assimilation before transfer, one that is guided by the principles of discovery, not only in a laboratory rubric, but in the structure of the pedagogy.
A question still remained: How could I harness the benefits of the inverted approach without being a slave to it? In other words, how could the "flip" be used as a technique in the context of a student-centered pedagogy, rather than as a pedagogy in and of itself? I began by studying various learning cycles. The work of physics instructor Robert Karplus spoke to me the most. In Karplus’s cycle, an initial “Explore” phase, where pupils work through guided inquiry exercises, is followed by an “Explain” phase, a more teacher-centered moment where necessary and tailored information is transferred (Sunal, n.d.). The cycle concludes with an “Apply” phase where the concept is extended to new and unique situations. I rewrote my learning objectives into Karplus-like cycles, developed associated assessments, and began writing lesson plans.
Unlike previous years, a pedagogy emerged guided by student questions and facilitated by teacher content, rather than the reverse. During further reflection and planning, the “Explain” phase of the Karplus cycle surfaced as an appropriate phase to “flip”. However, unlike my 20-30 minute videos of the past, this new pedagogy called for the creation of short, tailored videos designed to address misconceptions and assimilation errors that arose during student exploration. Rather than devote hours to creating complicated and intricate screencasts, I elected for simpler systems, with less frill, but more pedagogical weight. The technology became a slave to the pedagogy, rather than vice versa, and the videos became, if you will, “inquiry spackle”. The figure below is a model of this Karplus “flip” variation.
Figure. Explore-Flip-Apply Model. Based on the Explore-Explain-Apply inquiry learning cycle developed by Robert Karplus (click here for an interactive version).
Serendipitously, the College Board, the governing body of all AP courses, has begun the process of redesigning a handful of course curricula in search of less content transfer, and more content inquiry. The AP Chemistry test will embrace a new inquiry driven curriculum in the 2013-2014 school year, and upon initial inspection, it aligns well with the “Explore-Flip-Apply” Karplus variation I have already begun to implement. This directly from the College Board: “In moving away from the lecture-and-demonstration model toward a more hands-on, interactive approach to studying chemistry, the course enables students to take risks, apply inquiry skills, and direct and monitor their own progress” (College Board, 2011, para. 6). I welcome this statement and am inspired by the College Board’s shift in pedagogical emphasis.
Whether it be asking students to figure out why we put salt on frozen roads and then telling them, creating an environment where students explore the features of acid-base titration before sharing the known characteristics, or facilitating the discovery of how batteries work rather than detailing their intricacies, the role of lecture, in particular video, is nothing more than a technique we can leverage. I encourage all educators contemplating “flipping” their classrooms to first detail a path towards meaningful student learning, via a struggle to negotiate perplexity, then inspect their pedagogy in search of useful places to off-load content transfer to video. It is my opinion that placing the “flip” before the pedagogy is nothing but a step in the reverse direction.
Bergmann, J., &amp; Sams, A. (2012). Flip your classroom: Reach every student in every class every day. Eugene, OR: International Society for Technology in Education; Alexandria, VA: ASCD.
Cambridge Dictionaries Online (2010). Pedagogy. [online] Retrieved from: http://dictionary.cambridge.org/dictionary/british/pedagogy?q=pedagogy [Accessed: 5 Jan 2013].
College Board (2011). The College Board redesigns the AP Chemistry and AP Spanish Language and Culture courses. [online] Retrieved from: http://press.collegeboard.org/releases/2011/college-board-redesigns-ap-chemistry-and-ap-spanish-language-and-culture-courses [Accessed: 5 Jan 2012].
Lage, M. J., Platt, G. J., &amp; Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. Journal of Economic Education, 31(1), 30-43.
Mazur, E. (2009). Confessions of a converted lecturer. [online] Retrieved from: http://www.youtube.com/watch?v=WwslBPj8GgI [Accessed: 5 Jan 2013].
Musallam, R. (2011). Cycles of Learning. [online] Retrieved from: http://www.cyclesoflearning.com/page1/page1.html [Accessed: 5 Jan 2013].
Sunal, D. (n.d.). The learning cycle: Comparing models. [online] Retrieved from: http://astlc.ua.edu/ScienceInElem&MiddleSchool/565LearningCycle-ComparingModels.htm [Accessed: 5 Jan 2013].
In conversation with a colleague who is a history teacher, we started brainstorming ways of integrating Wordle into his classroom in a more critical, exploratory fashion. We tossed around the idea of taking speeches from history that, although separated by time, represent a similar mind-set or message, copying the text into Wordle, and having students analyze the clouds for similarities and differences. Or even better, providing the clouds first, and asking students if they can predict the speech or event that the cloud was derived from. Below are two clouds. One is from Martin Luther King’s I Have a Dream speech and the other from Barack Obama’s A More Perfect Union speech on race. Can you match each cloud with its author? Notice any similarities? Differences?
Step 1: Distribute the practice final exam to students.
Step 2: Provide students with whiteboards and markers (we use sticky whiteboard material that is adhered to each table).
Step 3: Assign students a partner. One student is the videographer, who will hang over the shoulder of the other student, recording while they solve and explain the problem.
Step 4: Create and embed a Google form to submit the URL (YouTube, Vimeo, etc.) to the video of their solution.
Step 5: Embed the form and the associated spreadsheet on one page so students can provide and view videos simultaneously. Click here to visit our site, or see the screenshot below:
I have been very impressed with the quality of the video solutions, and so far students have commented that they love seeing their classmates solve the problems, and that it provides a nice change from my usual babbling. Surprisingly, the students said they wished I solved the problems this way, rather than the tablet/screencasting combination I traditionally use. They mentioned that the authentic look and feel of the whiteboard and the pen was nice, and it was fun to see the person’s handwriting and the classroom in the background. Looks like I’m going to be changing things :). Below is an example of one of the solutions:
This year, I wanted to slow down and do something different with the concept. Although simple, the diffusion of gases through a medium is a concept that aligns well horizontally with their physics course, and also vertically with various concepts in biology. To better explore this concept, and given its relative simplicity, I tried to package an entire “Explore-Flip-Apply” lesson plan, which would normally take 2-3 days, into one period. Below is what I did:
Step 1: Explore
I showed students the below video clip and asked them to write down all the questions they had. Students unanimously asked the question: “Where will the gases meet?”
Step 2: Flip
Next I handed out a document with a screenshot of the tube shown in the video and asked students to predict where the gases would meet by drawing a line. I circulated and watched students negotiate this process. Most students started by calculating the molar mass of ammonia and hydrogen chloride gas. Because hydrogen chloride has roughly 2 times the molar mass of ammonia, students instinctively hypothesized that ammonia would diffuse twice as fast.
Although incorrect, this was a victory in that students conceptually understood that at the same temperature, a gas with a smaller molar mass will diffuse faster than its heavier counterpart. At this point I strategically had the class stop working, and brought the class together. I told students that their conceptual understanding was correct, but their quantitative reasoning was incorrect. To negotiate this, instead of lecturing, I directed students to read about Graham’s law in their text individually for 5 minutes. This was the “flip”: rather than lecture to the entire class, I allocated lower-Blooms content to students to work through individually at their own pace.
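For the record, the quantitative reasoning students eventually read about works out like this: by Graham’s law, diffusion rates go as the inverse square root of molar mass, so ammonia wins by a factor of about 1.46, not 2, putting the white NH4Cl ring roughly 59% of the way down the tube from the ammonia end. A quick calculation:

```python
from math import sqrt

# Molar masses in g/mol
M_NH3, M_HCl = 17.03, 36.46

# Graham's law: rate_NH3 / rate_HCl = sqrt(M_HCl / M_NH3)
rate_ratio = sqrt(M_HCl / M_NH3)  # ~1.46, not the 2x students guessed

# Distances traveled are proportional to rates, so the ring forms at
# this fraction of the tube length, measured from the NH3 end:
meeting_fraction = rate_ratio / (1 + rate_ratio)

print(round(rate_ratio, 2))        # 1.46
print(round(meeting_fraction, 2))  # 0.59
```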
Students then discussed their readings with their group members and revised their drawings accordingly. Groups then took pictures of their revisions and posted them to their blogs. Below is a screenshot:
While students uploaded to their blogs during class, I worked at my desk copying each image from their blogs into Keynote slides. I placed an image of the correct answer (which would be revealed to students in the “apply” phase below) above each of their images. Below is a screenshot:
Step 3: Apply
Once I had all group images copied into the keynote presentation, I played the below video, which is the complete version of the “explore video”:
After I played the video, I revealed each slide and students voted on the group whose solution was closest to the actual solution. Students shared their strategies, completed a few practice problems, and wrote their own problem for their peers to solve. All in all, the process took 60 minutes total, and was, in my opinion, a much more useful, productive and critical way to learn about Graham’s law.
First I posed a situation by posting an image along with a set of questions via a Google form. See screenshot below or click here for the assignment page.
After students participated in the “exploration” (essentially an experimental design problem), I assigned the typical “Read blah blah and do problems blah blah on page blah blah”. Instead of having students bring in a hard copy of their problems, I wanted to create a catalog of their solutions so that I could analyze them for mistakes, misconceptions, etc. along with their experimental design above. To do this, I had students snap a pic of their solutions, post it to a site called postimage.org, and paste their link into a Google form to create a single database with links to all of their solutions. See screenshots below or click here for the assignment page.
See screenshot of spreadsheet below, or click here for live sheet.
In order to guarantee that students were “locked” into the inquiry cycle (participated in the exploration prior to the application), I told students they would receive three points for the assignment: 1 point for the exploration, 1 point for the application, and 1 point if their exploration submission was time-stamped prior to their application time-stamp. Although it seems like a lot of work to sift through (and it was), organizing everything via two Google forms that are individually time-stamped added efficacy (I think) to a process (homework) that I often feel can be very ineffective: I was able to keep students questioning and forming hypotheses prior to application, which is the goal of our inquiry-driven classroom structure. Moreover, the paperless turn-in allowed me to assess student work prior to class, facilitating a more critical and informed class the first day back.
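The three-point logic is simple enough to sketch. Here is a toy version in Python; the timestamp format and column layout are assumptions for illustration, not the actual Sheets export:

```python
from datetime import datetime

def parse(ts):
    # Assumed timestamp format, e.g. "1/5/2013 14:32:10";
    # the real spreadsheet export may differ.
    return datetime.strptime(ts, "%m/%d/%Y %H:%M:%S")

def homework_points(exploration_ts, application_ts):
    """1 pt per submission, plus 1 pt if exploration preceded application."""
    points = 0
    if exploration_ts:
        points += 1
    if application_ts:
        points += 1
    if exploration_ts and application_ts and parse(exploration_ts) < parse(application_ts):
        points += 1
    return points

print(homework_points("1/5/2013 14:32:10", "1/6/2013 09:15:00"))  # 3
print(homework_points("1/6/2013 10:00:00", "1/6/2013 09:15:00"))  # 2
```

In practice I eyeballed the two form spreadsheets side by side, but the comparison being made is exactly this one.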
First I found this released free response question from the 2012 AP Chemistry test that fit the standards to be assessed on the quiz:
Then using my iPhone, and with the help of a colleague, I recorded this video version of the question and published to YouTube:
Because I posted the video to YouTube, I didn’t want students distracted by all the other garbage. To negotiate this, I created a URL that would open the video in full screen immediately. Below is how I did that:
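One way to build such a link (the exact trick I used is in the screenshot, and may differ) is YouTube’s bare /embed/ player URL, which loads only the video, filling the browser window with none of the surrounding page. A sketch of the conversion, using the Mazur talk’s video ID purely as an example:

```python
from urllib.parse import urlparse, parse_qs

def embed_url(watch_url):
    """Convert a standard YouTube watch URL to the bare /embed/ player URL."""
    video_id = parse_qs(urlparse(watch_url).query)["v"][0]
    return f"https://www.youtube.com/embed/{video_id}"

print(embed_url("http://www.youtube.com/watch?v=WwslBPj8GgI"))
# https://www.youtube.com/embed/WwslBPj8GgI
```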
After using bit.ly to shorten the link (I chose bit.ly instead of goo.gl so I could customize the address for student ease of access during the quiz), I created and administered the below quiz:
Below is a picture of the process:
Branching Google Form
Form Emailer Script
While in theory the “apply” phase calls for much more than practice, in chemistry in particular there is, for me, a tendency to get stuck in a “study hall”-like environment as students struggle through problems in class. To bridge the gap between practice and the promotion of extension and critical investigation of problems, I did a version of Kelly O’Shea’s Mistake Game. Students were given a problem and were asked to solve it, but plant one common mistake, as if they were writing multiple choice item distractors. Students then viewed one another’s problems and qualitatively identified the mistakes.
I enjoyed how this activity seemed to quench the students’ need to practice problems; it also forced students to reverse their role and step into the shoes of the test writer. Not a true “extension” of their knowledge to the real world, but this activity successfully forced students to assess problems through a different lens. To accomplish this, I harnessed the “snapshot” tool in Google Docs (thanks to @followmolly for this tip at ISTE) to create a fluid jigsaw where students created their “mistakes” and then shared them with others. Below is an outline of the process:
1. Create a Google presentation with one slide for each group and make the presentation public and editable by all.
2. In the presenter notes section of each slide, write a unique question for each group.
3. Instruct each group to solve that question on a white sheet of paper, using a Sharpie, planting one mistake. See below:
4. Using the snapshot tool, instruct each group to insert a picture into the slide. See below:
5. Groups then observe one another’s slides and identify the mistake made.
6. Upon conclusion of the activity, facilitate a class discussion where groups discuss the most appropriate mistake.
See an example of the complete slide deck here:
I use the Google form to promote algorithmic reflection using an MCQ (multiple choice question) and conceptual reflection using a FRQ (free response question) embedded at the conclusion of the screencast. I enjoy putting the questions at the end of the video, as it promotes student progress through the video prior to reflection. I also ask students to indicate any questions they have and I use those questions to tailor the beginning of the third phase of the cycle.
I have always struggled, however, with developing a simple and efficient way to provide students with feedback on the reflection questions. Initially, I would go over them in class the following day and award students points, or lack thereof, for correct and/or incorrect answers. I ultimately swapped this process for a pass/no-pass system, as awarding points defeated the purpose of formative feedback and reflection.
Instead, I began to play around with various Google form scripts that would email students with a description of each reflection question. This worked well, but it still felt like too many steps, and students reported back that they rarely opened the emails. Then, while working with a group of teachers this year in Yosemite at CUE Rockstar, we all came up with a solution: EDIT CONFIRMATION!
We all know that after submitting a Google form, you get this nice little message that informs you that you have correctly submitted the form. However, we often take for granted that this message can be edited. See screenshot below:
So, instead of sending students an email, I have started recording short videos using the Showme app on my iPad that work out the solutions to these questions. Below is a screenshot:
I then create a goo.gl shortened URL of the public Showme video so that I can track how many times students viewed the video and when. In the edit confirmation, I then provide a link to this video, and I was happily surprised to learn that when a link is placed in the edit confirmation area, it appears as a “clickable” link! Because I use the same form for every lesson, I am keeping a running tab of all lesson solutions in the edit confirmation. Students said this is nice because they can refer back to other reflection solutions in one place. Below is a screenshot:
Two weeks into this process, I am absolutely loving it. For the first lesson, 115 students viewed the video, and almost all students got the right answer. I know this because they are not able to see the edit confirmation until they have submitted their response and no students submitted twice. This tells me that the question was a challenging one, but they understood it. Or perhaps it tells me they were interested in seeing the solution and motivated to learn more. Either way, on the second lesson, they all got it right, but I only observed 8 clicks. Perhaps this question was easier, or they were less interested in viewing the solution for some reason? Below are two screenshots:
Reflecting on this process, and providing formative feedback in such a simple way is really exciting and reminds me of the power of technology in the classroom when viewed through the lens of pedagogy.
The model shows a simple allocation where video that is used to spark inquiry is allocated to the classroom space or “community” and video that is used for content delivery is allocated to the homework space or “individual” setting (of course, not to beat a dead horse, but this occurs after classroom exploration of content). This allocation was motivated by a previous model where instructional spaces were allocated along a spectrum of Bloom’s Taxonomy, placing more rigorous tasks such as inquiry in the community space, and procedural tasks, common in content driven videos, to the individual setting.
In addition to reflecting on the use of video, I got motivated and finally transferred all of my video content (movie clips, etc.) to Google Drive. My current collection contains video clips taken from movies, commercials, talks, and other random events. I draw from this collection primarily for inquiry purposes, as content videos in Explore-Flip-Apply are made on the fly in response to student misconceptions during the inquiry phase. Click here to access the g-drive folder of vids.
Because reassessment questions are “standards based” they are very targeted and usually involve only one or two questions. To manage this process, I simply write a question on a notecard and then ask the students to solve the question on a whiteboard in front of me in class. Below is a picture:
This process has worked very well. However, given the number of students reassessing, and the lack of a paper copy of the reassessment, tracking the process has been difficult. Moreover, although reassessments are summative with respect to performance on a specific standard, because students can reassess continuously throughout the year, their answers also provide a lot of formative information that is not cataloged in this system. Enter Evernote! Last week we received our class schedules, and during our youth STEM camp, I had some of my high school helpers make an Evernote “stack” for each of my “blocks” (class periods) and then make a separate Evernote folder in the appropriate stack for each student. Below is a screenshot:
Because my iPhone/iPad has an Evernote app, after each reassessment, I plan on making a note in each student’s folder that includes a picture of their whiteboard and a quick audio interview about how they felt, areas of improvement, growth, etc. Below is a screenshot of the note interface:
I will then share this folder with the students (maybe even parents) and thus build a formative library to not only track progress, but also help students build metacognition (“thinking about thinking”). Not sure how this will go, but I am very excited to begin the process!
Check out an awesome roller coaster one group made out of pipe insulator, masking tape, and a marble:
During the “Explore” phase, I noticed something interesting and also very encouraging. Because every single outcome (unpacked standard) was essentially “locked” into an inquiry learning cycle this year, unlike at the beginning of the year, my students appeared to be attacking the problem with an incredible sense of confidence and strength. A year of inquiry had, I hope, rubbed off on them, and observing them test and re-test the different metals, monitor their produced voltage, and explore a myriad of other intricacies I had not even hypothesized they would was extremely rewarding. Within half an hour, students had not only figured out the direction of electron flow, the respective charges of the anode and cathode, and how to determine cell voltage (at standard conditions), but, more importantly, had fully constructed the knowledge I had planned on delivering during the “Flip” phase. Instead of holding out and requiring that they view the instructional screencast, I decided to test their constructed knowledge via a game of “Battery Relay.” I had each lab group report to the chalkboard and, with a chart of standard reduction potentials in hand, yelled out two different metals and challenged each group to draw a battery. Randomly I would yell “switch” or “rotate”, which would signal another group member to continue from where the other left off in the diagram.
Moral of the story: inquiry pays off, and although I had an elaborate plan of “filling in the gaps” via an instructional video, with a little time, space, and guidance, students just might construct that knowledge on their own. Check out a video clip of our relay race below:
Having a large surface on their desk to perform and create problems seemed like a perfect way to check for understanding and empower students to demonstrate critical thinking. Moreover, using their camera phone/video camera, groups could easily “hover” above the desk and record quick tutorials, bypassing the need for screencasting and tablet technology, iPads, etc. Despite the obvious benefits, my administration did not approve the painting of our classroom desk tops.
In search of a cheaper and less-permanent substitute, I stumbled across Self Adhesive Dry Erase Material. It is working like a charm! I purchased a few rolls and measured out sheets that stick to the top of our classroom desks. The sheets can be removed at the end of the school year, and for now, appear to work as well as, or better than, a traditional whiteboard. See below for a video of a student in my AP Chemistry class working through a problem “on her desk” at the conclusion of a learning cycle on atomic structure:
So why is this post titled “Sub Videos”? When I first began merging tablet and screencasting technologies to create instructional video, one of my favorite applications was for sub assignments. I would simply record myself modeling a few problems for students…have the sub play those videos…have the students solve some related problems…have the sub play video solutions to those problems, etc. It worked like a charm (at least I thought…). See example video below:
Despite initial “success”, after presenting at CUE in Palm Springs last month, something struck me. I was in the middle of my standard discussion about Bloom's Taxonomy, and how the true “flip” does not simply swap homework with lecture, but intentionally matches the “community” (classroom) with learning activities appropriate for that space (higher-end Bloom's), and conversely, matches the “individual” (outside of class time) with learning activities appropriate for that space (lower-end Bloom's). See image below:
While giving that presentation, I realized that my students were back in San Francisco, with a substitute, watching videos IN THEIR COMMUNITY SPACE (classroom) of me solving problems as I would IN THEIR INDIVIDUAL SPACE! Because my students were all together, I was missing an opportunity to use video as an inquiry tool, and instead was using it as I normally would on the back end of an inquiry cycle: as a delivery medium. So, the last few times I have missed class since CUE (happens often given the arrival of my second child!), I have been experimenting with using video in a way that values the community, promotes inquiry, and models how I would normally carry out class if I were there. Rather than solve problems, I have been presenting open-ended scenarios and, given a set of prompting questions, instructing the sub to have students discuss possible solutions to each scenario. Often, I have coupled the scenario with follow-up videos that provide further explanation, but NOT UNTIL the initial inquiry scenario is presented. Below are a few examples:
Pre Video Question: Does Bromothymol change color in an acidic or basic environment? Justify by writing a chemical reaction to describe the process.
Pre Video Question: Can you explain this observation using what you know about ideal gas behavior?
Because the “Explore” and “Apply” phases of the “Explore-Flip-Apply” learning cycle rely heavily on collaboration and model development, quick group presentations and sharing of information is essential. Below is a list of different ways I have been using Reflection App, in conjunction with our classroom LCD projector, to better facilitate inquiry and collaboration:
- Mobile Document Camera: Using the camera, move about the classroom displaying student work, laboratory procedures, demonstrations, etc.
- Spontaneous Group Presentations: Using the camera, quickly interview different groups during various phases of the inquiry cycle. For example, feature a group’s problem solving strategy, or ideas re: a specific model they are currently developing.
- Annotation of Student Work: Using the camera, take a picture of various student solutions during a problem solving session and using apps like PhotoPen, annotate and comment on problems.
- Quick Slide Shows to Summarize Activities: Snap pictures of student work, and upon conclusion, quickly scroll through your camera roll to serve as a daily summary. Alternatively, revisit the pictures at the beginning of the following day as a transition into the next phase of the learning cycle.
- Screencast iPhone/iPad Applications: Although more connected to the “Flip” phase of the “Explore-Flip-Apply” cycle, by mirroring your device onto your Mac OS X display, you can use programs such as QuickTime's screen recorder and Screencast-O-Matic to capture applicable information from your device.
1. Using the Google Sites “iFrame Gadget” or any other iFrame function, embed your public Showme or Educreations profile into your class site.
2. Identify frequently asked questions or questions students ask via email or text.
3. Record solutions using Educreations or Showme.
4. Voila! Solutions are posted automatically to your website, available for ALL students, not just the question asker.
Although this seems like a very simple, no-brainer process, the fact that the iFrame functionality completely removes the step of publishing and distributing the video to students makes this process sustainable. I find myself carrying my iPad with me at all times, pulling over on the side of the road to answer student texts (I provide students with my Google Voice number), or simply wandering the classroom and recording solutions that I would have otherwise done on a separate sheet of paper or a whiteboard. Click here and here to see videos made via Showme and Educreations (I switched to Educreations for the second semester when it launched). The videos are super shaky and bad quality, but that’s THE POINT. Capturing the authentic tutorial stores it for everybody, with learning taking the lead, rather than aesthetics.
Below is a text transcript between myself and a student that references this process:
Because the video is long, I recommend clicking the button on the bottom right of the embedded video below to watch it directly on Vimeo's site. You can also click here. You will notice chapter markers below the video that look like this:
Chapter markers can only be seen when viewing the video directly on the Vimeo website (Dear Vimeo, please add this feature to embedded videos as well. Sincerely, Ramsey). By clicking on each blue link, the presentation will automatically forward to that point in the video, without reloading the page. Traditionally, I encourage teachers to make very short videos for their students that introduce or reinforce basic concepts needed for in-class application. However, from time to time, it is appropriate to create a long video for students that, for example, provides an interactive key to a final exam, or a systematic review of all material in preparation for an end-of-year exam (AP, etc.). Using chapter markers can be a great way to chunk up a long video, and empower students to think metacognitively about what they need the most help with. Below the presentation is a tutorial video on how to make chapter markers in Vimeo.
CLS RTI & Tech Conference Presentation
Vimeo Chapter Marker Tutorial (via WebVideoSchool)
Recently I stumbled upon a research Meta-Analysis and Review of Online Learning Strategies provided by the U.S. Department of Education that validates this reflection mechanism. Below is a screenshot taken from the report:
I have always preached the following: “It is not a silver bullet; if your students don’t have access, then try something else. If your students do, then harness that option appropriately.” However, the more I speak to others about technology in the classroom, the more perplexed I become by the access reality. Some argue that access is no longer an issue, others feel that requiring students to engage in online activities outside of the classroom creates an unequal playing field, while many passionately claim that we are handicapping our students by not empowering them with the ability to seek out the tools necessary to engage in 21st century learning activities. Click here to access data regarding internet access provided by Pew Internet.
This past week I was honored to work with a friend who is a 9th grade algebra teacher at a school just south of San Francisco. He is interested in the Explore-Flip-Apply model; however, given the dynamics of his school and the diversity of his students, he was wary of assigning homework that required access to the internet. Rather than simply guess and go for it, we worked together to develop a simple plan to more intentionally address any access issues in his class. The goal was to create a survey that would encourage a very honest response regarding access, rather than couch our question in the context of a homework assignment plan, etc. Below are the steps we took:
Step 1: We created a google form with the following two questions. Students answered the form from the library computers.
1. Can you update your Facebook status from home or from a mobile device (e.g., laptop, phone, iPad, etc.)?
2. If you answered “No” to the question above, you did so because...a) you don’t have personal internet access b) you don’t use Facebook.
Step 2: We analyzed the data.
72% answered “Yes” to the Facebook status question. It was assumed that these students can access the internet individually, outside of the classroom.
15% answered “No” because they do not use Facebook, however said they could access the internet for other activities. It was assumed that these students can access the internet individually, outside of the classroom.
13% answered “No” because they did not have access at home.
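The two-question design maps cleanly onto the three buckets above. As a sketch, here is how the tally could be scripted once the form responses are exported (the response data below is hypothetical, not the actual class results):

```python
# Hypothetical exported form responses: (facebook_answer, reason_if_no).
# reason_if_no is "no_access" or "no_facebook" only when the first answer is "No".
responses = [
    ("Yes", None), ("Yes", None), ("Yes", None),
    ("No", "no_facebook"),
    ("No", "no_access"),
]

# "Yes" answers and "No, because I don't use Facebook" both imply access.
has_access = sum(1 for ans, reason in responses
                 if ans == "Yes" or reason == "no_facebook")
no_access = len(responses) - has_access

print(f"Assumed access: {has_access / len(responses):.0%}")
print(f"No home access: {no_access / len(responses):.0%}")
```

With the real class data, the first two categories combine to 87% assumed access, leaving the 13% to follow up with individually.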
Step 3: On the following day, we met with each student who fell within the 13% (4 students) and asked for their feedback on the following options:
Option 1: Come to class 15 minutes early on the day an internet-based assignment (usually an instructional screencast) is due, and view it on one of four classroom computers.
(This teacher purchased 4 laptops from eBay for fairly cheap and also offers up his own personal laptop during this time).
Option 2: Research the hours of the school's library/Internet access and that of their local library. Use these locations to complete internet-based assignments.
Option 3: Ask a family member, neighbor, or friend whom you see often if you can use their internet for ~ 30 minutes/week
(In a traditional Explore-Flip-Apply model, the students are only viewing ~ 2 vids a week, and each is ~ 8 minutes. We added on an extra 15 minutes to give the student time to fill out the associated form used for reflection and tracking of the video).
Option 4: Come up with your own plan for accessing the internet.
Students were given a few days to figure out their strategy, while we confirmed that the other 87% had access. The 4 students were told that if they could not figure out a feasible strategy, we would work with them individually to figure something out. Two of the 4 students decided that coming into class early on days when lectures were due would work for them. Anecdotally, when my students choose this option, I find that their investment in the following problem-solving sessions is greater, most likely due in part to the close proximity between viewing a video of basic content and directly applying that content. Another student decided to use the school library, and the final student, given her schedule, found none of the options suitable. For her, the instructor loaned out an old department laptop with wireless capabilities, and worked out a plan with her and her mother where she would visit a local coffee shop near her home that provided wireless access a few times a week.
All in all, the process was very simple. However, rather than simply assuming all students had access as I have done in the past, helping a colleague through this process shed light on the realities of the access issue for this particular class. Additionally, we learned that negotiating the access issue, with a little creativity, can be a very feasible process. Obviously, how the percentages break down is a function of the school community, student demographics, etc. Perhaps this post can shed light on a simple way of negotiating student access in your class. Subjectively, I could just sense that the students appreciated the intentional way we worked WITH them, rather than simply assuming that they are “digital natives” who all have access.
Day 1: Explore
Topic: Effusion and Diffusion.
Task: To compare the effusion rates of helium gas and nitrogen gas.
Materials: Three balloons, measuring tape, helium tank, your breath (for nitrogen gas…yes, I know, this is a HUGE REACH, but whatever, it was fun…), scotch tape, a thumbtack, and a stopwatch.
Procedure: We use a guided inquiry approach during exploration days. Whenever appropriate, I try to put the procedure development process in the hands of the students. In this lab, I gave students 15 minutes to brainstorm a procedure of their own. Students had many different procedures, but after pulling their hair out for 15 minutes, the majority of groups developed a process that looked something like this:
1. Fill one balloon with nitrogen gas.
2. Fill the other balloon with helium gas to the same volume, using the measuring tape to match diameters.
3. Blow the third balloon up to ~ 2/3 the size of the other two balloons. Use this balloon as the “reference”.
4. Secure a piece of tape on the nitrogen-filled balloon.
5. Insert thumbtack into tape.
6. Remove the thumbtack and record the time (sec) it takes for the balloon to deflate to the size of the reference balloon.
7. Repeat for the helium-filled balloon.
8. Estimate rate by taking the reciprocal of the time.
See images of process below:
Data Analysis: Students then entered data into a class chart where effusion rates were gathered across groups; a class mean was taken and percent error was calculated. See images below:
Model: Students worked in groups to analyze the data and create a model. The group mean effusion rate ratio was ~ 2.6! Students ultimately, with a bit of prodding :), realized that this was approximately the square root of 7 (how many times larger the molar mass of nitrogen is than that of helium). Students recorded this data and conclusion in their notebooks and were assigned a lecture on effusion and diffusion for homework (the flip).
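For reference, Graham's Law predicts this ratio directly. A quick sketch of the check (molar masses are standard textbook values; variable names are mine):

```python
import math

# Graham's Law of effusion: r_He / r_N2 = sqrt(M_N2 / M_He)
M_N2 = 28.02   # molar mass of nitrogen gas, g/mol
M_He = 4.00    # molar mass of helium, g/mol

predicted = math.sqrt(M_N2 / M_He)   # ~2.65

class_mean = 2.6                     # class mean rate ratio from the lab
percent_error = abs(class_mean - predicted) / predicted * 100

print(f"Predicted ratio: {predicted:.2f}")
print(f"Percent error:   {percent_error:.1f}%")
```

A class mean of 2.6 lands within about 2% of the predicted value, which is pretty remarkable for balloons and a thumbtack.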
Night 1: Flip (Instructional Video)
For homework, students were assigned an instructional screencast video that addressed their observations from the laboratory, formally defined effusion and diffusion, presented the equation for Graham’s Law, and solved a few example problems. Students reflected on the video via a google form embedded below, where they provided a 5-sentence summary of the video and entered their answer to the second worked example (the video ended abruptly in the middle of the second example). Below is a quick snippet of the video:
Below is a screenshot of the google form used for reflection:
Day 2: Apply
Goal: To apply knowledge of Graham’s Law.
Task 1: Work individually and collaboratively to practice released AP Chemistry problems related to Graham’s Law of Effusion-Diffusion.
See examples of problems below:
Task 2: Use new assimilated knowledge of Graham’s Law to verify the equation with respect to diffusion, rather than simply effusion (See Day 1).
Materials: Clear plastic straw, two Q-tips, ammonia, toilet bowl cleaner (source of hydrogen chloride), ruler.
Procedure: We again used a guided inquiry approach; however, because students have already explored this concept and have viewed an instructional screencast, the goal is to verify, rather than develop, Graham's Law.
1. Simultaneously place the ammonia- and toilet bowl cleaner-soaked Q-tips in opposite ends of a clear plastic straw.
2. Work together on practice problems while waiting.
3. Measure the distance from the top of each Q-tip to the cloudy ring in the straw (the location where the ammonia and hydrogen chloride gases meet).
4. Calculate rate (distance/time) for each gas.
5. Determine value of rNH3/rHCl and share data.
6. Evaluate class mean and calculate percent error.
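Since both gases travel for the same amount of time, the time cancels and the ratio of distances equals the ratio of rates. A minimal sketch of the calculation (the distances below are hypothetical; molar masses are standard values):

```python
import math

# Graham's Law applied to diffusion: r_NH3 / r_HCl = sqrt(M_HCl / M_NH3)
M_NH3 = 17.03   # molar mass of ammonia, g/mol
M_HCl = 36.46   # molar mass of hydrogen chloride, g/mol
expected = math.sqrt(M_HCl / M_NH3)   # ~1.46

# Hypothetical distances (cm) from each Q-tip to the cloudy ring;
# elapsed time is identical for both gases, so it cancels in the ratio.
d_NH3, d_HCl = 59.5, 40.5
measured = d_NH3 / d_HCl

percent_error = abs(measured - expected) / expected * 100
print(f"Expected: {expected:.2f}  Measured: {measured:.2f}  Error: {percent_error:.1f}%")
```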
See images of process below:
Night 2: Prepare for Quiz
Day 3: Quiz and new cycle begins
1. Click “Record”.
2. Enter your name and email and click “Generate”.
3. Click link.
4. Enter your name.
5. Click allow and wait for countdown.
6. Navigate back to this page.
7. Click “Ink”.
8. Create lesson on whiteboard.
9. Click stop sign on bottom right.
10. Check email for video.
11. Distribute to students.
Step 1: Each group designed and solved one problem on a blank sheet of paper. I did not allow the use of any materials, textbooks, etc.
Step 2: Groups recorded their solution using screenchomp according to the below rules:
- Question is only spoken
- Solution is spoken and written
- Multiple colors are used
- Each member must be heard or seen (handwriting)
Step 3: Groups shared their link via a google form on the class website. The spreadsheet that collects the links was embedded on the website using the iFrame gadget. Click here to watch the chomps.
Step 4: Students watched the chomps for homework (the flip). Quiz questions the following day were randomly chosen from the chomps.
Use this program to build my class website. It allows me to integrate things like google forms into the website in a seamless fashion for students. Students seem to enjoy the clean look and feel.
Student/group produced instructional videos upon conclusion of “Explore” day of learning cycle.
Produce quick tutorials for students during class. Use iFrame function in Rapidweaver so videos are automatically housed on class website for ease of access. Also, given the need for spontaneously produced instructional videos in response to misconceptions that arise during the “Explore” day of the learning cycle, this app allows me to create quick instructional videos for students.
Collect student summaries of instructional videos. Collect and grade multiple choice responses to quizzes and tests (I use the scripts Flubaroo for item analysis and MCQ to email students their score and feedback). Collect and analyze group lab data. Collect student and parent course evaluations. Peer Instruction “buzzers” (see sidebar on right).
Shared “collections” for student lab and writing portfolios. Public google document for “Virtual Review” before exams.
Enter student instructional video summaries to generate word cloud. Use this to stimulate Q & A about instructional video before “application” phase of learning cycle.
Screenflow, Jing, & Quicktime:
Record teacher and student produced instructional videos.
Coaches Eye (iPhone):
Use app to record and deconstruct student lab work.
VoiceThread (iPhone):
Record teacher and student produced worked examples.
Wacom Graphire & Wacom Bamboo tablets (w/wireless adaptor):
Use for mobile instruction/modeling when needed during class.
Use in conjunction w/ tablets to annotate over pdf documents during class.
Use in conjunction w/ tablets to annotate over blank screen during class.
Use in conjunction w/ tablets to annotate over screen, videos, slides, etc. during class and in instructional videos.
Use in conjunction with tablets to annotate over pdf documents in instructional videos (clunkier than FormulatePro but easier to change color).
Use to time students when taking AP style exams, working on challenge problems, or negotiating any task that is timed during class.
Use as a backchannel for students and groups to ask general questions to me or the class during problem solving sessions.
Use to obtain clips from DVDs for #anyqs style video clips during “Explore” phase. See “movies” tab above for examples of clips.
Use to obtain mp4s of YouTube clips to integrate into instructional video and Keynote slides for #anyqs and general demonstrations.
Vimeo & YouTube:
Publish instructional videos. Both allow for time-marker integration and annotations to help scaffold videos.
Use to generate iTunes feed for instructional videos.
IPEVO & Boardcam (iPad):
Use as doc cam to showcase student work, demonstrations and lab set up.
Logger Pro w/ Vernier:
Use for data analysis during “Explore” day of learning cycle.
Use video camera function to record class demonstrations and student work.
Use as “home base” for activities during “Explore” or “Apply” day. Also, #anyqs pictures and videos are housed in the keynote slides as ways to begin “Explore” day of learning cycle.
Students explore the core objective of the cycle first and reflect on their own constructs, models, and ideas together in class. The subsequent screencast provides an introduction to lower-level content standards (definitions, etc.), and during the application phase, students work through higher-order standards collaboratively. Standards are then noted next to each quiz question and students track their performance, as do I. Each standard is graded and entered into the gradebook individually. Only the standards are placed in the grade book:
Student Tracking Snapshot
I find this makes the reassessment process much easier, as students know which standards they are still not proficient in, and I can then design reassessment opportunities on the spot that are more meaningful and rigorous (usually by writing questions spontaneously on class whiteboards) without creating an entirely new assessment. I enjoy doing this in class rather than online (moodle, etc.) as it allows for conversation following the reassessment. If students are ready to reassess without extra help, they may do it on the same day. If they need extra tutoring from me, I decide on a reasonable time, usually the following day, in order to create a more authentic and valid assessment experience.
If an assessment given in the future includes a previously assessed standard, for now, I am simply replacing the old score with the new one. I have a hunch this is where I have a lot of growing to do, but for now, it seems to be working with students, and in a course that concludes with a year-long cumulative examination, I am hopeful it will motivate students to recall their specific strengths and weaknesses more directly.
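In code terms, the replacement policy is just an overwrite keyed by standard. A toy sketch (the student and standard names are made up) of how the gradebook behaves:

```python
# Toy standards-based gradebook: the newest score on a standard
# simply replaces the old one, mirroring the replacement policy.
gradebook = {}

def record(student, standard, score):
    """Overwrite any previous score for this student/standard pair."""
    gradebook.setdefault(student, {})[standard] = score

record("Alice", "Graham's Law", 2)
record("Alice", "Ideal Gas Behavior", 4)
record("Alice", "Graham's Law", 4)   # reassessment overwrites the earlier 2

print(gradebook["Alice"])
```

An obvious variant is keeping `max(old, new)` so a weaker later attempt cannot erase demonstrated proficiency; the straight replacement above reflects the current policy, growing pains and all.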
What interests me most about SBG and the associated reassessment process is the way students naturally talk about the material. Rather than discussing issues around “Quiz X” or “Exam Y”, students now use terms like “I really get Standard X” or “I am completely lost on Standard Y”. When students come to reassess, their comments are also much more directed. “Mr. M, can I reassess Standard X?” is now a common statement, rather than the infamous, “I need to re-take Exam Y”. The former gives me, and the student, much more critical information. I hope :).
Below is a screenshot from a student that demonstrates this idea:
If you are interested in SBG, I recommend reading Frank Noschese’s blog. His resources are amazing and provided the energy I needed for my SBG process this year.
- Good science instruction should inspire students to construct their own knowledge.
- I teach at an urban Catholic high school, and although improving, our schedule is still very limited (primary reasons include additional school holidays and a lack of athletic facilities, which requires students to miss class often).
Given this reality, in order for me to make it through an entire AP Chemistry curriculum AND encourage students to construct their own ideas first, I must strike a balance between inquiry and teacher-facilitated instruction. Perhaps when I become a more experienced and well-versed instructor, I will be able to move through an entire AP Chemistry curriculum in a way that completely removes direct instruction from the picture.
In the meantime, rather than stepping in to fill in knowledge gaps and address misconceptions in class, doing this via annotated and narrated screencasts works very well for me, and for my students. (See “Explore-Flip-Apply” model below). Students get an opportunity to struggle with concepts in a collaborative and hands-on fashion first, and then use the homework space (only a few times a week at most) to learn key phrases, definitions, and models from me, that I feel push them through the curriculum at a good clip, in my voice and my handwriting.
Thus, it is VERY clear that the above process is more a function of MY situation than anything. A lot of great information, knowledge, and wisdom can be found when sifting through the past month's debate, and for me, I was able to develop a model that I feel is working. But (deep breath…) that's relative to me. For all interested in flip teaching, I encourage you to reflect on your own practice, what works for YOU, who YOUR students are, and what resources they have. Then, perhaps, aspects of flip teaching could help address a few of the roadblocks you might encounter.
So, why am I writing this, and why is it titled “On-the-Go Responses”? To be honest, going back and forth about pedagogical differences, the efficacy of flip teaching, etc., has totally burnt me out. Given this, I have been asking myself lately: why did I begin to do this in the first place? The answer is simple.
One day, 6 years ago, before Dan Pink even name-dropped the term “flip”, I was frustrated with the time it took to go over homework in class, and decided to post video solutions online. This led me to a fascination with mastering the tools needed to make this happen in a fluid and clean way for students and teachers. Done. I got super nerdy about the technology and how it saved a few minutes of class time for me, and that was that.
So, lately I have been thinking about a similar thing, and rather than surround myself with debate on a grand scale, I thought it would be fun to get back to the nerdy tools that fascinated me in the first place. I got an email from a student asking for help on a problem while I was in the car yesterday (yes, like an idiot I looked at my phone while driving; I regret this…). I pulled over and started typing out a long explanation. Then, I remembered a post from Kyle Pace on Twitter about the new VoiceThread iPhone app.
I pulled up the app, and immediately realized I could only annotate over pictures. So, in an attempt to turn VoiceThread into Screenchomp, Replaynote, Explain Everything, or Showme (iPad programs that allow you to record video tutorials on a blank white screen), I took a few pictures of a white sheet of paper I had in my car and recorded solutions to the problem over those images. In five minutes, not only was I able to send the solutions off to the student, but on their end, they received an explanation that is not only cataloged for other students to watch but, like all good instructional videos, maximized their audio and visual working memory channels.
Simple, but quick, and without an iPad. I think I'm going to use this method to respond to all student inquiries regarding difficult problems while I'm away from my computer. Below is a simple tutorial I created showing how to do this. I used BoardCam (an iPad app that turns the iPad into a doc camera) to record this tutorial. It was my wife's iPad :). Ignore the music at the beginning of the video. Forgot to turn Pandora off…
This is my 5th year trying to implement an effective model of the “inverted classroom” (Lage, Platt, & Treglia, 2000) in my AP Chemistry class. I say “trying” as that is exactly what the past 5 years can be reduced down to: an attempt. While class-time was opened up for student problem-solving, and the video responses and reflections were amplified via the use of a google form as a tracking device, students seemed to be passively learning the material, at best. For all the benefits of flip teaching with respect to class-time, I now realize the HUGE negative was not flip teaching as a pedagogy, but simply the order of learning activities. Students come to my class with a rich and diverse prior knowledge, derived from 17 years of living “in” the subject. In the previous model, while my focus was on using class-time effectively, I failed at giving my students an opportunity to access their prior knowledge, tackle their misconceptions actively, and work to construct their own meaning FIRST. Derek Muller explains this extremely well in his video Khan Academy and the Effectiveness of Science Videos.
To address this issue, my first step was to RE-ORDER the way my class is structured and give students an opportunity to construct their own ideas and models BEFORE learning anything directly from me. Because I still passionately believe in the time-shifting benefits of flip teaching (added classroom time, a catalog of basics, a focus on problem solving, etc.), my goal was to merge inquiry learning with flip teaching to promote knowledge construction, while also opening up class-time by off-loading any aspects of direct instruction as homework via annotated screencasts. I am definitely a rookie in this regard, and given the pace, content, and high-stakes nature of an AP Chemistry class, I decided to make a list of all things factual, mechanical, and low-level (definitions, equations, a few examples, etc.) and create instructional videos around those ideas only. All other forms of learning are incorporated in an Explore-Explain-Apply learning cycle. Because the “explain” portion is off-loaded to the homework setting, I refer to the cycle as “Explore-Flip-Apply”. Mayer (2004) articulates the goal of this process well: “Students need enough freedom to be cognitively active in the process of sense making, and students need enough guidance so that their cognitive activity results in the construction of useful knowledge.”
Basically, there are still things that I, as the instructor, want control over teaching. I just won’t be using class-time to teach those concepts. I fully accept that this is where the model diverges from a truly student-driven inquiry learning cycle. Even though I do play an active role in the “flip” phase of the cycle, not front-loading students with content as I did in the past, but rather giving them at least one opportunity to form their own models first, has felt like an effective merger of both pedagogies…for me. Anecdotally, my students seem to be much more invested in the laboratory activities, and more motivated to apply their knowledge towards complex problem solving given an initial phase of exploration. A student approached me today, and I feel his comment sums this process up well. Word-for-word quote: “Mr. M. In all my other classes, we learn all this complicated %&$* first, then do boring labs. In this class, the labs kinda make me think, and then you help me understand during the vids. I guess it helps me understand what my answers mean, or something…” Beyond the Napoleon Dynamite-esque lingo lies, for me, subtle evidence that I am working towards a model of Flip Teaching that is sustainable, effective, and respects the way all my students naturally “want” to learn.
Below is an example of one “Explore-Flip-Apply” cycle. I will be posting different examples frequently throughout this year, and conclude with an action research report on the efficacy of the project in May, with a midterm report in January. As an aside, this re-structuring has also opened the door for me to touch on a wide range of strategies, not solely the inverted classroom. Other strategies addressed in “Explore-Flip-Apply” include:
Explore-Flip-Apply (Example 1)
Day 1: Explore
Step 1: Opener (~ 10 minutes)
The following question is displayed: Why is salt placed on icy roads in the winter? I use a variation of Peer Instruction to facilitate this process: a) Students work for 3 minutes to answer the question individually on their opener sheet. b) Students then group up (3 or 4), share their responses, and agree on a collective answer. c) One student “buzzes” in an answer from a smart phone or computer using a google form embedded in the class website, designed to collect both multiple choice and free response openers. d) I display the google spreadsheet where the data is collected, and we as a class investigate all answers, discussing trends, commonalities, etc. I never explicitly give them the solution to the opener when it is collected on Day 1, as the purpose is purely exploration of concepts.
Step 2: Lab Exploration (~ 65 minutes)
Students are given a lab worksheet (Yes, I love the old paper-based lab worksheet action!) where, after a pre-lab discussion, they work in groups to develop and outline a procedure to answer the following question: How does the addition of sodium chloride affect the boiling point of pure water? This is where aspects of Guided Inquiry enter, as students are given a research question and asked to design their own procedure. Students were only given the following materials: temp probe/computer w/ LoggerPro, two beakers, glass stirring rod, table salt, and a hot plate.
In the “data” section of the lab worksheet, students are asked to provide both a data table and a graph. An example of a graph gathered from one group’s procedure is below:
Students then work together to write conclusions and provide an “explanation” of the phenomena in their lab worksheet. Explanations are translated onto class whiteboards, and we spend the last 10-15 minutes of class discussing their explanations group by group. This may bleed into the “application” phase the following day. I guide this process without ever actually revealing the correct answer to the initial question posed in the lab. Various group procedures are highlighted and trends between groups are noted. This process might continue into the next day; however, I usually plan lab explorations to take about 45 minutes, allowing time for an opener and group presentations. My classes are 75 minutes long.
Night 1: Flip (Instructional Video)
Students watch a screencast instructional video where I introduce additional concepts and definitions/equations, and provide two problem-solving examples that relate to that day’s exploration. The purpose is to build on their exploration by introducing more structured concepts, providing any mechanical knowledge (definitions and equations), and briefly modeling a few problems. I am still trying to figure out exactly how much information to include and what to leave out during this phase. I find myself falling into my old bad habits of providing too much information and not letting the inquiry, and subsequent application phase, play a larger role. Perhaps I need to reflect on this Clough and Kruse (2010) article more? In order to engage students in the video process, and also promote reflection, a Google Form is embedded DIRECTLY BELOW the video that asks the students to provide a structured summary of the video according to a guide I provide for them. Additionally, the video ends in the middle of the second example. Students are asked to complete the problem and provide the numerical answer in the box labeled #2. Click here for an example of the video and form layout. My hope is that by asking students to reflect via a summary, and complete a problem, I am addressing both the conceptual and algorithmic sides of the concept, and also obtaining information about what students struggle with via their responses (they are asked to indicate something they did not understand or still have questions about). This is where aspects of JiTT enter. The video for the exploration phase described above is below, along with a screenshot of the form and Google Spreadsheet where data was collected:
Day 2: Apply
Activities on the “application” day vary from more directed lab application tasks, to individual/group problem-solving sessions, to challenge problems and class competitions. Students have problem sets we refer to as “Learning Packets” that house the majority of practice problems often used during the “application” day. Click here for an example of a Learning Packet designed around “Free Response #4” on the AP Chemistry examination. Below is an example of an application day that involved a more specific variation of the lab activity from the previous day described above. Guided Inquiry is used again, but informed by the screencast lecture.
Step 1: Opener (~ 10 minutes)
This follows the same Peer Instruction model described above. This time, the question is more specific (usually an AP multiple choice question). After individual attempts and group discussion, groups buzz in their answer and we collectively go over responses by displaying the Google Spreadsheet. I highlight groups who obtained the correct answer and keep track of this as a motivational tool for the opener. We critique wrong answers and discuss the logic behind the test construction of that item (good and bad distractors, etc.). See spreadsheet below:
Step 2: Lab Application (~ 65 minutes)
Students are given a blank sheet of paper to show their work en route to answering the following question: What mass of sodium chloride do you have in your tray? Prior to the lab, I measured the same mass of sodium chloride for all groups (50 grams). Students are instructed NOT to use a balance, but instead to use the concepts they learned in the night’s lecture to obtain the mass of sodium chloride provided. Although students’ lab procedures ended up being fairly similar to the prior exploration, the specific task of determining the actual mass of sodium chloride forced a merger of skills constructed in the exploration phase and applications learned in the instructional video. Students were only given the following materials: temp probe/computer w/ LoggerPro, two beakers, glass stirring rod, hot plate, and 50 grams of sodium chloride.
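The arithmetic behind this task is the colligative boiling-point elevation relation, ΔTb = i·Kb·m. A minimal sketch of the back-calculation students perform is below; the constants are standard values, but the measured elevation and water mass are invented for illustration:

```python
# Back-calculating the mass of NaCl from a measured boiling-point elevation,
# using delta_Tb = i * Kb * molality. Measured values below are illustrative.
KB_WATER = 0.512          # ebullioscopic constant of water, degC*kg/mol
VANT_HOFF_I = 2           # NaCl dissociates into ~2 particles (ideal behavior)
MOLAR_MASS_NACL = 58.44   # g/mol

def nacl_mass_grams(delta_tb_c, water_kg):
    """Grams of NaCl implied by a boiling-point elevation in water_kg of water."""
    molality = delta_tb_c / (VANT_HOFF_I * KB_WATER)  # mol NaCl per kg water
    return molality * water_kg * MOLAR_MASS_NACL

# Example: a 1.75 degC elevation observed in 0.500 kg of water
print(round(nacl_mass_grams(1.75, 0.500), 1))  # ~49.9 g, close to the 50 g provided
```

In practice, non-ideal dissociation means the effective van ’t Hoff factor is slightly below 2, which is itself a useful discussion point on the application day.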
Night 2: Prepare for Quiz
Students prepare for a quiz the next day by finishing problems in their Learning Packets. Quizzes usually have a total of four questions and ask students to apply and synthesize concepts from the application day. Quizzes are standards-based, and I allow students to reassess as they strive towards mastery of the standards (many different versions of the quizzes are made to facilitate the reassessment process). Students must wait at least one day after meeting with me for additional instruction before reassessment. Click here for an excellent post that describes the logic behind separating the re-teaching and reassessment processes. Although I provide opportunities for students to reassess, I have a hard time merging “Explore-Flip-Apply” with an asynchronous mastery learning system. Because emphasis is placed on student construction of knowledge during the “explore” phase prior to video instruction, I find it easier to keep all students on the same cycle, rather than monitoring which videos each student has progressed through and making certain that they were exposed to the laboratory BEFORE each video. To keep this cycle intact, I publish each video sequentially, as the associated exploration phase ends. To keep advanced students motivated, I scaffold the “application“ day to provide additional resources and challenge problems.
Day 3: Quiz and new cycle begins
1. It is insisted that constant student engagement is better than the “sage on stage” method of direct instruction. Then how do you justify the generations of students that have successfully been instructed by good teachers through some direct instruction? We’ve all had teachers that have the gift of the spoken word and have been effective at teaching students necessary content. And the teacher “knows” that content is being taught to a student instead of guessing through self-directed learning.
In reading this question, I am reminded why the most common advocates of flip instruction are science and math instructors. Disciplines where students will ultimately face some sort of algorithmic challenge lend themselves very well to flip instruction. Students can re-watch the problem-solving process, and having a cataloged database of videos that address common strategies is very helpful. For example, in chemistry, balancing a chemical reaction is an essential skill that can be found in almost every single chapter in the second semester of a standard high school course. Providing video instruction for such a skill can be nothing but helpful for a student that needs the information repeated, chunked, and readily accessible.
Personally, my high school history instructor was a genius at lecturing. His stories, and his method of explaining historical phenomena and relating them to our everyday lives in a way that elicited critical thought and inquiry, were amazing. I wanted nothing more than to sit in his classroom and soak in the information he relayed to us. I was truly learning in his class. Not all subjects lend themselves to the concept of video-based direct instruction, and more importantly, neither do all teachers. Whether you embrace a purely constructivist classroom, where the heart of instruction comes from student exploration and the scientific method, or you lecture, the question “to flip or not to flip,” I feel, is the same: are YOU using your precious class time on the activities YOU feel best promote meaningful learning with YOUR students?
If the answer is no, then perhaps there is something that you could off-load to the homework setting, and if you feel there is, video-based instruction takes advantage of our human cognitive architecture in a way that can help promote retention and long-term schema formation. I think :) For me, any use of video-based instruction that saves class time can be considered “Flip Instruction”. It doesn’t have to be the entire mode of information transfer, and it MOST DEFINITELY doesn’t require losing the enjoyment that comes from delivering instances of direct instruction to students during class. I am a huge proponent of Flip Instruction, and I spent a majority of my day today in AP Chemistry standing in front of the classroom, helping students understand a complicated process that could never have been properly addressed in the video they watched the night before, and that I failed to help them construct during the “exploration” phase (see response to question #2 below). Simply, I see Flip Instruction as a way to a) open up ANY AMOUNT of class time for more meaningful student engagement and b) create a more interactive and accountable homework experience for students, especially when merged with other technologies such as Google Docs/Forms, and other ways to reflect individually and collaboratively outside of the classroom. Click here and navigate to the “tracking” tab for a simple example.
2. Speaking of bad power points and lecturing; isn’t a video the student watches at home basically the same thing? What’s the difference between my 15 minute power point at school and the 15 minute power point at home? I mean besides the fact that I’m right there to help answer questions. Isn’t that direct instruction?
I completely agree with this critique, and personally, I feel the future of Flip Instruction as a useful pedagogy, especially for science educators, lies in our ability to address this question. Over the past 6 years, I have experimented with many different models of Flip Instruction, and have ultimately come to the same conclusion: front-loading information is bad. Even if it saves classroom time, and even if students get all the multimedia benefits of a screencast lecture (interactivity, modality, segmentation, etc.), true learning, I feel, happens best when students construct their own knowledge while we act as facilitators in that process. With that said, I absolutely love the concept of off-loading aspects of information transfer to the homework setting, and therefore find that redefining, or broadening, our definition of Flip Instruction is essential.
This year, I am doing a variation of the “Explore-Explain-Apply” model outlined by Karplus (1977). In this adaptation (“Explore-Flip-Apply”) students are part of an inquiry learning cycle where open-ended, inquiry driven exploration occurs first, followed by video instruction (the flip) to address any student misconceptions, and transfer any necessary factual information that was not addressed in the exploration phase. Video instruction is then followed by a classroom application phase where information is used to solve real-world problems, and in my case, AP style problems (mainly free response). This is one example of how the basic concept of Flip Instruction (lecture for homework, homework in class) can exist as a sub-component of a larger learning cycle. Ideally, even the explanation/video phase could be designed, created and distributed by students, and programs such as Explain Everything, Replaynote, Screenchomp, and Showme are excellent first steps in providing a student friendly way to create and share instructional videos with simple technology.
3. What about reading? Flipping classrooms seems to be all about the video experience while seemingly totally ignoring reading. Is this a wise course of action? I saw a comment on Twitter that insisted that no reading should ever be assigned without something interactive attached to it. Aren’t we downplaying the importance of the written word?
I love this question, and it got me thinking more than any other. Ultimately, I default to my definition of Flip Teaching: “…moving any aspect of direct instruction from the classroom to the homework setting using teacher produced, annotated and narrated screencasts.” This is a definition that I worked on developing for the YouTube Teacher’s Studio this summer in Seattle, WA, and my attempt was to create a broad, more inclusive definition, rather than Sal Khan’s comments, which I feel oversimplified the process (see comments under question #2 above). Sticking to this definition, video instruction is used to address aspects of direct instruction that could be more effectively and efficiently addressed in a one-on-one video setting with a student plugged in. Thus, in my own experience, it has allowed me more time for promoting in-class literacy skills and critical reading activities when I am there to help guide them. I still assign reading, and have simply found that by using class time to help students deconstruct a text, while simultaneously assigning a video that helps explain algorithmic processes, etc., the reading experience outside of class can be more directed.
Either way, I teach Chemistry, and beyond deconstructing a worked example, there isn’t much reading. With that said, our AP Biology teacher has posted videos where she models the reading process for students using a document camera or the e-book, and then assigns various readings where the students are to repeat this process. I consider this Flip Instruction, as the modeling process did not occur in class, and the students can revisit it throughout the year. All in all, literacy is increased in a way that embraces 21st century tools with 21st century learners, while also encouraging students to interact with written material. More interestingly, I have had many conversations with English instructors in the past week about Flip Instruction, and we have arrived at a variety of applications that, I feel, only promote good literacy. One teacher had a particularly interesting application. Her assignment read something like this: 1) “Read pages X-Z in Catcher in the Rye” 2) “When you finish reading page Y, watch video” 3) “Complete reading pages X-Z” 4) “Submit Google Form with your reflection” While parts 1, 3, & 4 are standard, it is part 2 that incorporates aspects of Flip Instruction. In part two, she recorded a screencast of herself reading a certain, critical passage in the text where, for example, Holden Caulfield demonstrates an important behavior that is crucial to understanding the text, but might not be easily interpreted by students. In the screencast, she helps direct students’ attention and underlines certain key phrases, while providing a conversational-style narration about her thoughts. She does not reveal her direct interpretation, but through the video, increases the chances that all students might have a more meaningful reading experience. The next day in class, students debate the meaning of the reading and the key passage with one another.
I am most definitely a rookie when it comes to anything beyond science instruction; however, in working with this teacher over the past week, she noted an increase in student engagement, not only during class, but during the reading process. Students said they looked forward to the section she would “help them read” and felt her direction not only added a motivational boost, but helped to make the reading process more interactive. The teacher noted that she is especially excited to continue this process when they get to Shakespeare, as helping students negotiate complex language and meaning is what occupies much of the class time. All in all, I feel good teaching should always help students build literacy skills, and as online tools, especially video, become the norm in classroom instruction, we as educators need to promote creative ways to use this technology to create a design infrastructure that amplifies all skills. Farb Nivi, CEO of Grockit.com, addresses this well when he states: “The problem with education is not one of engineering, but one of design.” The tools (video, etc.) are not the problem. How we couch these tools in a design infrastructure that encourages development of all skills (literacy, problem solving, inquiry, etc.) is the real goal. Flip Teaching, I believe, is one powerful design tool that happens to harness aspects of current engineering.
4. And finally, what happens when students don’t have Internet? I’ve asked this a dozen times and I either get ignored or I get “well all children need to have Internet to be successful in the 21st Century environment. The government needs to make Internet penetration in this country a priority for all students or we are doing them a massive disservice.” Yeah, thanks for the public policy message but that still fails to answer my question. How do you flip a class when half the kids don’t have online access?
The accountability of my flip process revolves entirely around student submission of a form, embedded directly below the embedded video (shameless plug for my opinions regarding managing extraneous cognitive load). Click here for an example. My rule is that students must have the form submitted (via the spreadsheet time stamp in Google Docs) 1 second before the bell rings for class that day. Thus, not only do I not require the video to be watched and processed at home, I encourage students to use resources made available at the school, prior to class, to view the instructional videos. I have noticed, anecdotally, that providing the conceptual and algorithmic reflections I require in the form in close proximity to class application (see “Explore-Flip-Apply” model noted in question #2) increases student engagement. Moreover, because I couch the Flip process in an inquiry cycle, students usually only have ~2 instructional videos/week assigned. With that said, I am fully aware that not all schools have a surplus of computers students can access, and that not all students can arrive at school with the 20 minutes to spare required to watch the videos on site, before or after school.
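The time-stamp check itself is simple. A hypothetical sketch is below; the names, times, 8:00 bell, and timestamp format are all invented for illustration (Google Forms’ actual spreadsheet format may differ):

```python
from datetime import datetime

# Assumed class start time; submissions must precede the bell.
BELL = datetime(2011, 9, 20, 8, 0, 0)

# Invented rows as they might appear in the form's response spreadsheet
submissions = [
    ("Student A", "9/20/2011 7:42:10"),
    ("Student B", "9/20/2011 8:03:55"),
]

for name, stamp in submissions:
    ts = datetime.strptime(stamp, "%m/%d/%Y %H:%M:%S")
    # Parse the spreadsheet time stamp and compare it to the bell
    print(name, "on time" if ts < BELL else "late")
```

In practice, a spreadsheet formula comparing the timestamp column to the class start time accomplishes the same thing without leaving Google Docs.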
I teach at an inner city Christian Brothers Catholic high school in San Francisco, and while many of our students do have internet access (either at home or via smartphone devices), a majority of our student body comes from a variety of underserved areas in our community. Keeping in mind these students, and others for whom internet access may not be readily available, I always start with the same sequence of questions to determine access and availability. First, I hand out a sheet of paper to students and have them respond “Yes” or “No” to the following questions: 1) “Can you watch and hear a video from the internet on your phone or on a computer at home?” 2) “If you answered no, do you have access to a computer at school or in your community where you can watch and hear a video on the internet at least 2 times/week?” 3) “If you answered no, do you have access to a DVD player at home where you can watch and hear videos at least 2 times/week, for 10-15 minutes each?” I quickly shuffle through the papers and look for any students that answered “no” to all questions. Usually there are 1-2 students a year in this situation. For these students, I have acquired a few laptops over the years, via eBay, etc., and I loan them the laptops for the year.
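Tallying that survey amounts to flagging students who answered “no” to every question they reached. A trivial sketch, with invented names and responses:

```python
# Each student's list holds answers to the access questions they reached;
# students who answer "yes" at any step skip the remaining questions.
# All names and answers below are invented for illustration.
responses = {
    "Student A": ["yes"],               # internet at home or on phone
    "Student B": ["no", "no", "yes"],   # no internet access, but a DVD player
    "Student C": ["no", "no", "no"],    # no access at all
}

# Candidates for a loaner laptop: "no" across the board
needs_laptop = [name for name, answers in responses.items()
                if all(a == "no" for a in answers)]
print(needs_laptop)  # ['Student C']
```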
I am COMPLETELY aware that this low number is entirely due to my situation. With that said, I am fine spending my money on serving these students, and actually, I am now a passionate “hoarder” of any device that can access the internet, in preparation for trying to provide access to all my students. It’s a hobby of sorts :). For students that only have DVD access (around 5 every year), I burn DVDs of the lectures and have created a paper variation of the Google Form other students complete, which they fill out by hand. The reflection process in my practice is absolutely essential, and although the written responses are not nearly as valuable as the cataloged Google Form responses, they do serve as a necessary bridge helping students with limited access participate. In a sense, I am dancing around the access issue; however, this process has been almost foolproof, and there have actually been a few years where all students had access. I would suggest, before assuming that access is a big issue, creating a process such as this, where the instructor sequentially presents access options to students, in order to get a real number of how many actually can’t access videos online.