By Terry Greene

This is IDIGOntario's 2nd post of the #9x9x25 Challenge

When I signed up to be on the IDIG team I very vaguely said I would like to write something about the “Front End Analysis” phase of Instructional Design. Also known as the “What are we doing and why” part.

If you do this part well, you can avoid making big mistakes down the road. You might even realize that you shouldn't do it at all. You also tend to have that "come on, come on, let's get going!" feeling buzzing around you. I am feeling that right now as we prepare to try a new way of delivering Ontario Extend in January. But no matter how many angles you try to anticipate, something will surprise you when you implement it.

A good example of this came from the scholarly project I worked on to complete my ID Master's. I created an instructional Alternate Reality Game (ARG). It was designed to help youth identify problem gambling behaviours and know how to reduce their harm. I completed a lengthy front-end analysis in which I tried to anticipate who the learners were and what their needs would be to complete the game. I never really considered that some kids might not be up for suspending their disbelief in what was meant to be a fun way to learn.

English mastiff Tyra aka Chance the “missing” dog

The first test went great, with a group of ninth grade students who were asked to participate and agreed of their own accord. They had fun and were successful in taking the story to its conclusion. The final test run, however, was a different story. Working with the program facilitator for the gambling awareness group, we were able to test the game with an entire class of (I think) 11th grade students at an "alternative" high school. I don't recall too much about the makeup of the class or the reasons they had enrolled in a "different" kind of high school. In general, you could say that the students were rightfully kind of pissed off about how their education was going so far.

They didn't want to pretend. They didn't want to make believe. They didn't give a damn about rescuing a fake dog. They did the game activities, but it probably would have served them better to give them a list of problem gambling behaviours to look out for and harm reduction strategies to use, and to just have a discussion about how these things have affected their lives. I remember clearly the look one student gave me when he realized I was trying to trick him into playing along. It was utterly deflating. The results of the test run were that yes, the students reached the objectives. Learning was measured. But the feeling in the room was not the fun buzz I was working toward in the back of my mind. It was the stark opposite.

I'm going way over 25 sentences by digging into that anecdote. My point is that I did not anticipate, at all, that this idea of learning via a game would resonate so poorly with these students. I didn't ask the right questions, or enough of them, in my front-end analysis. JR Dingwall's post, in which he did ask the right questions to help bring about a great result, is what got me thinking about what questions to ask in the beginning.

So I ask you, what questions do you ask yourself and others when you first sit down to analyze an ID project? How can you avoid making something that leaves students feeling flat and misunderstood?

Terry Greene is a Program Manager at eCampusOntario, seconded from Fleming College where he is a Learning Technology Specialist. You can find him on Twitter @greeneterry

“Question?” flickr photo by spi516 https://flickr.com/photos/spi/2113651310 shared under a Creative Commons (BY-SA) license
