"Nothing that is worth knowing can be taught."

Oscar Wilde (1854-1900)

“Nothing that is worth knowing can be taught.” I don’t agree wholeheartedly with this statement, but it sheds some light on my experience and on what I’ve learned while studying educational technology. I don’t think Oscar Wilde was being cynical toward teachers. He was talking to learners, impressing upon them that meaningful knowledge and wisdom come from experience.

This is my fifth year teaching algebra to eighth graders. On my first day of teaching, I entered a classroom with a textbook and some markers for the overhead projector. I had been hired in January to replace a teacher who hadn't shown up to teach since September. It wasn't the best of situations, but I made it work, and by the end of the year my students were well-behaved and had even learned some math. It's hard to believe that was only four years ago. I've since learned that there are better ways to educate a person than the hasty, stop-gap measures I implemented during that first half of a year. I started off sitting at the overhead projector and lecturing to my students. Now I use a digital presentation station and my students work in collaborative groups.

I have taken Oscar Wilde's message to heart while planning lessons for my students and leading trainings for adults. As an instructional designer, I’ve learned that it’s my job to set up a challenging learning environment, motivate my students, and provide the tools necessary for success. I’ve grown to see myself not as the ‘sage on the stage,’ as the saying goes, but rather as the ‘guide on the side.’ I provide the boundaries of the learning experience and my students learn by taking part in it.

Nearly two years ago, I began the COMET program naively thinking that I had signed up for a master’s degree in learning how to use cool technology in the classroom. My classes, I imagined, would introduce me to new websites, software, and hardware that would revolutionize my teaching methods and change the way my students interacted with computers and with each other. This was going to be two years of fun.

How wrong I was. While I did learn many of those things (and had fun doing it), they were only the icing on the cake. The cake itself was made up of instructional principles, models, and learning theories that provided meaningful ways to design instruction and implement innovative new technology in the classroom. This recipe wasn’t about using technology for technology’s sake. It was about collecting and analyzing data, applying theory, and drawing on experience to design high-quality instruction. I went into the oven as a technologist and I’m emerging as an instructional designer.

A is for analysis

Analysis, design, development, implementation, evaluation: every time I hear these words, they take me back to the beginning. ADDIE was the first model I learned, and it was my introduction to the world of instructional systems design (ISD). It has served as a foundation for everything that I’ve learned since.

While all steps in the ADDIE process are important, analysis has had the most profound impact on my practice. When I think of how I used to approach planning instruction, many clichés come to mind. I used to “go with my gut.” I would “trust my instincts.” I’m sure there are others, but the point is, I never took the time to gather data and analyze it. It always seemed so time-consuming, so overwhelming. Collecting data made me think of lists of numbers, statistical functions, and graphs. Why would I go through all that trouble to plan a lesson for one day? I didn’t need to survey my students to know they didn’t know how to do the next day’s lesson. Of course they didn’t; they hadn’t learned it yet. So analysis got swept under the rug in favor of more pressing things.

What I didn’t know about were all the different types of analysis that were at my disposal and the kinds of information that each analysis provided. Mager and Pipe (1997) introduced me to training needs assessment, and to performance, task, environment, and goal analyses. What I found so interesting was that conducting these analyses didn’t have to take as much time or effort as I previously thought. These tools could quickly help me answer questions like:

  • Who’s involved?
  • Where are they?
  • What do I need them to be able to do?
  • What gaps in performance exist?

Over the past two years I have worked hard to integrate analysis into my planning routine throughout the school year. At the beginning of the year I perform a goal analysis to set specific goals for my students to reach before high-stakes testing, such as the California Standards Test (CST). As a math teacher I use task analysis on a daily basis to determine the best way to teach students how to solve complicated multi-step equations. I use performance analysis after every project, test, or quiz to compare student scores against each other, against other math classes, and against my expectations.

In addition to using the Mager and Pipe-style analyses, I have also found a greater appreciation for analyzing numerical data using basic statistical functions like mean, median, mode, and standard deviation. My group members Jason Barclay, Jennifer Ellis, and I performed data analysis in Ed 690 while researching the effects of synchronous and asynchronous discussion on student learning outcomes. Collecting and organizing the data allowed my group to draw meaningful conclusions from lists of numbers in a spreadsheet. This is not the type of analysis I use on a regular basis to guide instruction, because it is so time-consuming. However, it is an essential tool when analyzing student test results on district benchmarks and on high-stakes tests. Websites such as DataDirector allow me to analyze student test data, which lets me know which skills need review or which skill instruction needs improvement for the next school year.
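To give a sense of the kind of basic summary statistics described above, the short Python sketch below computes mean, median, mode, and standard deviation for a set of quiz scores. The scores are made-up sample data for illustration only, not results from Ed 690 or any actual class.

```python
# Sketch: summarizing a set of quiz scores with basic statistics.
# The scores below are invented sample data, not real student results.
from statistics import mean, median, mode, stdev

scores = [72, 85, 85, 90, 68, 77, 85, 93, 60, 88]

print("mean:", mean(scores))                 # the average score
print("median:", median(scores))             # the middle score
print("mode:", mode(scores))                 # the most common score
print("std dev:", round(stdev(scores), 1))   # spread around the mean
```

Even this small summary supports the kinds of conclusions mentioned above: a median well above the mean, for example, suggests a few very low scores are pulling the average down, which points to specific students needing review rather than the whole class.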

While it hasn’t been easy, building analysis into my routine has paid dividends. Student achievement is on the rise, and I am more confident in my instructional decisions because I know they are not being made haphazardly. Analysis provides me with a framework for organizing and drawing conclusions from data that, in most cases, I already had.

Beginning with the end in mind

Writing a good math test is hard. You need to select problems that are good representations of the skills covered in the chapter, being sure to include important special cases. Both computational and conceptual knowledge should be assessed. A word problem or two is also a good addition. And the test should serve as both a formative and a summative assessment: formative in the sense that it provides information about student progress in preparation for high-stakes testing, and summative because it measures student achievement on a given skill set.

While a good math test is difficult to write, it’s even more difficult to plan instruction that effectively prepares students to perform well on a test. While I was designing an AVID (Advancement Via Individual Determination) training for math teachers in Edtec 544, Dr. TJ Kopcha introduced me to a simple principle that has completely changed the way I plan instruction: beginning with the end in mind. This means making sure that the skills being tested align with a learning objective and with opportunities for practice. For example, if I want students to solve quadratic equations on a test, there had better be a well-written objective describing what students will be expected to be able to do. And the practice that I provide for that skill must use the same type of problem that is on the test. In other words, if students are assessed on drawing a linear graph, then they need to practice drawing graphs. If on the test they are asked to choose the correct graph from a list, then I should provide opportunities to practice choosing graphs from a list in their classwork and homework.

Beginning with the end in mind is so powerful because of its simplicity. I always knew this was a good practice, but I didn’t fully understand it until grappling with it in Dr. Kopcha’s class. For this reason I see beginning with the end in mind as a skill that is learned. It makes planning more purposeful, more deliberate. In many ways, it simplifies things. Assessment questions lead to good objectives, which lead to meaningful opportunities for practice. And all of this leads to tests that students are better prepared for because there are no surprises. Well-prepared students achieve higher learning outcomes and demonstrate greater confidence in their abilities.

Good design principles

We live in an increasingly on-demand society. With tools like smartphones, wireless Internet, and Google web searches, a vast amount of information is available at the touch of a button. When people want something, they want it now. They don’t want to spend a lot of time looking for it. Following good design principles, like those described by Williams and Tollett (2006), makes interacting with that information more appealing to the viewer and organizes it logically on the page.

When it came to drawing or painting, my artistic ability reached its limit sometime shortly after stick figures and finger-painting. However, that doesn’t mean that I can’t create nice-looking documents and websites. To achieve high-quality designs, I learned to use Adobe Photoshop, Flash, and Dreamweaver for graphic and web design. I used Microsoft Office and Apple iWork for word processing and presentations. I also used Weebly and Google Sites to create and edit websites online. Although I learned to use many different types of software, I applied the same basic design principles to each project.

I view design in two major categories: graphic and web design, and print (or text) design. As I am not a professional graphic designer (nor do I aspire to be), I have tried to stick to the basics. I’ve used the CARP principles to guide my graphic and web designs. CARP stands for contrast, alignment, repetition, and proximity. Effective use of these principles results in a nice-looking, logically organized graphic design.

According to Dr. Jim Marshall (lecture, October 7, 2008), “Presentation is at least 80% of anything you write.” When writing text for a training, math lesson, or website, I’ve learned to keep the reader in mind. A large monotonous block of text can be daunting to any reader and may discourage them from focusing on what the text is trying to say. To this end, I’ve learned to use bullets and tables to organize information and make the text more visually appealing to the reader.

I like design because I find it interesting that the way any given medium looks affects how the information it contains is perceived. I see this in my math class all the time. My students are more receptive to lessons with interesting pictures, color, and sparse text than to worksheets and textbooks. In a classroom with 35 eighth graders, I can use all the help I can get. If it means that I have to spend a little extra time making things look more visually appealing so that my students are a little more interested, a little more engaged in the learning process, then I consider that time well spent.

Moving forward

As I finish my coursework and prepare to receive my diploma, I feel ready to meet future challenges head-on and apply what I’ve learned about instructional systems design. But that doesn’t mean that I have no more growing to do. The more I learn about instructional design, the more I realize that I’ve just scratched the surface of a constantly changing field. We’ve studied proven ways to use Web 2.0 applications for education, but what happens when we get to Web 3.0? How will the rise of online learning affect K-12 education? What will be the effect on education as cell phones get cheaper and smarter? These are just a few big questions that loom on the instructional design horizon.

Despite how things change, core principles such as analysis, beginning with the end in mind, and good design principles will still apply.

  • New methods of analysis will certainly come about, but the core principle of using data to draw meaningful conclusions will always be there.
  • Delivery of instruction and assessment will change, but the need for working backwards and aligning objectives with practice and assessment will remain valuable.
  • As media continues to become more visual, practicing good design principles becomes ever more important.

Looking back at all I’ve learned, the recipe for my instructional design cake is made up of principles, theories, and models that, when mixed together, make a tasty dish. But as I continue to learn new things, I will add them to the recipe to spice it up, creating an increasingly complex dish that I can use to nourish my students who are hungry for knowledge.



Mager, R., & Pipe, P. (1997). Analyzing performance problems or you really oughta wanna. Atlanta, GA: CEP Press.

Williams, R., & Tollett, J. (2006). The non-designer's web book (3rd ed.). Berkeley, CA: Peachpit Press.