Instructional Technology: Innovation Challenges and Solutions in Education

Posted: February 1, 2007 in Adult Education, Education, Educational Technology, Instructional Technology, Technology Integration

One instructional problem that regularly challenges educators is innovation. Educators face new proposals each year and are inundated with requirements from the national, state, and local levels. They must sort through each innovation to determine what meets the needs of their students and what constitutes best practice. Some innovations are identified through course evaluation during the delivery process, some originate in training programs, and many are mandated by legislation without the benefit of local adoption. An educator's level of use of new ideas and programs is fundamental to achieving a needed or wanted innovation, and research in the field of instructional technology assists educators in processing and developing innovation.

Course improvement can take many forms, but a valuable approach is evaluation. Lee J. Cronbach points out that evaluation should determine what changes a course produces and should identify areas where a course should be revised. The evaluation process should be fluid and should take place at various stages of a course so that changes can be made midstream and throughout (Ely & Plomp, 2001). Cronbach warns that "It becomes immediately apparent that evaluation is a diversified activity and that no one set of principles will suffice for all situations" (Ely & Plomp, 2001, p. 123). He notes that evaluation allows changes to be made over the course of instruction and helps ascertain whether further revisions are needed. The outcomes observed should be general and should consider items "beyond the content of the curriculum itself-attitudes, career choices, general understandings and intellectual powers, and aptitude for further learning in the field" (Ely & Plomp, 2001, p. 133). As educators identify areas of course improvement, they gain personal control of the educational process.

Teachers can identify materials, resources, and technology that may hinder student success, and they can find and improve their use of other resources that assist in the educational or delivery process. Robert B. Kozma identifies the benefits of many forms of media that enable learning: the stability of text to aid comprehension, the advantages of pictures and objects to complement or focus learners, the positive aspects of viewing video with a purpose, the ability of computers to aid instruction via graphical models, and the merits and possibilities of multimedia in education (Ely & Plomp, 2001). Kozma argues that an educator's ability to take advantage of emerging technologies depends on continued media research (Ely & Plomp, 2001). The study of such research is vital if educators are to bring about innovation.

Training programs are key to introducing educators to research-based innovations, but simply attending in-service meetings, conference sessions, or specific training programs is not enough to sustain innovation. Donald L. Kirkpatrick lays out four evaluation steps that assist in assessing training programs: reaction, learning, behavior, and results (Ely & Plomp, 1996). Kirkpatrick defines reaction as how well trainees liked a training program; however, he cautions that such evaluations can mislead, since a dynamic presenter may not necessarily relay helpful or valid information (Ely & Plomp, 1996). Learning is more difficult to measure than reaction, and Kirkpatrick advises the use of measurable assessments that provide statistical evidence that trainees have learned specific information or innovations (Ely & Plomp, 1996). Changes in behavior resulting from training must be measured through the kinds of studies that have been established in past research, and Kirkpatrick advises that professionals (statisticians, research people, and consultants) can be very beneficial in this area (Ely & Plomp, 1996). Ideally, training programs have objectives stated in terms of results, since reaction, learning, and behavior all lead to that end. Results, or levels of use of an innovation, are the factors that ultimately indicate a training program's effectiveness.
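
To make Kirkpatrick's four steps concrete, the sketch below records them as a simple Python data structure for one training program. The field names, rating scales, and sample numbers are hypothetical illustrations, not part of Kirkpatrick's framework.

# A minimal sketch of Kirkpatrick's four evaluation steps as a record for
# one training program. Field names, scales, and sample scores are
# hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class TrainingEvaluation:
    program: str
    reaction: float   # how well trainees liked the program (e.g., survey average, 1-5)
    learning: float   # measurable evidence of learning (e.g., post-test gain)
    behavior: float   # observed change in practice (e.g., follow-up observation rating)
    results: float    # outcomes the program was meant to produce (e.g., goal attainment)

    def summary(self) -> str:
        return (f"{self.program}: reaction={self.reaction}, learning={self.learning}, "
                f"behavior={self.behavior}, results={self.results}")

# Example use with made-up numbers:
inservice = TrainingEvaluation("Fall technology in-service", 4.2, 0.8, 3.1, 2.7)
print(inservice.summary())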

There are various levels-of-use frameworks that enable administrators or educators to measure an innovation's effectiveness. Gene E. Hall, Susan F. Loucks, William L. Rutherford, and Beulah W. Newlove define eight Levels of Use: Level 0 (Non-Use), Level I (Orientation), Level II (Preparation), Level III (Mechanical Use), Level IVA (Routine), Level IVB (Refinement), Level V (Integration), and Level VI (Renewal) (Ely & Plomp, 2001). Each level identifies the degree to which a user has incorporated an innovation into their field, and ultimately marks the point where the user modifies the innovation or searches for and finds an alternative. The Levels of Use can be matched to models such as the Concerns-Based Adoption Model (CBAM) of G. E. Hall and S. M. Hord, which identifies and provides assessment for seven stages of concern: awareness, informational, personal, management, consequence, collaboration, and refocusing (Loucks-Horsley, 1996). In this model, people who are experiencing or considering change are found to have many questions. The model acknowledges the importance of attending to where people are and the questions they have, and it suggests that attention be paid over an extended period of time, since an innovation may require three years for early concerns to be resolved and later ones to emerge (Loucks-Horsley, 1996). Christopher Moersch's Levels of Technology Implementation also aligns with the Levels of Use model, identifying the following categories: nonuse, awareness, exploration, infusion, integration (mechanical), integration (routine), expansion, and refinement (Moersch, 1998). Any of these models can assist educators in implementing and evaluating the innovation process, and educators can benefit from using them to chart or define stages of success.
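
As one way to chart stages of success, the sketch below encodes the eight Levels of Use as an ordered Python structure with a small tallying function. The charting function and the sample ratings are hypothetical illustrations, not part of the published instrument.

# A minimal sketch of the Hall, Loucks, Rutherford, and Newlove Levels of Use
# scale as an ordered structure, so staff could be charted on an innovation.
# The tallying function and sample data are hypothetical illustrations.

LEVELS_OF_USE = [
    ("0",   "Non-Use"),
    ("I",   "Orientation"),
    ("II",  "Preparation"),
    ("III", "Mechanical Use"),
    ("IVA", "Routine"),
    ("IVB", "Refinement"),
    ("V",   "Integration"),
    ("VI",  "Renewal"),
]

def chart_levels(ratings):
    """Count how many users fall at each Level of Use.

    ratings: dict mapping a user's name to a level code such as "III" or "IVA".
    """
    counts = {code: 0 for code, _ in LEVELS_OF_USE}
    for level in ratings.values():
        counts[level] += 1
    return [(code, name, counts[code]) for code, name in LEVELS_OF_USE]

# Example with made-up interview results:
sample = {"Teacher A": "0", "Teacher B": "III", "Teacher C": "IVA", "Teacher D": "IVA"}
for code, name, count in chart_levels(sample):
    print(f"Level {code} ({name}): {count}")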

One approach that may help program designers at the national, state, and local levels increase the success of the innovations they hope to institute in educational settings is John M. Keller's ARCS model. The ARCS model defines four major conditions that must be met in order to motivate people: attention, relevance, confidence, and satisfaction (Ely & Plomp, 2001). Although the ARCS model is a process for designing classroom instruction, it also has value in evaluating innovation. Attention is a prerequisite of learning, and it must be sustained. No Child Left Behind (NCLB) is a federally generated innovation that currently has the attention of educators. NCLB's relevance is founded in the fundamental concern over accountability, but confidence in this national legislation is limited by parallel concerns over funding, time, and resources. Satisfaction with the results of NCLB remains to be seen, but the mandates it imposes are far reaching. The ARCS model also includes a systematic design process with four steps: define, design, develop, and evaluate (Ely & Plomp, 2001). This process can be applied to new ideas, legislation, or training, particularly to generate or maintain motivation for such innovations. Without motivation, any innovation will remain at Level 0 (Non-Use) on the Levels of Use scale, and what might have brought about great change and instructional success may be lost in the caverns of disinterest, negativity, and neglect.
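
As a rough illustration of how the four ARCS conditions might be turned into an evaluation checklist for an innovation such as NCLB, the Python sketch below poses one question per condition. The questions and the sample notes are hypothetical applications of the model, not Keller's own materials.

# A minimal sketch of using Keller's four ARCS conditions as an evaluation
# checklist for a proposed innovation. Questions and example notes are
# hypothetical illustrations of how the model might be applied.

ARCS_CONDITIONS = {
    "attention":    "Does the innovation capture and sustain educators' attention?",
    "relevance":    "Is it clearly connected to needs educators already recognize?",
    "confidence":   "Do educators believe they have the funding, time, and resources to succeed?",
    "satisfaction": "Are there visible results that reward the effort of adoption?",
}

def review_innovation(name, notes):
    """Print an ARCS review for one innovation.

    notes: dict mapping each ARCS condition to a short free-text judgment.
    """
    print(f"ARCS review: {name}")
    for condition, question in ARCS_CONDITIONS.items():
        answer = notes.get(condition, "not yet assessed")
        print(f"- {condition.capitalize()}: {question}")
        print(f"  -> {answer}")

# Example with judgments drawn loosely from the discussion above:
review_innovation("No Child Left Behind", {
    "attention": "high; the mandates have educators' attention",
    "relevance": "grounded in accountability concerns",
    "confidence": "limited by concerns over funding, time, and resources",
    "satisfaction": "remains to be seen",
})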

An innovation is something new, something introduced for the first time, or a new way of doing something. In instructional technology, innovation includes fresh ways of looking at ideas based on research and careful evaluation. Innovation can be challenging for educators if they are not prepared for change; however, educators can be agents of change through careful evaluation of courses and training programs and through self-determination of their levels of use of an innovation. Mandated innovations, such as NCLB, require educators to evaluate the coursework they are currently using and to align it with the requirements of the legislation while maintaining a proactive approach to helping shape the future of NCLB. Designers of such national programs can learn from models such as Levels of Use, CBAM, or ARCS to bring about or evaluate positive change through the processes each model proposes. In the end, those involved in educational technology are engaged in what the preliminary Association for Educational Communications and Technology (AECT) definition proposes: "the study and ethical practice of facilitating learning and improving performance by creating, using, and managing appropriate technological processes and resources" (Rezabek, 2004). Not all innovation is equal, and not all innovation is necessarily good, but as educators systematically analyze, design, and evaluate instruction, they can create solutions to educational challenges. Those solutions become the innovations, and educators serve as agents of change for the better as they incorporate the fundamentals of instructional technology.

References

Ely, Donald P. & Plomp, Tjeerd (1996). Classic Writings on Instructional Technology. Englewood, Colorado: Libraries Unlimited, Inc.

Ely, Donald P. & Plomp, Tjeerd (2001). Classic Writings on Instructional Technology. Englewood, Colorado: Libraries Unlimited, Inc.

Loucks-Horsley, Susan (1996). The Concerns-Based Adoption Model (CBAM): A Model for Change in Individuals. National Standards & the Science Curriculum. Retrieved November 15, 2004, from http://www.nas.edu/rise/backg4a.htm

Moersch, C. (1998). Levels of Use of Technology. Computer Efficiency, Learning and Leading with Technology. Retrieved November 15, 2004, from http://www.rmcdenver.com/useguide/cbam.htm

Rezabek, Landra (2004). Retrieved September 6, 2004, from University of Wyoming, Department of Adult Learning and Technology, ITEC 5010 Instructional Technology online course discussion.
