Sunday, October 14, 2007

Measurement, Innovation, & Success

George Grossmith as Ko-Ko, 1885 (image source: http://math.boisestate.edu/gas/mikado/docimages/kkgrossmith3_sm.jpg)

Assessment schemes have long been a core component of educational systems. From primordial extreme evaluations (of, say, that Socrates guy) to contemporary network-based rubric, evaluation, and information management systems (such as WEAVE), they have been both welcomed and loathed by their implementers.

A skim of sources published over the last several years suggests the essence of the loathing: imposed, non-collaborative implementations.

A blogger (a college professor) complains of the “here’s a miracle and the guaranteed fix to all things ailing the school” approach.

In a 2007 interview, a retired department head at an Illinois regional high school noted that (circa 1970) she had the distinct feeling that once all the forms were filled out, all the reports written, all the notes taken, all the revisions made, and all the retyping at the kitchen table concluded, all that material got stuck into the Superintendent’s safe and was never used again.

As some day it may happen that a victim must be found,
I've got a little list — I've got a little list

Ko-Ko, The Mikado

This burden of method is not unique to education.

The Red Tide of performance management, quality control, and institutional “dashboards” representing key variables swept through industry in the 1980s.

An example from the aviation industry is indicative:
Total Quality Management Systems (TQMS) were de rigueur… at Long Island's Grumman, middle managers came to know TQMS as “Time to Quit and Move to Seattle.”

Meanwhile, practitioners of the particular craft of programming encountered massive Methodologies (capitalization intentional) which examined the needs of a project the way Audubon examined the beauty of a bird: at the end of the process, both the birds and the souls were dead.

By the time the “specification” got nailed down, the needs had changed, interests had faded, and the earliest parts of the specification were often so aged that they had to be refreshed over and over and over.

Planning lemma: the sampling of change in the environment has to be faster than the rate at which the environment itself changes. (With apologies to Dr. Claude Shannon and his work in information theory.)
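Stated loosely in sampling terms (this is an analogy to the Nyquist–Shannon sampling theorem, not a literal application of it): if the conditions that matter to a plan change at some rate $f_{\text{change}}$, the planning cycle must revisit them at a rate

$$ f_{\text{sample}} > 2\, f_{\text{change}} $$

or the plan will alias: it will faithfully describe a world that no longer exists.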



Trends: Faster Networked Assessment as Design Feedstock

Collaboration and engagement.

Non-judgmental evaluation.

Participatory design. Evolution and extension. Artifact. Portfolio.

Surely a hefty string of buzzwords, but in context they offer accelerating improvement in educational design, delivery, and outcomes. The positive outcomes extend beyond the student: the strength of fact-based design and implementation drives innovation and, typically, the joy of good work.

Less time is spent mulling over basic questions (how many, when, averages), and the questions themselves become better because the reference data have worth and credibility. Fact Books (institutional almanacs) reduce search time and improve the visibility of trends in student, institutional, and, increasingly, world data.

An observation: these data become part of the institutional memory. The data are "owned" by the culture; "Joe's report" or "Madeline's chart" become institutional assets.

Long-time assessment practitioners, such as Boston College and Delaware County Community College, have adopted Institutional Research and assessment mechanisms as their “tao” of pedagogy.

Libraries of assessment frameworks have started to appear on the Internet, although most of these libraries target elementary education.

These Internet resources have not yet matured, but they change on Internet time: rapidly, with both incremental change (like the adoption of new programming languages) and discontinuous punctuated equilibria (like the takeoff of YouTube or American Idol).

These resources will become authoritative and likely “peer reviewed” as moderated wikis (although the state of the art will move beyond the wiki).

Networked systems will begin to "understand" content through implementations of the emerging Semantic Web: a common framework that allows data and resource sharing across organizational boundaries. The underlying mechanisms of the Semantic Web promise interoperability and heightened reuse of educational materials (a low-level specification in this space is even called the Knowledge Interchange Format).
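As a concrete sketch of what that sharing could look like, the snippet below uses the Python rdflib library to describe an assessment rubric as RDF triples and serialize it to a plain-text form a partner institution's systems could ingest. The "assess" vocabulary, URLs, and property names are invented for illustration; a real deployment would reuse a shared, published schema.

    # A minimal sketch of Semantic Web-style data sharing via the Python
    # rdflib library. The "assess" vocabulary below is hypothetical.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, RDFS

    ASSESS = Namespace("http://example.edu/assessment/")  # invented vocabulary
    g = Graph()
    rubric = URIRef("http://example.edu/rubrics/tech-writing-101")

    # Describe one rubric as machine-readable (subject, predicate, object) triples.
    g.add((rubric, RDF.type, ASSESS.Rubric))
    g.add((rubric, RDFS.label, Literal("Technical Writing 101 Rubric")))
    g.add((rubric, ASSESS.assessesOutcome, Literal("Clear, audience-aware prose")))

    # Serialize to Turtle, a plain-text RDF format any partner system can parse.
    print(g.serialize(format="turtle"))

The point is not the particular library but the shape of the exchange: facts expressed as triples against shared vocabularies, so that meaning survives the trip across organizational boundaries.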


A key lesson from these sources: effective assessment schemes depend upon the involvement of all layers of the educational institution:
  • Instructors (design and delivery)
  • Program Design Teams
  • Institutional Research
  • Accreditation Organizations (e.g., North Central)
  • External Partners (Microsoft, Autodesk, John Deere....)

Broader Scope & Fragmentation

Increasingly, students, prospective students, and community educational partners have become integral to the measurement and assessment process. This involvement spans the cradle-to-cradle cycle, starting with the (argot alert) formative (prospective) contributions that answer what curricula are needed now or will be needed in the relevant future, and ending with the (argot alert) summative (retrospective) programs that evaluate the “what we wanted to do, what we did, what went right, and what went wrong” issues.

More types of information have been included in the “assessment environment”, supplementing the base information (how many, what kind, how’d they do) with more qualitative narratives of whether the educator, the student, or the employer found the results worthy of praise or correction.

Assessment “refresh” times have decreased and should continue to decrease. Colleges generally, and Community Colleges in particular, face a fragmented market where mass production shifts to mass customization. The use of good practice, mostly learned from the industrial and software domains, will be essential to innovation and survival in education, and hence to quality and job satisfaction.

Good assessment practices uniformly emphasize that this is an iterative and continuous improvement process: there is no tape to break, no line to cross, no “well, that’s that”.

That is especially true in applying evaluation and assessment to the dynamics of communities and human beings, all reacting to changing relationships, technologies, and practices.
