Tuesday, October 30, 2007

You give me fervor: dSt/dt = a(C, P, ...) [St/Ut] * [1 - St/Ut]

Having conducted trend analysis and integrated qualitative research (read: made stuff up and flipped coins) for about 30 years, I find this generation of nets and boxes begins to rock in pretty disruptive ways. In this new place: massive networks of storage, more consumer-driven than ever, connected by faster networks and wicked magic small tech. We've only had aspirin around for 110 years.

Modeling complex systems with itty bitty Reverse Polish Notation in the hallowed HP-45's ten or so registers and 49 programming steps. Writing decision support programs in Lotus 1-2-3 with 5.25-inch floppies... and that dreamy IBM PC XT with what, 10 or 20 droolworthy Megabytes of storage. Then HyperCard and, hmmm... I think it was a Mac running System 6.something, used to model Automatic Teller Machine user interfaces with real live consumers pointing and clicking... And these kids today: so many tubes!

Adoption curves (usually S-shaped, often tied with knowing marketing hosers arguing either for or against the adoption of the Next Big Thing, depending upon whether said hoser has, or can assert ownership of, the Next Big Thing) apply, but do the curves start to point ever more upward? More people. More networks. More transparency to the tech. Better abstraction and generalization of interfaces (none dare call it commoditization). Like whoosh? Punctuated equilibrium? Chicxulub?

Andy S. Kydes, writing for the US Department of Energy, provides a concise review of how capital budgeting decisions occur in the context of changing technology. (The model best applies, I believe, to infrastructure decisions, which tend to be big lumps of capital and concurrent retooling of skills, but it is nonetheless useful in exposing the dynamics of market adoption.)

The "dSt/dt = a(C, P, ...) [St/Ut] * [1 - St/Ut]" bits concern, essentially, the rate at which an infection moves through a population. Technology adoption follows the same logic: a few try it, they like it, they tell their community, and the new drives out (or back) most, sometimes all, of the old. NB: experienced technology managers will recognize that for early releases the infection model is all too apt. The DOE's exposition of tech adoption involves energy components (coal, nukes, etc.), but the principles of how one assesses and then opts to adopt a new technology inform both the supply side of storage componentry and the demand side: direct storage consumers (thumbdrives) and service providers employing storage as a means to an end (Google, Carbonite, Hulu, yadda yadda).

What continues apace: acceleration. Interesting to see the delivery of high capacity spinning media, high density flash media (now in gigabytes), and emergent nanotech promising huge efficiencies and terabyte availability in something like 18 months (mid-2009). Well, interesting in the sense of wonks, not interesting like "the promise of moonlight in a martini," as Mr. Shanley noted.

Curious as to how the Technology Adoption Life Cycle morphs. Do we all become Early Adopters? Seems that time to market becomes the critical success factor... I mean... like really, really vital on the supply side (sell 'em) and the demand side (buy 'em for the benefits). Hmm... so let's say I can abstract the command and control systems in some stable overlay that permits rapid change-out of underlying components, thereby reducing the friction of swapping the So Last Year blinkenlights for the So Happening new blinkenlights. Doodling time. Time for coffee. Or moonlight.
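For the doodling-inclined, here is a minimal sketch of that S-curve in Python. The speed coefficient, starting share, time step, and horizon are illustrative assumptions of mine, not Kydes's or the DOE's numbers; the point is only the shape the equation produces: slow start, steep middle, flattening finish.

# Minimal sketch of the logistic ("infection") adoption curve quoted above:
#   dS/dt = a(C, P, ...) * (S/U) * (1 - S/U)
# where S is the adopted share, U the ultimate (saturation) share, and a(...)
# a speed factor driven by cost, performance, and so on.  All parameter
# values below are made up for illustration.

def adoption_curve(a=0.8, U=1.0, S0=0.01, dt=0.1, years=20):
    """Integrate the share S forward with a simple Euler step."""
    steps = int(years / dt)
    S, path = S0, []
    for i in range(steps):
        f = S / U                      # fraction of ultimate share captured
        S += a * f * (1.0 - f) * dt    # slow at first, fastest near f = 0.5
        path.append((i * dt, S))
    return path

for t, S in adoption_curve()[::20]:    # print roughly every two "years"
    print(f"t={t:5.1f}  share={S:0.3f}")

Run it and the printed shares trace the familiar curve: barely moving for the first few periods, then the steep infection phase, then saturation near U.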

Sunday, October 14, 2007

Measurement, Innovation, & Success

George Grossmith as Ko-Ko, 1885 (image from http://math.boisestate.edu/gas/mikado/docimages/kkgrossmith3_sm.jpg)

Assessment schemes have long been a core component of educational systems. From primordial extreme evaluations (of, say, that Socrates guy) to contemporary network-based rubric, evaluation, and information management systems (such as WEAVE), they have been welcomed and loathed by their implementers.

From a skim of sources published over the last several years, the essence of the loathing comes from imposed, non-collaborative implementations.

A blogger (a college professor) complains of the “here’s a miracle and the guaranteed fix to all things ailing the school” approach.

In a 2007 interview, a retired department head at a regional high school in Illinois noted that (circa 1970) she had the distinct feeling that once all the forms were filled out, all reports written, all notes taken, all revisions made, and all the retyping at the kitchen table concluded, the material got stuck into the Superintendent’s safe and was never used again.

As some day it may happen that a victim must be found,
I've got a little list — I've got a little list

Ko-Ko, The Mikado

This burden of method is not unique to education.

The Red Tide of performance management, quality control, and institutional “dashboards” representing key variables appeared throughout industry in the 1980s.

An example from the aviation industry is indicative:
Total Quality Management Systems (TQMS) were de rigueur… at Long Island's Grumman, TQMS became known to middle managers as “Time to Quit and Move to Seattle.”

Meanwhile, other practitioners of the particular craft of programming encountered massive Methodologies (capitalization intentional) which examined the needs of a project the way Audubon examined the beauty of the bird: at the end of the process the souls and the birds were dead.

By the time the “specification” got nailed down, the needs had changed, interests had faded, and often the beginning of the specification was so aged that it had to be refreshed over and over and over.

Planning lemma: the rate at which you sample change in the environment has to be faster than the rate at which the environment itself changes. (With apologies to Dr. Claude Shannon and his work in information theory.)
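A back-of-the-envelope reading of the lemma, borrowing the Nyquist flavor of Shannon's sampling result: if the environment shifts meaningfully every so many months, plans must be re-examined at least twice that often or the picture aliases. The domains and periods in this little sketch are hypothetical, chosen only to show the arithmetic.

# Hedged illustration of the planning lemma in sampling terms.
# All example domains and change periods below are made up.

def max_review_interval(change_period_months: float) -> float:
    """Longest review interval that still tracks a change of the given period."""
    return change_period_months / 2.0

for domain, period in [("curriculum demand", 24.0), ("storage pricing", 6.0)]:
    print(f"{domain}: shifts roughly every {period:.0f} months -> "
          f"review at least every {max_review_interval(period):.0f} months")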



Trends: Faster Networked Assessment as Design Feedstock

Collaboration and engagement.

Non-judgmental evaluation.

Participatory design. Evolution and extension. Artifact. Portfolio.

Surely a hefty string of buzzwords, but in context they offer accelerating improvement in educational design, delivery, and outcomes. The positive outcomes extend beyond the student: the strength of fact-based design and implementation drives innovation and, typically, the joy of good work.

Less time is spent mulling over basic questions (how many, when, averages), and the questions themselves get better because the reference data have worth and credibility. Fact Books (institutional almanacs) reduce search time and improve the visibility of trends in student, institutional, and, increasingly, world data.

An observation: these data become part of the institutional memory. The data are "owned" by the culture; "Joe's report" or "Madeline's chart" become institutional assets.

Long-time assessment practitioners, such as Boston College and Delaware County Community College, have adopted Institutional Research and assessment mechanisms as their “tao” of pedagogy.

Libraries of assessment frameworks have started to appear on the Internet, although most of these libraries target elementary education.

These Internet resources have not matured, but they change on Internet time: rapidly, characterized by both incremental change (like the adoption of new programming languages) and discontinuous punctuated equilibria (like the takeoff of YouTube or American Idol).

These resources will become authoritative and likely “peer reviewed” as moderated wikis (although the state of the art will move beyond the wiki).

Networked systems will begin to "understand" content through implementations of the emerging Semantic Web: a common framework that allows data and resource sharing across organizational boundaries. The underlying mechanisms of the Semantic Web promise interoperability and heightened reuse of educational materials (a low-level specification of the Semantic Web is even called Knowledge Interchange Format).
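To make the sharing idea concrete, here is a toy sketch of the subject-predicate-object ("triple") style of data the Semantic Web builds on, done in plain Python rather than any particular RDF toolkit. The colleges, courses, rubrics, and pass rates are invented for illustration; the point is that facts from two institutions merge cleanly because each fact carries its own vocabulary.

# Toy sketch: triple-style assessment facts from two hypothetical schools,
# merged and queried.  All identifiers and numbers below are made up.

college_a = [
    ("courseA:BIO101",   "assess:usesRubric", "rubric:LabNotebook-v2"),
    ("courseA:BIO101",   "assess:passRate",   "0.82"),
]
college_b = [
    ("courseB:BIOL-100", "assess:usesRubric", "rubric:LabNotebook-v2"),
    ("courseB:BIOL-100", "assess:passRate",   "0.77"),
]

# Merging across organizational boundaries is just set union.
graph = set(college_a) | set(college_b)

# Queries walk the shared predicates.
shared_rubrics = {obj for (_, pred, obj) in graph if pred == "assess:usesRubric"}
print("Rubrics in use across both schools:", shared_rubrics)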


A key learning from these sources: effective assessment schemes depend upon the involvement of all layers of the educational institution:
  • Instructors (design and delivery)
  • Program Design Teams
  • Institutional Research
  • Accreditation Organizations (e.g., North Central)
  • External Partners (Microsoft, Autodesk, John Deere....)

Broader Scope & Fragmentation

Increasingly, students, prospective students, and community educational partners have become integral to the measurement and assessment process. This involvement spans the cradle-to-cradle cycle, starting with the (argot alert) formative (prospective) contributions that answer questions about what curricula are needed now or will be needed in the relevant future, and ending with the (argot alert) summative (retrospective) programs that evaluate “what we wanted to do, what we did, what went right, and what went wrong.”

More types of information have been included in the “assessment environment”, supplementing the base information (how many, what kind, how’d they do) with more qualitative narratives of whether the educator, the student, or the employer found the results worthy of praise or in need of correction.

Assessment “refresh” times have decreased and should continue to decrease. Colleges generally, and Community Colleges in particular, face a fragmented market where mass production shifts to mass customization. The use of good practice, mostly learned from the industrial and software domains, will be essential to innovation and survival in education, and hence to quality and job satisfaction.

Good assessment practices uniformly emphasize that this is an iterative and continuous improvement process: there is no tape to break, no line to cross, no “well, that’s that”.

The same holds in applying evaluation and assessment to the dynamics of communities and human beings, all reacting to changing relationships, technologies, and practices.