hopper, 1993

4.3.1 Availability

As daunting as "multimedia" and constructing microworlds proved to be for the ESCAPE project, the most conspicuous reason they were not pursued was the need to conquer another problem that consumed a large amount of resources. The source of the problem was the project's long-range plan to make ESCAPE available to all entering freshman engineers at Purdue, and even to other institutions. By 1990, there were not enough Macintosh computers in Purdue University Computing Center (PUCC) instructional labs to provide access for all engineering students at Purdue, so Macintoshes were not a viable means of reaching the desired audience. However, the Schools of Engineering and PUCC provided a large lab of 90 Sun workstations for introducing all beginning engineering students to computers and C programming, and the Suns were the only viable way to accommodate virtually all of those students. In the fall of 1990, work began to port the HyperCard courseware to a format that could be used in the Sun UNIX environment. While the ESCAPE development team's decision was driven by the issue of availability, they also believed the Sun workstation's high-speed processing, high-resolution graphics, large RAM, and multi-gigabyte storage made it an inviting environment for finally providing multimedia databases and true microworlds for problem solving. By the fall of 1992, ESCAPE had been partially ported and was successfully implemented within the Sun environment using HyperNews.
 
At Athena, as personal computers with operating systems other than UNIX became capable of running X Window System Version 11, availability became a major goal. The following passage demonstrates Athena's hard-learned lesson about the pitfalls of defining coherence around operating systems. The Athena environment was limited to a few expensive hardware platforms and a single operating system for the entire network. This limitation of adaptability through availability became apparent as personal computers approached the power of workstations:
 
The power of personal computers is now approaching the capabilities of the standard Athena workstation. At the other end, multimedia and three dimensional color graphical displays require more powerful workstations. Many exciting educational applications can be imagined using the resources. The future Athena computing environment should accommodate a more heterogeneous mix of workstations. (Murman, 1989, p. 1-7)

 
These conflicts resulted in calls for diversity, in the form of a much wider variety of personal computers, workstations, and operating systems; a re-evaluation of goals and of the deeper issue of what it means to be an academic computing organization; and eventually a redefinition of the term "Coherence." Athena moved to abandon hardware and operating system coherence in favor of data and application coherence. That this redefinition came from hard-learned experience is relatively clear in the tone and observations of the following passage:
 
Why not standardize operating systems? Commercial software developers aim at mass markets. Mass markets grow up around widely used operating systems. Except for the "IBM-compatible" domain, widely used operating systems rarely extend across hardware types. Therefore, operating systems suitable for standardization across hardware types--such as UNIX--support very little commercial software. Standardizing on a cross-cutting operating system such as UNIX thus deprives users of the 20,000 or so pieces of commercial software they might otherwise consider. This deprivation makes coherence of operating system across different hardware all but impossible to maintain within a community like MIT. (Committee on Academic Computation for the 1990's and Beyond, 1990, p. 30)

 
The desire to access commercially available programs and the desire to intercommunicate suggested coherence built around a minimal set of desirable common services: electronic mail, an on-line teaching assistant that can be used in any course, the course catalog and registration services, and access to scholarly databases at MIT and elsewhere. This is "coherence" defined in terms of the data. Data coherence means that data may be interchanged among computers.
 
More recently, especially within communities such as universities, communications networks have become more pervasive and capable, making data communication simpler, faster, and often less expensive than it once was. This capability encourages community members to transfer documents and data files among their computers. Simple protocols for exchanging data, such as ASCII text files, satisfy some data-communications needs. Increasingly, though, computer users want to convey fully annotated files, such as word-processor documents, spreadsheets, or binary files to their colleagues elsewhere on a network. This requires more sophisticated protocols, which in turn constrain the choice of applications programs and, to a lesser extent, hardware. (Committee on Academic Computation for the 1990's and Beyond, 1990, p. 28)
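The distinction the passage draws between simple ASCII exchange and richer formats can be sketched in modern terms. The following Python fragment is an illustration only, not part of the original ESCAPE or Athena work; it contrasts a plain-text record, which any system can parse, with a binary encoding that travels reliably only once both ends agree on a byte-order protocol:

```python
# Illustrative sketch: why plain-text interchange is "coherent" across
# machines, while raw binary requires an agreed-upon protocol.
import struct

record = {"course": "ENGR 195", "enrolled": 90}

# 1. Plain ASCII: any system with a text editor can read this.
ascii_form = f"{record['course']},{record['enrolled']}\n"

# 2. Raw binary: byte order is machine-dependent unless a protocol
#    (here "<" = little-endian) is agreed on by both ends.
native = struct.pack("i", record["enrolled"])     # native byte order
portable = struct.pack("<i", record["enrolled"])  # agreed-upon order

# The ASCII form round-trips trivially.
course, enrolled = ascii_form.strip().split(",")
assert course == "ENGR 195" and int(enrolled) == 90

# The protocol-specified binary form decodes identically on any host.
assert struct.unpack("<i", portable)[0] == 90
```

The same trade-off the committee describes still holds: richer formats carry more information but constrain which programs, and to a lesser extent which hardware, can participate in the exchange.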

 
In this new framework at Athena, individuals can use commercially available software on the hardware and operating system of their choice. In addition, from their native environments (such as an IBM personal computer or a Macintosh), they can also access common services in straightforward ways. As a result of the experiences of the pioneers on Athena's "distributed computing" frontier, it is now clear that data coherence, and the standards it implies, will be powerful keys to providing wide availability of software and courseware across a wide variety of hardware in the next decade.
© Mary E. Hopper | MEHopper@TheWorld.com [posted 12/04/93 | revised 04/12/13]