Thursday, 22 October 2009

The Ramblings of a Madman . . .

A few thoughts from a long time ago . . .

On Hypermedia:

“In the future everything will be fact for fifteen minutes.” - futuristic UK magazine Cyber Times - New Computer Express, 15 September 1990

The idea of interlinked information being electronically stored and accessed at great speed from any place is attributed to Vannevar Bush in 1945.

Michael Sperberg-McQueen argues that it is only convention, and not any inherent characteristic, that makes traditional books linear, and many people agree that cross-referencing and indexing, among other techniques, allow books to be read non-sequentially.*1

Among these commentators rages the debate between those who favour ‘ideal’ Hypermedia and those favouring ‘applied’ Hypermedia.*2 Ideal Hypermedia is a situation in which the user browses through a system at leisure, driven by his own curiosity and desire for information - such a system is characteristically visualised as a web-like structure. Applied Hypermedia is a system designed with the accomplishment of a specific task in mind. 'Pro-idealists' point to the availability and accessibility of myriad choices driven solely by the user’s desire for information. The process of navigation is seen to represent ‘learning, creativity, collaboration and understanding’. Hypermedia systems should therefore be an ideal way of learning compared with traditional methods of education.
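
To make the distinction concrete, here is a minimal sketch in Python - purely my own illustration, with hypothetical node names and functions drawn from nothing in the systems of the period: the same web of nodes can be wandered at the reader's whim ('ideal'), or walked along a route the designer has fixed in advance ('applied').

```python
# Illustrative sketch only: a toy hypermedia space as a web of nodes.
# Names and structure are hypothetical, not taken from any 1989 system.

nodes = {
    "metre":        {"text": "A metal bar in a European museum defines the metre.",
                     "links": ["standards", "museums"]},
    "standards":    {"text": "Standardisation as an exercise in order and control.",
                     "links": ["metre", "dictionaries"]},
    "dictionaries": {"text": "Dictionaries fix the 'correct' way of spelling.",
                     "links": ["standards"]},
    "museums":      {"text": "Museums as keepers of reference objects.",
                     "links": ["metre"]},
}

def ideal_browse(start, choose):
    """'Ideal' Hypermedia: the reader wanders wherever curiosity leads.
    `choose` stands in for the reader - given the links on offer it returns
    the next node, or None when curiosity is satisfied."""
    here = start
    while here is not None:
        print(nodes[here]["text"])
        here = choose(nodes[here]["links"])

def applied_browse(route):
    """'Applied' Hypermedia: the designer fixes a route through the same
    nodes so that a specific task is accomplished, step by step."""
    for step in route:
        print(nodes[step]["text"])

def curious_reader():
    """A toy reader who, at each node, follows the first link not yet
    visited, stopping when nothing new is on offer."""
    visited = set()
    def choose(links):
        for link in links:
            if link not in visited:
                visited.add(link)
                return link
        return None
    return choose

# The reader drives the ideal system...
ideal_browse("metre", choose=curious_reader())
# ...whereas the applied system drives the reader.
applied_browse(["dictionaries", "standards", "metre"])
```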

Yet who is to make the decisions concerning what is to be contained in such systems? Could they not potentially invest themselves with more authority than the written word?

In its development, the written word acquired, through the uses to which it was put, the notion of truth. Even now (1989), dictionaries are referred to in order to discern the correct way of spelling, and encyclopaedias are used by schoolchildren to accumulate ‘facts’, just as a piece of metal exists in a museum in Europe that defines the exact length of a metre. This standardisation is an attempt at creating order and exercising control. Even books critical of the society from which they originate are forced to use the language and conventions of the established order.

The move to allow the user to modify systems can be interpreted as an attempt to prevent established interests from assimilating the concept, thus making it possible to dissolve institutions.

Users are only allowed to make the changes that the system has been designed to accept. Simply because a system appears to offer a choice of information, it does not necessarily follow that it is offering freedom of access and contribution. Such systems, since they allow a form of choice, therefore appear to be containers of truth, because it is held that the information within them is created by people from all walks of life. As a result of there apparently being no exclusion policy, Hypermedia systems could come to be seen as unbiased, constantly updated (and therefore constantly corrected) networks of truth.

Since the system is not likely to be user-created, how does the machine know what we might find interesting or useful? It merely presents its designers’ concepts and ideologies. In ‘Hyperland’, Douglas Adams’ own personal nightmare about interactive television, Adams is floundering among the choices offered to him by his system. Are, then, our choices being made for us? And in a consumer haven, are we being sold something that we do not need?

In ‘The Missing Link - Why We’re All Doing Hypertext Wrong’, Norman Meyrowitz claims that, to date, all Hypermedia products are ‘monolithic and insular’ in that they demand that the user abandon his current computer environment in order to enter the Hypermedia space - and HyperCard™ is no exception.*3
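
Footnote *3 gives the Intermedia idea only in outline. Purely as a hedged illustration of that general principle - links held in a store outside the documents themselves, so that any application adopting the protocol can have its documents linked - something like the following sketch; the class and method names are my own hypothetical shorthand, not Intermedia's actual interfaces.

```python
# Hedged sketch of the principle described in footnote *3: links live in a
# store outside any single document or application, so documents created by
# different programmes can be linked. All names here are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class Anchor:
    document: str    # identifier of a document, whatever application made it
    selection: str   # the span of text, graphic, etc. that the link attaches to

class LinkStore:
    """Holds links between anchors, separately from the documents themselves."""
    def __init__(self):
        self._links = []

    def link(self, a: Anchor, b: Anchor):
        self._links.append((a, b))

    def links_touching(self, document: str):
        """Every link with an end in the given document, regardless of which
        application owns the other end."""
        return [(a, b) for (a, b) in self._links
                if document in (a.document, b.document)]

# Two documents living in different applications can still be linked,
# because neither application owns the link itself.
store = LinkStore()
store.link(Anchor("essay.txt", "the written word"),
           Anchor("diagram.draw", "figure 1"))
print(store.links_touching("essay.txt"))
```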

Hypermedia is seen as a metaphor for the brain’s associative networks.*4 It should therefore aid learning, since emphasis is placed upon the relationships involved and not upon the acquisition of sterile facts. Hypermedia systems would also engage the pupil interactively, thus encouraging learning by doing and an understanding of links and relationships. . . . that gives a teacher the ability to produce interactive teaching modules - tailor-made for the capabilities of their students. This could put an end to the churning out of homogenised groups of people only able to think in the ways in which they have been taught. Since no two individuals learn in the same way and at the same rate, each student can take the route best suited to his needs and capabilities whilst still remaining within a common framework.

Students’ awareness of different ways of thinking can only aid the development of individuality, as well as nurturing negotiating tendencies as more and more of their work involves collaboration. Interconnection of students could make life more accessible for those who find direct socialising difficult, as well as exposing students at the extremes of the ability and cultural spectrums to each other. Education therefore comes to be seen far more as a development of the individual and not as a saturation of things ‘they should know’. Associative learning helps the individual to look constantly for links, and therefore enables him to apply lessons and ideas learnt in one part of his life to another.

The process of learning by doing should be encouraged, to nurture an active and willing participation in wider social issues and activities as the student assesses his position as an individual within a system of individuals. Interactive learning and the mix of media should suggest a more creative approach on the part of students and teachers, increasing motivation and combating disillusionment as their study bears more relevance to their lives. Ultimately the student should be encouraged to reflect on how what they do affects their life and the environment surrounding them.

Information contained in Hypermedia systems to date (1989) has been different in the nature of its presentation from other forms of information. Much of this is due to the accessing of information ‘out of context’ . . . . In order to fit into the constraints . . ., information, whether it be text, sound, graphics or video, is usually presented in the form of small, short morsels or ‘chunks’. The information is thus iconic or allusive, in the sense that it merely alludes to more information accessible in the system. As a result the information can often appear, ironically, to be merely stating the factual and lacking in any real depth or substance. . . . Information . . . is thus further ‘chunked’ to fit into a . . . experience.

By thus alluding to more information, such a system can itself be interpreted as a tease, in which the user gets a certain amount of information but never quite enough, whilst what he is presented with hints at more ‘goodies’ within the system.

Perhaps a better system would be one where the user could explore, motivated by curiosity, but with some direction and goal. Such a system would take into account the various contexts involved, and the languages that exist in those contexts, rather than decree information to be universal, singular and discrete.

*1 Query about traditional, linear, bookbound text, Michael Sperberg-McQueen, Humanist discussion group, U35395@UICVM, 24 September 1988

*2 Hypertext and Intelligent Interfaces for Information Retrieval, Patricia Anne Carlson, from The Society Of Text - Hypertext, Hypermedia, and the social construction of information, ed. Edward Barrett, MIT Press, 1989. Carlson gives these two terms in the introduction to her paper.

*3 The Missing Link - Why We’re All Doing Hypertext Wrong, Norman Meyrowitz, from The Society Of Text - Hypertext, Hypermedia, and the social construction of information, ed. Edward Barrett, MIT Press, 1989. Meyrowitz describes a project entitled ‘Intermedia’ undertaken at the Institute for Research in Information and Scholarship at Brown University, to develop a system allowing information in one document to be permanently linked to information in any other. The system has interlocking protocols which, if implemented in new software, would allow documents in those applications to become Intermedia documents and thus be linkable.

*4 Limited Freedom; Linear Reflections on Nonlinear Texts, Joseph T Jaynes, from The Society Of Text - Hypertext, Hypermedia, and the social construction of information, ed. Edward Barrett, MIT Press, 1989.

On Technology:

Each technology has been cited by commentators and the public alike as being markedly different from any other, and each has given special impetus to the rejuvenation of claims regarding the inevitability of technological change, its causal effect upon society, and our relationships to each other and to the events and situations around us.

. . . it is much more relevant to view technology as being the product of intricate social relationships, values and interests, and that one of the reasons why much of the hype concerning computers has not materialised in concrete form is that, rather than the technology being inevitable, it was seized upon and its uses directed by certain groups within the social structure . . .

technological determinism and symptomatic technology

. . . by Marshall McLuhan*5, who attempts to illustrate how a new technology demands different forms of approach and sensory organisation from its predecessors, and how this difference in the way we are obliged to interact with the technologies leads us to take up new ways of organising ourselves and gives rise to new kinds of priorities.

Symptomatic technology, on the other hand, is an attempt to understand the complex techno-social relationship by viewing the development of a particular technology as the result of changes that are already taking place within society. The technology, once discovered, would then be accepted and put to use by society. In other words, under this view the technology does not itself stimulate new relationships, as above, but offers us media through which ways of life already emerging can be pursued.

Though, superficially, the two views can be interpreted as being opposite and contradictory, they do share common ground. This is most notable in the common theme of a separation between the sphere of Research and Development and the wider social context. In both views the innovations are held to originate from discrete, isolated groups of individuals who, by most accounts, operate in an environment distinct from the wider social one.

. . . argument used by proponents of technological determinism in reference to this period is that there was no evidence to suggest that a market had been forecast for computers, and thus it would clearly be wrong to suggest that the continued Research and Development into these machines was prompted by economic investment. Therefore, it is argued, the technology itself created the huge markets that were to appear. Certainly no-one predicted the massive explosion in demand for the machines: Thomas J. Watson Snr., the President of IBM, believed there to be no market, holding that a single large computer would suffice to meet the world’s computational needs.

Major groups such as the government and businesses failed to perceive the benefits of distributing these powerful commodities to society at large, and instead feared the possible disadvantages of making them freely available. As we shall see later, the computer has entered everyday life in a form that makes it difficult to use it for purposes other than those envisaged by its manufacturers.

A technological determinist would point out how the introduction of microcomputers into the home gave rise to the birth of many new industries and to new ways of looking at our relationship to ourselves and to society, such as increased leisure time, the drive to do things for ourselves and, indeed, the emphasis put on democracy itself. A symptomatic technology viewpoint would argue that the technology existed and was seized upon in a consumer environment by entrepreneurs and marketed as a commodity, thus giving rise to new industries and so embodying Schmookler's statement that 'Innovations are made because men want to solve economic problems or capitalise on economic opportunities'.*6

Computers were now being marketed as essential to life in contemporary society, as well as being the focal point of a culture promised since the sixties. Most people at this juncture still had no idea what a computer was, but were being told how the machine could help them with their finances, planning, entertainment and the education of their children; they were being enticed to buy computers for what the machines were supposed, potentially, to do for the way they led their lives, not for how the machines would affect them.

The gap between government and individual participation now seemed to widen, the seventies being really the last era of major social protest against the policies of government. The eighties saw a disillusionment with the way in which so much of everyday life appeared to be out of the control of the individual, or simply boring and tedious.

Advertising was building glossy, desirable images of artifacts and lifestyles that people were persuaded to aspire to and to spend their lives searching for. It was the age of American soap operas, transsexuality and glamorous popular icons in the media, and of traffic jams and ‘nine-to-five’ jobs for the public. The government's idea of laissez-faire was taking hold, and the microcomputer emerged out of, and back into, a culture in which it was believed that the individual must have complete autonomy over his life. The computer was cited, by people often desperate to find some way of transcending or improving the quality of life, as the key to true democracy and to individual control and choice, since isolated microcomputers could be shown to be able to communicate over distance by being linked into a computer network. The implication was that established institutions would disappear, since the connection of microcomputers made it impossible for any one concern to gain advantage.

It could be said that the early computers were, for many, a means of owning something over which they had direct control and could alter to serve their needs. Despite a mood of growing neuroticism, people could feel more creative and confident of their own identities and place in the world as a result of being able to gain mastery over these machines and come into contact with others, opening up new avenues in their lives. Even this harks back to the invention of the telephone and the communication model it subsequently became. The microcomputer was used as a means of furthering modes of communication that were seen as necessary by groups dependent on maintaining touch with, and control of, a society expanding at a frantic pace, whilst still using the models already accepted for the telephone. The concept of its content did not at this juncture come into play.

Therefore, it is not difficult to understand the implications for control, exploitation and expansion, particularly under Kling’s Private Enterprise and Statist models, where the organisation’s interests take priority. The development of computers is, and always has been, funded and guarded jealously by, in the main, the military, multinational corporations and governments, and is carried out at high-security research centres or prestigious universities which are still dominated by the higher social strata and are thus sites for the perpetuation of the domination of computer technology by those classes. The Private Enterprise and Statist models enjoyed a considerable advantage here, and it is not enough to say that the computer would soon be put to any social need: the direction of innovation, and the ways in which machines are put to use, are most likely to serve the needs of the most powerful, whose needs are more likely to get priority - and thus it is with computers. Development is therefore dependent upon the relationships between groups and their needs, rather than being inherent in the technology.

Despite a modern belief in human rights and civil liberties, the Statist and Private Enterprise views retain most direct power over computers, whilst often simply paying lip service to other interests such as the Libertarian and Neo-populist models. Computers are complex in the extreme and it is not easy for just anyone to create one. This, together with the high cost of the technology - which can itself be interpreted as an attempt at exclusivity - is means enough to make its actual implementation easier for the resource-wealthy.

Norbert Wiener*7, the founder of cybernetics, links the concepts of communication and control, since control, irrespective of the object of that control, is dependent upon communication between the controller and the controlled.

The fact that everything can be digitised and sent via cables has not, as predicted, led to the crumbling of existing media and institutions. When competition is sensed, the first option is to try to ban it, as the press tried with radio; if that fails, option two is to buy it, as is the case with the current wave of cross-ownership of press and cable television. The potential for democratic global connectivity suits primarily the Libertarian and Neo-populist supporters, but the producers and providers of the machines see these as secondary concerns. Most existing global communication networks rely on long-established modes of communication like the telephone and the mail. Williams argues that, in the case of broadcasting and television*8, 'It is not only that the supply of broadcasting facilities preceded the demand; it is that the means of communication preceded their content.' The same could be said of . . . in that the ability to link globally is available, but the uses to which it should be put, and the needs it should serve, have yet to be identified; the determinist view that these facilities will generate a social structure centred around their inherent characteristics is clearly false. Ithiel de Sola Pool declares that the technology dictates its form but not the content of its application.*9

Few of us actually have any idea how to make a computer do what we want. We have to buy a machine designed with specific capabilities by companies, and then we must choose from a range of software which confines us to the parameters of that software; so even the widespread availability of powerful microcomputers will not allow one to enter the telematic arena on one’s own terms. The keepers of power are hidden, and computers have been built to disallow confrontation, since all use of them must fall within limits acceptable to the computer’s built-in repertoire. The machine does not query anything outside this; it merely rejects it as an error. Thus, one is forced to comply. We are adapting to a technology whose current structure is imposed upon us by unchallenged forces and ideologies that have been with us since long before the notion of the powerful microcomputer. The machines are built with specific uses, or types of use, in mind which, using Kling’s categories, support Private Enterprise and do not conflict with Statist views, whilst at least satisfying Libertarian demands and being marketed to appeal to Neo-populist and Systems concerns. The drive to make people computer-literate is not purely, one feels, a distribution of skills and tools, but also a dissemination of the culture and ideology embedded within this technology.

All this has been due to human beings. The machines are potentially capable of immense benefits to society, as is any technology. No one doubts that telematics could potentially broaden our horizons and improve the quality of our daily lives, but we must realise and be aware of the social context within which we and they exist, so that the wool is not pulled over our eyes.

*5 McLuhan, M. 'Understanding Media'

*6 Schmookler, J. 'Inventions and Economic Growth'

*7 Wiener, N. 'The Human Use of Human Beings'

*8 Williams, R. 'The Technology & The Society'

*9 de Sola Pool, I. 'Electronics Takes Command'

