Cognitive Engineering as the development of information spaces

David Benyon

Dept. Computer Studies, Napier University, Edinburgh, EH14 1JD

In presenting their conception of the cognitive engineering problem, Dowell and Long provide a thought-provoking and challenging view of the design and development of computer systems and other ‘cognitive’ artefacts. The current craft approach to the development of tools and systems that enable people to work effectively in the abstract world of information and knowledge needs to be replaced by an engineering discipline. The central argument which they present is founded on a notion of the ‘dualism’ of worksystem and domain. On the one hand is a combination of people (or, in more general terms, agents) and devices which constitutes a cognitive worksystem. On the other is a domain: ‘an abstraction of the real world’. The aim of cognitive engineering is to maximise the performance of the worksystem with respect to the domain.

By posing the problem in this way Dowell and Long are able to provide clear definitions of two hitherto rather vague principles of human-computer interaction (HCI): usability and learnability. Their conception highlights the need to see people using devices as a system, and it recognises that it is only within this whole cognitive system that agents can formulate intentions to change the state of the domain. Such a conception strikes a chord with recent developments in theoretical views of cognition, such as distributed cognition (Hutchins, 1994) – knowledge and reasoning ability are distributed throughout a worksystem – and aspects of activity theory (e.g. Nardi, 1996) – the activities of agents are mediated by devices. Where the analysis is lacking, however, is in its recognition of the role that subjectivity plays in understanding cognitive design.

Dowell and Long define a worksystem in terms of its boundary: ‘the extent of all user and device behaviours whose intention is to achieve the same goals in a given domain’. This suggests that the boundary is defined objectively, from the inside of the system. Defining the boundary of a worksystem is crucial because it establishes the dualism with a domain; by defining the domain, the boundary of the worksystem is established and ‘correctly identified’. In contrast, Checkland (1981) argues that the boundary of a system ‘is a distinction made by an observer which marks the difference between ... a system and its environment’ (p. 312). Whereas Dowell and Long equate the domain with the environment, thereby placing the domain outside the worksystem, an alternative view is to see the worksystem as including the domain. Dowell and Long’s conception of cognitive engineering is predicated on the dualism of worksystem and domain, but is it realistic to expect people to agree on such a distinction? I see the whole of Dowell and Long’s figure 1 as a system: an abstract ‘space’ of agents and devices, pursuing some on-going purpose and including its measures of performance. In some cases the purpose of the system may be clearly stated, but in many cases it will be qualitative and only describable at a highly abstract level.

A central tenet of systems theory is that systems are defined from a particular perspective, and an essential part of engineering systems is to look at the problem situation from as many perspectives as possible. The history of computer systems development is littered with examples where the perspective from which a system was designed was not the same as the perspective of the people who wanted to work within that system to undertake useful activities. People perceive systems in different ways. They have differing perceptions of the purpose of the system and of the structure and functions of the domain. The domain of air traffic management will look very different to a controller at Heathrow than it will to a controller at a small regional airport. A carefully designed intranet and document retrieval system may fail to satisfy the needs of its users because the documents are not really the domain which interests them; it is the people who know about the documents who are important. Moreover, domains are not static; they interact with the worksystem. For example, the worksystem which we now have for communication includes e-mail, the internet and the world wide web. The performance of this worksystem as measured by, say, speed of communication has no doubt improved over the penny post. But more than that, the possibilities afforded by the new worksystem have changed the very nature of the domain.

By opposing the worksystem and the domain, we gain the opportunity to engineer towards a target: we can define and measure the performance of the worksystem and discuss task quality. Dowell and Long argue that the conception of cognitive engineering must come before the other attributes of the ‘discipline matrix’, but the conception assumes certain values. The ‘work ethic’ view of cognitive engineering demands that effectiveness can be measured, but other values may lead us to alternative views of cognitive engineering. We may see cognitive engineering as a democratisation of human activity (Ehn, 1989), as emancipation, as a chance to enrich our lives or as a way of improving opportunity.

An alternative conception is to see cognitive engineering as the development of an abstract informational space in which human activity will take place. Cognitive engineers create systems which subsequently define what people need to know, what they need to do, what they have to attend to, and who or what has control over activities and outcomes. The worksystems which we engineer facilitate or deny the opportunity for people to formulate goals and to express their ideas and feelings. The engineering of cognitive systems is not a ‘hard’ engineering problem in which objectives can be clearly stated and performance in a domain maximised. It is a ‘soft’ systems problem in which there are ‘conditions to be alleviated rather than problems to be solved’ (Checkland, 1981, p. 155).

Conceiving of cognitive engineering as the creation of information spaces encourages us to look to the designers of physical, geographical spaces – architects, city planners and the like – to help us understand our discipline (Benyon and Höök, 1997). In the design of physical space we have seen a move away from a utilitarian view of engineering towards a recognition of the social, cultural and political environment which people inhabit. Postmodernism has taught us that engineers cannot dictate the nature of space; it is people who produce spaces (Lefebvre, 1983). The analogy with cognitive engineering is that cognitive design is concerned with creating information spaces which consist of agents and devices that represent things which are meaningful to people. People engage in cognitive and social activities within these information spaces. Certainly some information spaces exist to control aspects of the physical world so that we can fly in planes or generate the heat that keeps us alive. But it would be an impoverished view of cognitive engineering which saw this as its only purpose. Even the simplest of cognitive artefacts has a social impact, and it is important that cognitive engineers recognise the wider ramifications of their designs.

Dowell and Long are right to present a conception of cognitive engineering which emphasises the development of agents, devices and abstractions. They are right to see these joint cognitive systems as the basic unit of analysis for cognitive engineering. If we add to this an explicit definition of the system’s purpose, the perspective from which that system is identified as a coherent whole, and a recognition of the social impact which engineers have, we have the foundation for a discipline of cognitive engineering. The designers of physical spaces have come to accept their inability to mould people to fit their constructions, have long acknowledged the importance of aesthetics, and have learnt to recognise the cultural and political impact which physical engineering has. For cognitive engineers – the designers of information spaces – these lessons must be learnt too, and must be seen as fundamental to their discipline.

References

Benyon, D. R. and Höök, K. (1997) Navigation in Information Space. In Hammond, J. (ed.) Proceedings of Interact 97. London: Chapman and Hall.

Checkland, P. B. (1981) Systems Thinking, Systems Practice. Chichester: John Wiley.

Ehn, P. (1989) Work-Oriented Design of Computer Artifacts. Falköping, Sweden: Arbetslivscentrum (ISBN 91-86158-45-7).

Hutchins, E. (1994) Cognition in the Wild. Cambridge, MA: MIT Press.

Lefebvre, H. (1983) The Production of Space. Oxford: Blackwell.

Nardi, B. A. (ed.) (1996) Context and Consciousness: Activity Theory and Human-Computer Interaction. Cambridge, MA: MIT Press.