Complex Systems, Cooperative Work, and Usability

Peer-reviewed Article

pp. 100-112


Modern maritime operating systems are increasingly complex, comprising a large number of individual subsystems and procedures; operators must also cooperate to make them function. In this paper we consider usability from a broad perspective based on this understanding, recognizing the challenges that arise as a team of operators, complex subsystems, and other technical components work together. We seek to expand usability by adding insights from Computer-Supported Cooperative Work (CSCW)-based fieldwork in offshore operations. To contribute to the current usability literature, we used a network-based approach to investigate and analyze how operators, ship bridge hardware and software, and the surrounding physical environment work together. We propose a process for evaluating the usability of complex systems: field observation and interviews to determine how work is organized and executed by human and nonhuman actors and to identify whether additional artifacts are being used to supplement the nonhuman components. The use of such artifacts often indicates usability issues in complex systems.

Keywords

usability, complex systems, CSCW, offshore operations


Introduction

Maritime operations are increasingly complex, involving advancing technologies for oil production, collection, and transportation; fishing; rescue; oil and gas platform construction; and telecommunication. In recent years, human errors have caused many maritime accidents (Røyrvik & Almklov, 2012), as well as accidents in other fields with complex systems such as healthcare, air traffic control, and nuclear power plants (Online course, 2013). Perrow (1999) coined the term “system accidents” to refer to errors caused by inefficiency in system design, ineffectiveness of usability testing, and complexity of operations (see also Mills, 2005). In this paper we consider usability measurement with respect to a ship bridge system as a means to further our understanding of usability in complex systems.

The ship bridge system is geographically distributed, with crewmembers on the ship bridge cooperating with each other and with crewmembers on, for example, oil and gas platforms. Figure 1 shows multiple, distributed interaction interfaces with several interacting screens on a ship bridge. Each maritime task requires crewmembers on the ship bridge to work together on multiple computer subsystems/displays and on added artifacts such as paper-based forms, alarm clocks, and calculators. Another complicating factor is that the ship bridge system consists of subsystems. Coupling mechanisms make these subsystems and operators mutually interactive, which can allow an operating error to propagate from one subsystem or operator to another. All these considerations give the ship bridge system a cooperative complexity that other information technology (IT) applications, such as web-based systems for banking, health care information systems, enterprise resource planning (ERP), and mobile applications, may lack.

In this paper we explore the interaction of the software, the hardware, and the team of operators that comprise a complex ship bridge system. We also consider the role of added artifacts that are not part of the original design of the ship bridge system but which the team of operators uses. As we discuss in this article, these artifacts play an important diagnostic role in the usability of complex systems. The case in this paper focuses on a fundamental offshore operation: dynamic positioning. Dynamic positioning operations, which are used to hold an offshore vessel in the proper place at sea, are a critical part of offshore operations (Det Norske Veritas, 2011), and they represent a critical task in a complex system. A typical dynamic positioning system has six subsystems: control, position reference, maneuvering, power management, environmental reference, and heading reference (IMCA, 2009). To maintain the desired position and heading of a vessel, different operators on a ship bridge have to monitor the position reference subsystem (i.e., the environmental sensors) and control the vessel’s propellers and thrusters (Det Norske Veritas, 2011). A dynamic positioning operation usually needs two operators on the bridge, two crewmembers on deck, and two engineers in the engine room. In such an operation, a team of operators, the ship bridge subsystem, and added artifacts must all work together to accomplish the task. The interactive relations among operators, the ship bridge subsystem, and added artifacts during tasks comprise a “network” (Law, 1992). As Cordella and Shaikh (2003) noted, this is an “actor network” wherein both humans and nonhumans are actors.
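
To give a concrete, if greatly simplified, sense of what the control subsystem does, the sketch below (in Python) illustrates the core idea of dynamic positioning: compare the position reported by the reference subsystem against a setpoint and command the thrusters to cancel the deviation. The gain, coordinates, and function name are our own illustrative assumptions; real dynamic positioning controllers also model wind, waves, and current and are far more sophisticated.

    # Highly simplified illustration of a dynamic positioning control loop.
    # All values are invented for illustration; they are not taken from any
    # vendor system or from the vessel we observed.

    SETPOINT = (0.0, 0.0)  # desired position relative to the work site (meters)
    GAIN = 0.4             # proportional gain (assumed)

    def thruster_command(measured_position):
        """Return an (x, y) thrust demand pushing the vessel back toward the setpoint."""
        error_x = SETPOINT[0] - measured_position[0]
        error_y = SETPOINT[1] - measured_position[1]
        return (GAIN * error_x, GAIN * error_y)

    # Example: the position reference subsystem reports a 2 m drift in x and
    # a 0.5 m drift in the opposite direction in y.
    print(thruster_command((2.0, -0.5)))  # approximately (-0.8, 0.2)

Keeping such a loop running while the sea state, the vessel's loading, and the surrounding operation change is what requires the coordinated monitoring and control described above.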


Figure 1. Modern ship’s bridge, 18 screens in an operation room (Photo: Yushan Pan).

Current usability methods for maritime IT design can be adopted to evaluate maritime offshore operational systems. However, most usability methods have been validated only on simple tasks (Redish, 2007), which makes it difficult to adapt them to complex systems. Some researchers have proposed scenario-based usability frameworks for complex systems (Rosson & Carroll, 2002). Those frameworks offer a non-waterfall evaluation process and are meant to analyze individual tasks and subtasks by identifying and organizing them as successive choices and actions; in effect, they define cooperative work as a hierarchical structure, which may not be apt for a complex maritime system. For example, the process of operating banking and ERP systems is treated as hierarchical (Rosson & Carroll, 2002); however, as Sørgaard argued, cooperative work is nonhierarchical: “Cooperative is to work or act together for a shared purpose. The work is done in an informal, normally flat organization” (1987, p. 3).

We acknowledge this argument and recognize that the cooperative relations between computers’ work and humans’ work are also important. This relationship has a nonhierarchical, flat network structure (Whittle & Spicer, 2008). Computer functions become meaningful in interacting with humans during task performance; in turn, humans’ minds adapt to computer functions during operational tasks (Svanæs, 2013). The complex system consists of a network of humans and nonhumans, which also includes added artifacts. In this case study, we consider operators, complex subsystems, and added artifacts as equally essential to the accomplishment of a task. Therefore, we understand usability as a connection with a team of operators, assisted by hardware and software. This process represents the structuring of work via interactive relations, crossing from humans to nonhumans. Given this complexity, usability problems may not reflect the failure of an individual human or nonhuman within a group, but rather problems in the interactive relationships between them.

Usability analysis has traditionally been based on scenarios (Redish, 2007). But it is difficult to simulate a rich, complex environment with a clear boundary to formulate a scenario for operations on ship bridges. Other researchers have documented the limitations of scenarios in relation to, for example, train operators’ work studies in Sweden (Olsson & Jansson, 2007, cited by Redish, 2007). Scholtz (2006), focusing on metrics in complex systems, defined complex environments as visual and analytic, and suggested that evaluating them using the concepts of situation awareness, collaboration, interaction, creativity, and utility illuminates their functioning. Other researchers have proposed that cooperation between usability and domain experts will illuminate complex systems (Chilana, Wobbrock, & Ko, 2010; Howard & Greer, 2011). However, we believe that the proposed metrics are of limited utility in evaluating ship bridge systems, as they do not encompass the complex social context and behavior involved. This context includes how cooperative work in a team influences operations in complex systems and, in turn, how complex systems affect the process of a team’s cooperative work. Research on these aspects has been limited. Further, as Chilana, Wobbrock, and Ko (2010) argued, domain experts may provide biased understandings of complex IT systems because they are so familiar with those systems that they may lose their ability to see problems. Jarrett (2007) suggested that usability experts conduct fieldwork and trap cases in order to examine systems that include complex IT systems (cited by Redish, 2007). It is difficult to simulate an offshore oilfield work environment, for example, that includes changing wind direction, unstable waves, the social and organizational contexts of operators, the complex software and hardware systems, and any added artifacts. These components, in combination, affect the performance of both the ship bridge subsystem and humans during operations. They could invalidate both a simulated environment and controlled laboratory experiments (Forsythe, 1999).

We propose that usability expand its scope to include evaluations of complex systems in which a team of geographically distributed operators cooperates to accomplish a common task by using nonhuman actors. Such an extended scope will allow for new ways of examining interactive relationships between humans and nonhumans. The present study draws on such ideas and provides a data analysis, inspired by Actor–Network Theory (ANT; Law, 1992), of a complex system in an effort to contribute to the usability community. Hence, our guiding question is “How can we evaluate usability issues for a broader consideration of complex systems wherein a team of operators works with a ship bridge system?”

In this case study we used an ethnographic approach to understand the usability problems of ship bridge systems. We did so by systematically observing, as participants, the work of a team of operators in an offshore oilfield. We examined the interaction (Schmidt, 1994) between humans (operators) and nonhumans (hardware, software, and added artifacts) by asking who was doing what, where, when, by what means, and under which requirements. We aim to illustrate a process for discovering usability problems, creating a larger picture of complex systems as environments. We believe this will help researchers and practitioners understand interactions between humans and nonhumans more thoroughly and enable superior design of digital environments for teams of operators who work in complex systems.

This paper is organized as follows: The literature review briefly presents work on usability evaluation in complex systems, with a particular focus on cooperative work, and gives a brief overview of ethnography and theory-based usability measurement for complex systems. Next, a task from an offshore operation is presented; this task illustrates interactions in maritime operations in order to uncover usability problems. The discussion of findings shows how we interpret fieldwork data as a way of thinking about interactive relationships in dynamic positioning. We conclude with tips for usability researchers interested in evaluating the usability of complex systems in similar fields.

Literature Review

Recent research on usability evaluation for what some authors consider complex systems focuses on evaluating system segments or single-user systems. For example, Oja (2010) used 10 usability heuristics to evaluate single-user human–computer collaboration in complex systems. Papachristos, Koutsabasis, and Nikitakos (2012) proposed a usability framework for ship bridge evaluation; their framework focused on part of a bridge’s distributed layout and a workstation for a navigation scenario. Georgoulis and Nikitakos (2012) argued that adding more scenarios related to navigation would enrich understanding of the usability of navigation systems and provide a better approximation to reality. Bjørneseth, Dunlop, and Strand (2008) addressed individual interactions during dynamic positioning operations by discussing usability testing of paper prototypes, highlighting many issues of relevance for the present study, such as individual operators’ interactions with ship bridge systems. However, Bjørneseth et al. did not focus on the entire dynamic positioning operation.

Other researchers have suggested ways to expand usability, such as scenario-based usability testing (Rosson & Carroll, 2002), thinking aloud (Rosson & Carroll, 2002; Scholtz, 2006), situation awareness (Redish, 2007; Scholtz, 2006), and building simulations (Redish, 2007; Rosson & Carroll, 2002). These articles, however, base their insights on actions reported by users rather than on observations (Scholtz, Antonishek, & Young, 2004a, 2004b; Scholtz, Morse, & Potts Steves, 2006). Others evaluated existing data from experiences and activities self-reported during evaluation (Scholtz et al., 2006). Forsythe (1999) argued that these methods describe human problem-solving experiences and therefore add to the general understanding of how the human mind works. However, as Blomberg, Burrell, and Guest (2003) argued, reported behavior can be unreliable, for a host of reasons. Further, even laboratory tests may not provide a full picture of usability (Bødker & Sundblad, 2008; Redish, 2007). Hollan, Hutchins, and Kirsh (2000) presented a theory of distributed cognition to understand interactions among people and technologies, pointing out the importance of testing usability in the field in relation to “a complex networked world of information and computer-mediated interaction” (p. 174).

Ethnographic studies are a type of field study that bases findings on observed rather than reported behavior; several scholars have pointed out the need for such studies to be systematic and theory based (Forsythe, 1999; Suchman, 2007). Ethnographic approaches are useful in research on human-computer interaction because they support a network-based analysis of interactive relations. They may also help balance the biases of domain experts conducting usability evaluations (Ryan, van Schyndel, & Kitchin, 2003) and of interviews based on scenarios (Forsythe, 1999).

Savioja and Norros (2013) proposed a theory-based method, called “systems usability,” for evaluating tools used in nuclear power plant operations. Researchers use activity theory to analyze different levels of operations and actions of individual users in those operations. This approach seeks to cover a system’s overall meaningful role in an activity and considers sub-activities in a hierarchical relationship. Savioja and Norros recognized that mediation by computer tools involves both instrumental and psychological functions in communication. They created a questionnaire based on their understanding of those activities to investigate cooperative work through collections of individual activities. Their use of activity theory may be restricted because some activities can only be understood in relation to others (Kaptelinin & Bannon, 2011). Savioja and Norros’s (2013) understanding of cooperative work differs from that of others in the CSCW community. For example, Schmidt and Bannon (1992) defined cooperation as dependency in work, and their understanding of systems includes people working together to accomplish a task. That insight applies to work in the maritime domain because every offshore operation needs a team of distributed operators working cooperatively with the ship bridge system.

Gutwin and Greenberg (2001) developed a conceptual framework for evaluating a team of operators in complex IT systems, applying it to groupware. Their framework covers four “discount” usability methods: heuristic evaluation, walkthroughs, usability testing through observations, and questionnaires. Multiple evaluators engage in the evaluation process. However, as Pinelle and Gutwin (2001) argued, relying on multiple evaluators and task-based analysis alone may decrease the effectiveness of this method. As Schmidt (1994) argued, with respect to mutual critical assessment in a team of operators and computer-mediated work:

Different decision makers will typically have preferences for different heuristic (approaches, strategies, stop rules, etc.). Phrased negatively, they will exhibit different characteristic ‘biases’. By involving different individuals, cooperative work arrangements in complex environments become arenas for different decision making strategies and propensities where different decision makers subject the reliability and trustworthiness of the contributions of their colleagues to critical evaluation… (p. 350).

This statement calls into question the value of using multiple evaluators in cooperative work. While researchers question the use of multiple evaluators, Bødker and Sundblad (2008) challenged designers to involve all stakeholders in the design process to increase a system’s usability. Including all stakeholders would represent a shift toward understanding the process in an actor network by considering the impact of technologies on human practices. In offshore operations, when a vessel approaches oil- and gas-gathering platforms, humans (operators), subsystems (dynamic positioning), and artifacts (paper-based forms, alarm clocks, and calculators) must work together to make the tasks successful. Therefore, analyses that involve the interactive relationships in a larger system help illuminate a work environment in terms of a better understanding of usability issues.

This literature review has aimed to sketch what we believe is a broad perspective on complex systems. As stated above, after-the-fact evaluation by users provides an incomplete picture. In this case study we examine the interactive relationships that structure the work situation of an offshore operation at sea as a network. This network is dynamic in that it changes depending on the task. As the next section illustrates, the shape of this network reflects the interactive relationships between humans and nonhumans and determines the network’s functionality.

Fieldwork Findings: A Situation From an Offshore Operation

This section describes an offshore dynamic positioning operation based on our fieldwork, in order to address how we can consider usability issues for a work environment in which a team of operators works with the hardware and software of a ship bridge system and with the other artifacts they add to it. The dynamic positioning operation is a fundamental function for all types of ship bridge systems, designed to maintain safety through effective positioning. The complexity that operators face includes the working conditions imposed by waves and winds, the threat of a collision between the vessel and the oil platform, the problem of keeping the vessel balanced by means of its water and mud containers, and the positioning of supplies on deck.

Our fieldwork was conducted on an offshore vessel in the North Sea. We observed dynamic positioning operations 20 times over the course of seven consecutive days of offshore work with two teams on the same vessel. Each team had six people working on offshore operations, with a shift change every six hours. The operators that we observed worked in distributed locations. For example, an operator and the first officer were on the bridge, the engine operator was at the bottom of the vessel, other crewmembers worked outside the bridge on deck, and the platform operator was on the oil and gas platform rather than on the ship. All these operators worked together.

The following excerpt from a dialogue among dynamic positioning operators illustrates the interdependence of their work and the fact that operators adapt to the subsystems in order to operate them. Usability problems are identified when the operators have to add artifacts that should have been part of the dynamic positioning subsystem.

Operator 1 on the ship bridge (O1): Please report the weather, first officer (verbal communication).

A loud noise comes from the bottom of the vessel (the engine room), so the first officer is unable to hear clearly what the operator is saying.

First officer (1O): Hah (throat-clearing) … repeat, please.

O1: Check the wave and wind.

This time, Operator 1 speaks loudly. Then the first officer walks to a computer that is not part of the dynamic positioning system, turns on the display, and pulls up a weather report page. The weather report is important because if the waves and winds are too high, all operations have to be postponed. However, such a weather report page is not part of the dynamic positioning subsystem.

1O: Wave is three; wind is six.

As he speaks, he logs the information on a checklist, including the current time and place, and brings it to the operator.

O1: Okay. Thanks.

Operator 1 then picks up one of the communication channels and speaks:

O1: Engine room, report the engine status?

Engine room operator (E; responding via the audio communication channel): Power is okay.

O1: Report containers (the containers are at the bottom of the vessel; engine officers usually take care of them). Please report [them] to me.

We observe that the pressure of the piping tube is shown only on a ship bridge display that is too far away for Operator 1 to read. This is a significant limitation of the ship bridge subsystem. Two added artifacts, an alarm clock and a calculator, have to be used. This also increases risk because there is no double-check procedure by another operator.

E: Pipe pressure is okay.

At the same time, the first officer walks back to the dynamic positioning subsystem and sits on the duty operator’s chair. He looks at the screens above his head and says:

1O: Pressure is okay.

E: [Container] one, two, three, and four are okay, but you need to pipe out water from five and take some mud from five to six.

Our observations reveal that engine crews must always check the balance when conducting dynamic positioning operations to avoid capsizing due to unbalanced weight on either side.

O1 (talking to the platform operator via the audio communication channel): I need to prepare for a while. I need to balance the ship before I operate.

Platform operator (P): Okay. Call me back when you are ready.

O1: Thanks.

1O: Deck crew (talking via the communication channel).

Deck crew (D): Yes.

1O: Open the valve of container five.

Two deck crewmembers walk on deck and open the valve of the container on the left side of the vessel. Then Operator 1 records all information on the checklist that the first officer gave him. Next, Operator 1 calculates roughly how much mud he needs to move to container six (see Figure 2). When he finishes his calculation, he fills out a paper form and delivers it to the first officer to sign.


Figure 2. Calculator (left) and alarm clock (right). (Photo: Yushan Pan)

Operator 1 then sets a five-hour alarm on the clock (see Figure 2) before he pipes out water from container five. Operator 1 operates the container systems to pipe water and stops when he hears the alarm from the clock. He sets another 15-minute alarm and starts to move mud. After that, he asks the first officer again:

O1: Officer, check the wave and wind again.

The first officer walks to the computer again, checks and writes down the information, and reports it to Operator 1. He then writes the information down again and brings it to the operator.

1O: Wave [is] three. Wind [is] five.

O1: Engine, are the containers okay?

E (via the communication channel): Checked, [they are] okay.

O1 (via the communication channel): Deck crew. Report the position when I am doing the dynamic positioning.

Deck crew (D): Okay.

O1 (via the communication channel): Platform, could you put down the crane? I can’t see it.

Operator 1 turns off the communication channel and speaks to the first officer.

O1: Can you have a look?

1O: Okay.

The first officer stands up and looks outside to find the position of the crane on the platform (see Figure 3). Then gesturing, he guides Operator 1 to position the vessel.

During the dynamic positioning operation, the deck crew also reports information to the operator on the bridge because it is hard to observe risks on a 100-meter-long vessel if, for example, it is too close to the platform.

 

Discussion of Findings

In analyzing the dynamic positioning operation described above, we found that some information that operators want or need was hard to read from the display, including how much water should be piped out and how much mud should be moved to balance the vessel. We confirmed this finding in an interview with operators. This example illustrates that the deficiencies an operator experiences in the hardware and software may reflect the fact that the subsystem was tested before its assembly on the ship bridge and that operators were trained only a few times with simulators. Usability testing could also have been conducted after the dynamic positioning subsystem was developed. In actual operation, however, operators adjust their behavior to compensate.


Figure 3. First officer looking outside to help the dynamic positioning operator position the vessel under the middle of the crane on the platform (Photo: Yushan Pan).

Operator 1’s general sense of how the subsystem functions prior to our observation contrasted with his opinion after our observation, which may reflect more contemplation on his part after the task. Scholtz et al. (2004a, 2004b) developed a situation awareness assessment to address this dynamic, examining how an individual processes information (perception), how an individual uses and combines information to determine goals (comprehension), and how an individual anticipates future situation events and dynamics (projection; Endsley, 2000); nonetheless, relying on self-reports may be misleading. Further, self-reports limit researchers’ understanding of the dynamics of interactions among human and nonhuman actors. For example, the operator is also the captain of the vessel, which means he is legally responsible for crew and ship safety. We believe his statements were not biased, because he signed a consent form before we started our observations: the form clearly states that his comments are protected by the privacy regulations of Norway, the fieldwork was approved by Norsk samfunnsvitenskapelig datatjeneste (Norwegian Social Science Data Services), and he has the right to withdraw his statements whenever he sees fit. His comments prior to our observation would likely resemble what a questionnaire administered by the maritime industries would elicit; later on, he expressed more confidence.

According to Schmidt (2002), awareness in cooperative work exists in connection with action. As we observed, operators on the ship bridge knew what their peers were actually doing because they made it known. The operator and the first officer knew the setting, understood the processes and issues, and knew what could happen during dynamic positioning operations. They did not want accidents to occur. The first officer knew that the operator could not see the pipe pressure on the display because it was too far from him. Therefore, he reported this information when he believed that the operator might need it. During our observation, the operator never asked for it; the first officer exhibited what Rønby Pedersen and Sokoler (1997) called “mutual awareness.” In a similar manner, the actors’ behavior in reporting pipe pressure, looking for the position of the crane, and checking the waves and the winds reflects mutual awareness of another actor’s needs. This type of awareness during complex systems’ operations is not a process involving collective situations of individual work, but one of being aware of a particular work procedure (Schmidt, 2002). Self-reports may not be able to capture this type of dynamic.

Our intention is to shed light on usability problems via the work that takes place in the field. Thus, we aim to interpret and understand the context of the ship bridge system and the process whereby it influences and is influenced by work practices. We believe this approach may be a path to illuminating usability problems through human sense-making, a process in which humans figure out which objects to use, based on their work experience and on changing situations in work practices, in order to successfully complete tasks (Kaplan & Maxwell, 1994). We found that all operators, crewmembers, officers, systems, and other physical artifacts work cooperatively toward the goal when using the dynamic positioning subsystem. Further, each actor had distinct functions. Hence, we evaluated the complex system by following the interactions in a network during operational tasks.

A Network Is Built Through Network-Based Analysis

A team of operators working with the ship bridge system brings multiple added artifacts into the work. Thus, we understand humans and nonhumans to be equal: all actors (humans and nonhumans) have their respective roles in each specific activity. Added artifacts can also influence usability problems on ship bridge systems. In the situation presented in this paper, the operator, the first officer, and other crewmembers work with the ship bridge system together. A dynamic positioning operation requires recording all information that the operator needs to transmit across the communication channel. The recorded information represents a core point of the checklist, which has to be filled in for each dynamic positioning operation and which should be present during the whole offshore operation. The status of the containers should be reported to ensure that the vessel maintains a good balance during the operation. Additionally, the engine status has to be reported; otherwise, the vessel may approach too close to the platform, which increases the risk of colliding with the oil platform due to inertia. The containers’ pressure should be displayed in the ship bridge systems in order to pipe water or mud without problems. These inherent attributes of the dynamic positioning system require changes in work practice in that they introduce work tasks for the operator, crewmembers, and first officer, both on deck and in the engine room. The operator has to connect with engine and deck crewmembers over the communication channels to obtain the information required for dynamic positioning operations. Crewmembers in the engine room and on deck, the first officer, and the operator work together to maintain these lines of communication; for example, when checking the distance between vessel and platform, it is important to check the engine. The operator records this information on the checklist before starting the dynamic positioning operation. The first officer has to connect with the deck crew to open the container’s valve on deck and to assist the operator in monitoring the crane on the platform. Hence, the checklist in the dynamic positioning operation links to all other actors (humans and nonhumans) in different locations, as well as to artifacts; such links build up the dynamic positioning operation as a network.
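
To make this analytical move concrete, the sketch below (in Python) shows one way to encode the fieldwork data as a network: nodes are human and nonhuman actors, edges are the interactive relations we observed, and actors marked as added are those that are not part of the designed ship bridge system. Under the argument above, every added actor that attaches itself to the network signals a usability problem in the designed system. The actor names and relations summarize our field notes; the representation itself is only an analytical aid, not a tool used on board.

    # A network-based reading of the dynamic positioning operation.
    # "added" marks actors that are not part of the designed ship bridge system.

    actors = {
        "operator":          {"kind": "human",    "added": False},
        "first_officer":     {"kind": "human",    "added": False},
        "engine_crew":       {"kind": "human",    "added": False},
        "deck_crew":         {"kind": "human",    "added": False},
        "platform_operator": {"kind": "human",    "added": False},
        "dp_subsystem":      {"kind": "nonhuman", "added": False},
        "comm_channel":      {"kind": "nonhuman", "added": False},
        "weather_computer":  {"kind": "nonhuman", "added": True},
        "checklist":         {"kind": "nonhuman", "added": True},
        "calculator":        {"kind": "nonhuman", "added": True},
        "alarm_clock":       {"kind": "nonhuman", "added": True},
    }

    # Interactive relations observed during the operation (undirected edges).
    relations = [
        ("operator", "dp_subsystem"),
        ("operator", "comm_channel"),
        ("operator", "checklist"),
        ("operator", "calculator"),
        ("operator", "alarm_clock"),
        ("operator", "first_officer"),
        ("first_officer", "weather_computer"),
        ("first_officer", "checklist"),
        ("engine_crew", "comm_channel"),
        ("deck_crew", "comm_channel"),
        ("platform_operator", "comm_channel"),
    ]

    def added_actors(actors, relations):
        """Return each added actor and the designed actors it attaches to."""
        report = {}
        for a, b in relations:
            for candidate, neighbour in ((a, b), (b, a)):
                if actors[candidate]["added"] and not actors[neighbour]["added"]:
                    report.setdefault(candidate, set()).add(neighbour)
        return report

    for artifact, attached_to in added_actors(actors, relations).items():
        print(f"Usability signal: '{artifact}' expands the network via {sorted(attached_to)}")

Running the sketch lists the checklist, the calculator, the alarm clock, and the weather computer together with the designed actors they attach to, which mirrors the expansions of the network discussed below.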

The relationships between human and nonhuman actors are important (Callon, 1986; Latour, 1996). In one example in our study, an operator uses his clock to set an alarm to ensure vessel safety, which expands the interactive relationship. Similarly, the first officer has to stand near the window (see Figure 3) and tell the platform crewmember to put down the crane when the checklist is finished. Neither of these actions is part of the ship bridge subsystem itself or of the onshore training course, because simulators do not require calculating the quantity of water and mud needed to balance the vessel. Only field observation reveals the usability problem in the system that these modifications solve. They reveal that the designed subsystem and these modifications work together to finish a task. Such usability problems occur not because of the system itself, but because of the complex operating procedures and the interactive relations between humans and nonhumans.

Thus, the clock becomes part of the network; the first officer waits for the alarm, and the operator asks him to help find the crane due to safety considerations. When we informally interviewed the operator after the participant observation, he explained:

Since the dynamic positioning system cannot show this information [quantity of water and mud] and because I am not a mathematician [a long laugh], I don’t know when to stop piping. I also can’t remember when I started piping. So I use the clock as a memory alarm to tell myself what I have done and what I need to do.
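
The missing computation the operator describes is simple; what the subsystem lacks is a way to present it. A minimal sketch of the arithmetic that the calculator and alarm clock stand in for is shown below. All volumes and pump rates are invented for illustration and are chosen only so that the resulting durations match the five-hour and 15-minute alarms we observed.

    # Hypothetical ballast-transfer estimate, illustrating the arithmetic the
    # operator performs with a pocket calculator. All figures are assumptions
    # made for illustration; they are not taken from the observed operation.

    WATER_TO_PIPE_OUT_M3 = 150.0      # assumed excess water in container five
    MUD_TO_MOVE_M3 = 12.0             # assumed mud to shift from five to six
    WATER_PUMP_RATE_M3_PER_H = 30.0   # assumed pump rate for water
    MUD_PUMP_RATE_M3_PER_H = 48.0     # assumed pump rate for mud

    water_hours = WATER_TO_PIPE_OUT_M3 / WATER_PUMP_RATE_M3_PER_H   # 5.0 hours
    mud_minutes = MUD_TO_MOVE_M3 / MUD_PUMP_RATE_M3_PER_H * 60      # 15 minutes

    # The dynamic positioning subsystem shows neither the quantities nor the
    # progress of the transfer, so the operator sets an alarm clock for each
    # duration instead.
    print(f"Set alarm: pipe out water for {water_hours:.1f} hours")
    print(f"Set alarm: move mud for {mud_minutes:.0f} minutes")

Had the subsystem displayed these quantities and the elapsed transfer time, neither the calculator nor the alarm clock would have needed to enter the network.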

The dynamic positioning system is just a fragment of offshore operations; the ship bridge system may have many such expansions. When the platform pipes oil to the vessel or exchanges mud at the same time during a maritime operation, the operator has to maintain the dynamic positioning operation for a long time because the waves and the winds may make the vessel lose balance. The alarm is a tool that may help align the tasks of the first officer and other crewmembers with those of the operator, because they all are in a ready mode, waiting for inquiries from the operator. Evaluations of the network need to encompass such dynamics. Hence, we posit that usability issues accrue because the built network requires the added artifacts. Such added effort can come from a human or a nonhuman, but it must be incorporated into the network to ensure the safe accomplishment of the work. Thus, we are able to draw conclusions about usability problems for a larger picture of complex systems by identifying the need to add nonhuman or human actors to the network.

Conclusions and Future Work

In this paper, we argue that traditional usability methods fall short when evaluating complex systems, such as maritime operating systems, because these systems are so complex that field observation and testing must be used to grasp all the actors and situations that occur. We have demonstrated how to treat interactions in complex systems as an actor network, that is, one in which human and nonhuman actors work together to accomplish a task. We propose the following three steps to understand complex systems for better identification of potential usability problems:

  • Step one: Visit the field to understand how a team of operators works with a complex system. The aim is to gain an understanding of the relationship between humans and nonhumans.
  • Step two: Through observation and interviews, figure out how the work is organized and executed and identify the added actors that an individual or a team brings into the teamwork.
  • Step three: Confirm with the team that the added actors are necessary. If they are, they point to usability problems. In these analytical processes, we believe that a network-based approach should be used to understand the meaning of fieldwork data.

Our own future empirical studies will use a network approach, and we urge others to use it to evaluate complex systems.

Tips for Usability Practitioners

The following tips can help practitioners who plan to undertake usability studies of complex cooperative systems:

  • Researchers should examine the cooperative work of a team of operators; in turn, this can contribute to an understanding of a system’s usability as a whole.
  • Different usages of the concept of “cooperative work” in complex systems bring about different understandings, which have implications for the evaluation of such systems. Hence, we suggest going beyond individual work and beyond even assembling insights from individual work. Studying complex systems requires acknowledging the workings of operators, subsystems, and perhaps added tools.
  • Perceiving usability as interaction will help researchers understand how operators, subsystems, and added tools are connected in the field. Understanding the ecology of this relationship will render better knowledge of interactive relationships among these components.
  • A network-based approach to data analysis can build knowledge about interactions in complex systems. By identifying abnormal operations that may expand the network, we can learn more about usability problems. Moreover, the examination does not always have to include the whole system; practitioners can examine a small piece of the network to investigate usability issues in a single case. But those smaller studies must always be interpreted within the context of the broader network.

Acknowledgements

The authors would like to thank Captain Arne, First Officer Andreas, the crewmembers onboard, and Mr. Ole Andreas Holm for the field study. The authors would also like to thank Dr. Alma Culén, Dr. Kjetil Nordby, Dr. Tone Bratteteig, Dr. Hans Petter Hildre, and the anonymous reviewers for their insightful comments. The first author would like to thank the third author for her valuable knowledge about Computer-Supported Cooperative Work, ethnography, and Actor-Network Theory. The first author also thanks the second author for his knowledge about maritime operations, ethnography, situation awareness, and the strategy of the fieldwork. This paper builds on the first author’s recent PhD study. This research is funded by The Research Council of Norway.

References

Bjørneseth, F. B., Dunlop, M. D., & Strand, J. P. (2008). Dynamic positioning systems: Usability and interaction styles. Proceedings of the 5th Nordic Conference on Human-Computer Interaction (pp. 43–52). Lund, Sweden.

Blomberg, J., Burrell, M., & Guest, G. (2003). An ethnographic approach to design. In J. A. Jacko & A. Sears (Eds.), The Human-Computer Interaction Handbook (pp. 964–986). Hillsdale, NJ, USA: L. Erlbaum Associates Inc.

Bødker, S., & Sundblad, Y. (2008). Usability and interaction design – new challenges for the Scandinavian tradition. Behaviour and Information Technology, 27(4), 293–300.

Callon, M. (1986). Some elements of a sociology of translation. Cambridge, MA: The MIT Press.

Chilana, P. K., Wobbrock, J. O., & Ko, A. J. (2010). Understanding usability practices in complex domains. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2337–2346). Atlanta, GA, USA: ACM.

Cordella, A., & Shaikh, M. (2003). Actor network theory and after: What’s new for IS research. European Conference on Information Systems. Naples, Italy.

Det Norske Veritas (2011). Dynamic positioning systems – operation guidance. Internal report.

Endsley, M. (2000). Theoretical underpinnings of situation awareness: A critical review. In M. R. Endsley & D. J. Garland (Eds.), Situation Awareness Analysis and Measurement (pp. 3–28). Mahwah, NJ, USA: Lawrence Erlbaum Associates, Inc.

Forsythe, D. (1999). It is just a matter of common sense: Ethnography as invisible work. Computer Supported Cooperative Work, 8(1–2), 127–145.

Georgoulis, G., & Nikitakos, N. (2012). Bridge ergonomics and usability of navigational system as a safety and quality feature. Proceedings of the IMLA 16th Conference on MET (pp. 241–263). Izmir, Turkey.

Gutwin, C., & Greenberg, S. (2001). A descriptive framework of workspace awareness for real-time groupware. Computer Supported Cooperative Work, 11(3), 411–446.

Hollan, J., Hutchins, E., & Kirsh, D. (2000). Distributed cognition: Toward a new foundation for human-computer interaction research. ACM Transactions on Computer-Human Interaction, 7 (2), 174–196.

Howard, T., & Greer, M. (2011). Innovation and collaboration in product development: Creating a new role for usability studies in educational publishing. In M. J. Albers & B. Still (Eds.), Usability of complex information systems: Evaluation of user interaction (pp. 67–88). Boca Raton, FL, USA: CRC Press, Taylor & Francis Group.

IMCA (2009). The International Guidelines for the Safe Operation of Dynamically Positioned Offshore Supply Vessels. Retrieved from http://www.amcsearch.com.au/wp-content/uploads/sites/7/IMCA-182-International-Guidelines-for-The-Safety-Operation-of-DP-Offshore-Supply-Vessels1.pdf

Kaplan, B., & Maxwell, J. A. (1994). Qualitative research methods for evaluating computer information systems. Evaluating Health Care Information Systems: Methods and Applications, 45–68.

Kaptelinin, V. & Bannon, L. J. (2011). Interaction design beyond the product: Creating technology-enhanced activity spaces. Human–Computer Interaction 27(3), 277–309.

Latour, B. (1996). On actor-network theory. A few clarifications plus more than a few complications. Soz. Welt, 47(4), 369–381.

Law, J. (1992). Notes on the theory of the actor-network: Ordering, strategy, and heterogeneity. Systems Practice 5(4), 379–393.

Mills, S. (2005). Designing usable marine interfaces: Some issues and constraints. Journal of Navigation 58(1), 67–75.

Oja, M-K. (2010). Designing for collaboration: Improving usability of complex software systems. CHI ’10 Extended Abstracts on Human Factors in Computing Systems (pp. 3799–3804). Atlanta, Georgia, USA: ACM.

Online course (2013). User error: Who is to blame? Interaction Design Foundation. Retrieved December 2013 from https://www.interaction-design.org/courses/psychology_of_interaction_design-_the_ultimate_guide.html 

Papachristos, D., Koutsabasis, P., & Nikitakos, N. (2012). Usability evaluation at the ship’s bridge: A multi-method approach. 4th International Symposium on Ship Operations, Management and Economics. Athens, Greece.

Perrow, C. (1999). Normal accidents: Living with high-risk technologies. Princeton, NJ: Princeton University Press.

Pinelle, D., & Gutwin, C. (2001). Group task analysis for groupware usability evaluations. Proceedings of the Tenth IEEE International Workshops on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET ICE 2001) (pp. 102–107). Cambridge, MA.

Redish, G. (2007). Expanding usability testing to evaluate complex systems. Journal of Usability Studies, 2(3), 102–111.

Rønby Pedersen, E., & Sokoler, T. (1997). AROMA: Abstract representation of presence supporting mutual awareness. CHI’97 (pp. 51–58). NY, USA: ACM Press.

Rosson, M. B., & Carroll, J. M. (2002). Usability engineering—Scenario-based development of human-computer interaction. San Francisco, CA: Morgan Kaufmann.

Røyrvik, J., & Almklov, P. G. (2012). Towards the gigantic: Entification and standardization as technologies of control. Culture Unbound 4, 617–635.

Ryan, C., van Schyndel, R. G., & Kitchin, G. (2003). Design and usability evaluation of a GAAP flight progress monitoring system in a simulated air traffic environment. Proceedings of the Australasian Computer-Human Interaction Conference (OzCHI03).

Savioja, P., & Norros, L. (2013). Systems usability framework for evaluating tools in safety–critical work. Cognition, Technology & Work, 15(3), 255–275.

Schmidt, K. (1994). Cooperative work and its articulation. Le Travail Collectif – Travail Humain, 57(4), 345–366.

Schmidt, K. (2002). The problem with ‘awareness’: Introductory remarks on ‘awareness in CSCW’. Computer Supported Cooperative Work, 11, 285–298.

Schmidt, K., & Bannon L. J. (1992). Taking CSCW seriously: Supporting articulation of work. Computer Supported Cooperative Work, 1(1), 7–40.

Scholtz, J. (2006). Beyond usability: Evaluation aspects of visual analytic environments. IEEE Symposium on Visual Analytics Science and Technology (pp. 145–150). Baltimore, USA.

Scholtz, J., Antonishek, B., & Young, J. (2004a). Evaluation of a human-robot interface: Development of a situational awareness methodology. Proceedings of the 37th Annual Hawaii International Conference on System Sciences, (pp. 1-9), Hawaii, USA.

Scholtz, J., Antonishek, B., & Young, J. (2004b). Implementation of a situation awareness assessment tool for evaluation of human-robot interfaces. IEEE Transactions on Systems, Man, and Cybernetics, Part A, 35(4), 450–459.

Scholtz, J., Morse, E., & Potts Steves, M. (2006). Evaluation metrics and methodologies for user-centered evaluation of intelligent systems. Interacting with Computers, 18(6), 1186–1214.

Sørgaard, P. (1987). A cooperative work perspective on use and development of computer artifacts. 10th Information Systems Research Seminar in Scandinavia. Finland.

Suchman, L. (2007). Human-machine reconfigurations: Plans and situated actions. New York, NY: Cambridge University Press.

Svanæs, D. (2013). Interaction design for and with the lived body: Some implications of Merleau-Ponty’s Phenomenology. ACM Transactions on Computer-Human Interaction, 20(1), 1–30.

Whittle, A., & Spicer, A. (2008). Is actor network theory critique? Organization Studies, 29(4), 611–629.