In the last few issues we’ve been looking at automation in the workplace, speculating on its effects now and in the future. We reported estimates that in the next two decades, as many as 47% of the jobs in this country will be candidates for automation. We talked about how technology is changing the workplace, helping humans problem-solve and accomplish tasks more efficiently and cost-effectively. And, computers—which increase our individual capacity to perform discrete skills—have the potential to make us more expert at what we do, even as what we do changes in approach and content.
In my trek through this brave new world, I recently stumbled on two concepts that seemed ripe for investigation: SA (situational awareness) and HMI (human machine interface).
From the military, where the concept first appeared after World War I, reached a zenith among fighter pilots in the Korean and Vietnam wars, and has now become critical in anti-terrorism combat, comes a definition of situational awareness that captures its strength and purpose: “Identify, process, and comprehend the critical elements of information about what is happening to the team with regards to the mission.”
In other words, be aware of what’s happening in your immediate vicinity so you understand how information, events, and your own actions will affect personal and group goals and objectives, now and in the near future. What this definition doesn’t mention is the satisfaction of taking charge of one’s circumstances and influencing outcomes.
How this unfolds is significant. Developing and applying situational awareness is not an exclusively intellectual enterprise. To be effective, it requires our senses—sight, smell, hearing, touch. Once the details of a situation are established (perception), this data is used to understand what’s going on and determine how the circumstances vary from what’s normal (comprehension). This is followed by consideration of what kind of action should be taken, and its potential outcome (projection). In short, SA involves knowing what’s going on around you—being precise as to such specifics as time, place, and environmental details in preparation for taking whatever action is required, if any. It reminded me of the early days of human communication theory, when we were encouraged to imagine one person encoding a message (signal) and dispatching it through a channel (radio or telephone, this being long before the Internet and social media) to another person or persons tuned into the same channel, who snatched it up, decoded it, and made use of what we sent. The confounding element was noise, which could garble a message or affect its accuracy. In electronics, this was static; in human channels, it was personal, institutional, and all varieties of situational factors.
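The perception–comprehension–projection cycle described above can be sketched in code. This is purely an illustration of the three stages, not any standard model or library; all the names, the sample readings, and the “normal range” are hypothetical.

```python
# Illustrative sketch of the three-stage SA loop: perception,
# comprehension, projection. All names and values are hypothetical.

def perceive(environment):
    """Perception: gather the raw details of the situation."""
    return {"time": environment.get("time"),
            "place": environment.get("place"),
            "readings": environment.get("readings", [])}

def comprehend(percepts, normal_range):
    """Comprehension: determine how circumstances vary from normal."""
    lo, hi = normal_range
    deviations = [r for r in percepts["readings"] if not lo <= r <= hi]
    return {"deviations": deviations, "normal": not deviations}

def project(assessment):
    """Projection: decide what action, if any, the situation calls for."""
    if assessment["normal"]:
        return "no action required"
    return f"investigate {len(assessment['deviations'])} abnormal reading(s)"

# One pass through the loop:
env = {"time": "08:00", "place": "plant floor", "readings": [98, 101, 140]}
action = project(comprehend(perceive(env), normal_range=(90, 110)))
print(action)  # investigate 1 abnormal reading(s)
```

The point of the sketch is the ordering: each stage consumes only what the previous one produced, which is why degraded perception (the theme of the later sections) undermines everything downstream.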
Once I got SA straight in my mind, I was impressed with its value. What could be more effective, consoling, and gratifying than to know and understand the dynamics of your workplace and to act on the basis of information rather than on innuendo and speculation?
The concept of the human machine interface developed out of an effort to characterize and address what happens when humans elect, or are required, to adopt technology. On the whole, it has focused on strategies, of mixed success, to make that interaction more effective and less threatening, and to ease the transition from manual processing to a world where technology dominates or, at minimum, lends a hand. A lot of the initial work in HMI centered on design: how do we make technology “easier” for humans to use? (The not-so-subtle implication being that the technology was the value in this equation and the human the liability.)
Unfortunately, much of this work bypassed any assessment and understanding of how humans actually get things done. Reputations were established theorizing about how things should go, rather than analyzing how humans actually “work.” Accordingly, much of the original work focused on the mechanism of the interface—improving dropdown menus on computers, replacing keyboards with touchscreens, flashy dashboard design.
Not everyone jumped happily on the bandwagon. Once it became obvious that theoretical design constructs might not apply as effectively to real-world situations as construction industry software writers had envisioned—and that people weren’t necessarily eager to abandon expertise they’d spent years developing—software providers began working with clients to document existing procedures, so that their software packages reflected what was happening on the ground and the sticky wickets of training could be sorted out. Heavy equipment manufacturers took to offering more operator-centric familiarization with advancements, such as machine control, as part of their sales packages. Training began to take on an expanded role, effectively becoming a substitute for handing down information and expertise from more- to less-experienced employees.
The problem is that the march of technology has accelerated. More and more employees are being asked to influence situations from a position one or more steps removed. The expertise they’re being asked to develop has more to do with watching than doing. They monitor rather than participate. Effectively, what this amounts to is a virtual form of situational awareness, and it’s spawning significant side effects. First, being physically removed from the act of production deprives humans of the sensory assessment tools we have typically relied on—sounds, vibrations, and smells—which have helped us “tune into” equipment and sense process status. Lost also is the opportunity to interact with co-workers and share knowledge based on experience. The job is not so much to complete a task or affect a process as to verify that the software managing the technological resource is operating as planned and, only if it isn’t, to intercede.
Second, the machines with which today’s humans are expected to interface are of a different order and magnitude of complexity. Software makes it possible to link multiple systems, and technology has made it possible for the elements in those systems to generate vast amounts of data. Operators are expected to know multiple systems intimately, internalizing the operational parameters of each, and then apply a kind of virtual SA so as to be immediately able to recognize and act when a system is running off-spec.
This challenge of accessing, comprehending, and acting on increasingly dense and complex data, without physical access to the systems that generate it, has a name—operator overload—and it results in what has come to be regarded as the worst of possible outcomes: human error. Human mistakes are now estimated to account for 42% of abnormal situations in industrial systems. This breakdown at the HMI has spawned a new generation of system experts, busy spinning theories and producing models designed to improve communication between these new, complex systems and the people charged with using them. They’re doing this by engineering alarms and interlocks designed to keep humans on their toes.
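The engineered-alarm approach described above amounts to comparing each monitored parameter against a predefined band and alerting the operator only when a reading drifts off-spec. A minimal sketch, assuming a hypothetical set of parameters and limits (none of these names or values come from any real control system):

```python
# Hypothetical alarm logic: each monitored parameter has an engineered
# band, and the operator sees an alarm only when a reading is off-spec.

ALARM_BANDS = {               # parameter -> (low limit, high limit)
    "boiler_temp_c":  (150.0, 180.0),
    "line_pressure":  (30.0, 45.0),
    "feed_rate_lpm":  (10.0, 12.5),
}

def scan(readings):
    """Return an alarm message for every reading outside its band."""
    alarms = []
    for name, value in readings.items():
        lo, hi = ALARM_BANDS[name]
        if not lo <= value <= hi:
            alarms.append(f"{name} off-spec: {value} (band {lo}-{hi})")
    return alarms

readings = {"boiler_temp_c": 191.2, "line_pressure": 38.0,
            "feed_rate_lpm": 11.0}
for alarm in scan(readings):
    print(alarm)  # boiler_temp_c off-spec: 191.2 (band 150.0-180.0)
```

Note what the sketch makes plain: the operator’s role is reduced to reacting to the output of `scan`. Nothing in the loop tells them why the boiler is hot or what to do about it—which is precisely the complaint the next paragraph takes up.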
The approach is similar to early HMI work: a focus on humans as the cog in the wheel of efficiency and a neglect of how technology affects valuable human capacities. The result is that humans are reduced to a reactive mode while being deprived of traditional forms of job satisfaction. Instead of knowing how to make widgets and being proud of it, the operator must navigate a sea of control loops and process sequences that alert him to situations he can’t prevent and may be ill-equipped to affect.
What used to be a challenge of sorting out the factors that affect communication between managers and worker bees, and among the worker bees themselves, as they address challenges and solve problems, has been reduced to understanding diagrams and flow charts and crying foul when things are out of balance—even though the human running the show might not understand what the system is actually doing or the effects of the announced anomaly. As before, the systems designers are focusing on the mechanisms of the interface—how tasks should be structured, the look of control panels—without substantive analysis of what humans bring to the table.
A situational analysis of this changing workplace—whether the human works on a piece of heavy equipment or in an office on project management or takeoff software—suggests a need for different types of training and employee development programs: programs designed to help employees develop the expertise required to take advantage of these interconnected, data-spewing systems, and to reimagine new modes of job satisfaction. A suggestion I picked up in my wanderings is not hopeful. Acknowledging that experienced system operators “will have memorized system perimeters and familiarized themselves with the expected normals,” one expert suggested that it’s wise to provide new operators with this information upfront.