By David Mendonca
Professor of Industrial and Systems Engineering at Rensselaer Polytechnic Institute
Data is all around us, but we need to understand it in context. This is especially important when the data is associated with hazards and disasters. Without context, we risk drawing false conclusions or missing true ones.
One of the most exciting developments in data in the past 10 to 15 years has been the ability to collect observations on human behavior in detail at a very large scale. Twenty years ago, I was working in New Jersey and doing research on the cleanup of Ground Zero in New York City. Everyone was struggling to understand how to use data to manage this massive operation. Remarkably, and despite the fact that the operation was unprecedented, there were no deaths and no serious injuries.
Fast-forward a few years, and I was working again with many of the same people, this time on understanding how information technology could support post-disaster cleanup (debris removal) after Hurricane Katrina. Debris removal goes largely unnoticed by the public, yet it's enormously expensive, time-consuming, and intrinsically human. There is simply no way to remove debris after a hurricane or tornado other than to rely on legions of people and thousands of trucks and other pieces of equipment to load, haul, and dispose of it.
After Katrina, the federal government was very concerned with waste, fraud, and abuse. They decided to “instrument” the teams that performed the work. We have been lucky to work with them to understand the data they collect, and to offer recommendations on how to use (and improve) that data.
I learned quite a bit from studying a series of tornadoes in Alabama in 2011. Data from those tornadoes, combined with advanced modeling and simulation techniques, has helped me identify implications in three broad areas:
- First, there was considerable variation in team and organizational performance over the lifetime of this mission (and presumably others like it), suggesting the need to monitor and control variability among team members in order to improve performance.
- Second, computer-based simulations suggest that market-based control policies, such as having crews compete for work, are simpler than the policies used on this mission and may yield better, less costly results (see the sketch after this list).
- Third, further instrumentation of the mission ought to yield even more robust models for supporting and evaluating mission performance.
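To make the second point concrete, here is a minimal, self-contained Python sketch comparing two simplified dispatch policies: a centralized "zoned" assignment and a market-style rule in which idle trucks compete for the cheapest available haul. Every number in it (truck count, haul times, pile volumes, load/unload time) is invented for illustration; it is not the mission's data or our validated simulation model.

```python
"""Toy comparison of two debris-removal dispatch policies.

Illustrative sketch only: truck counts, haul times, and pile volumes are
invented, and the two policies are simplified stand-ins for the centralized
and market-based controls discussed in the text.
"""
import heapq
import random

random.seed(42)

NUM_TRUCKS = 10
NUM_PILES = 40
TRUCK_CAPACITY = 20.0  # cubic yards per load (assumed)

# Each pile: one-way haul time to the disposal site (minutes) and remaining volume.
piles = [{"haul_min": random.uniform(10, 60),
          "volume": random.uniform(50, 300)} for _ in range(NUM_PILES)]


def simulate(policy):
    """Run a simple event-driven simulation and return mission length in hours."""
    remaining = [dict(p) for p in piles]             # both policies see identical piles
    events = [(0.0, t) for t in range(NUM_TRUCKS)]   # (time truck becomes free, truck id)
    heapq.heapify(events)
    finish = 0.0

    while events:
        clock, truck = heapq.heappop(events)
        active = [p for p in remaining if p["volume"] > 0]
        if not active:
            continue                                 # this truck is done; others may not be
        if policy == "zoned":
            # Centralized control: each truck works a fixed block of piles.
            zone = [p for i, p in enumerate(remaining)
                    if i % NUM_TRUCKS == truck and p["volume"] > 0]
            target = zone[0] if zone else active[0]
        else:
            # Market-style control: the idle truck "wins" the cheapest open haul.
            target = min(active, key=lambda p: p["haul_min"])
        target["volume"] -= min(TRUCK_CAPACITY, target["volume"])
        cycle = 2 * target["haul_min"] + 15          # round trip plus 15 min load/unload
        finish = max(finish, clock + cycle)
        heapq.heappush(events, (clock + cycle, truck))

    return finish / 60.0


for policy in ("zoned", "market"):
    print(f"{policy:>6} policy: mission length ~ {simulate(policy):.1f} hours")
```

Running both policies over the same set of piles lets you compare mission lengths side by side; with real load-ticket data in place of the invented numbers, the same structure could be used to test far richer control rules.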
These insights, and the improvements in debris removal that could follow from them, were made possible by fundamental advances our research team made.
The use of objective and detailed operational data represents a paradigm change in the study of team-centered organizations, allowing researchers to explore how dynamics and control within a team affect outcomes in ways that questionnaire-based methods cannot.
This research also combines model- and data-driven approaches to create validated models of system response to dispatcher behaviors, which, in turn, allow exploration of counterfactual control policies, something that has traditionally been too expensive to do in practice.
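As a hedged illustration of that combination, the sketch below builds a simple data-driven summary from hypothetical load-ticket records and then replays it under an assumed counterfactual dispatch rule. The ticket fields, the 15-minute load/unload time, and the "route to the nearer disposal site" rule are all assumptions invented for the example, not the project's actual schema, model, or findings.

```python
"""Sketch: exploring a counterfactual dispatch policy from load-ticket data.

All records and rules here are hypothetical; the point is the pattern of
fitting a simple model to operational data and replaying it under an
alternative control policy.
"""
import random
import statistics

random.seed(7)

# Hypothetical load tickets: (truck_id, one-way haul minutes, load in cubic yards)
tickets = [(t % 8, random.gauss(45, 12), random.uniform(12, 22))
           for t in range(500)]


def truck_hours(haul_minutes):
    """Data-driven piece of the model: total truck-hours implied by a set of
    one-way haul times, assuming a fixed 15-minute load/unload per cycle."""
    return sum(2 * h + 15 for h in haul_minutes) / 60.0


observed = [haul for _, haul, _ in tickets]

# Model-driven counterfactual: suppose the dispatcher routes each load to the
# nearer of two disposal sites, which we *assume* caps the longest hauls near
# 1.2x the median haul. An invented rule, used only to show the replay step.
median_haul = statistics.median(observed)
counterfactual = [min(haul, 1.2 * median_haul) for haul in observed]

print(f"observed dispatching:  {truck_hours(observed):7.1f} truck-hours")
print(f"counterfactual policy: {truck_hours(counterfactual):7.1f} truck-hours")
```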
Our new approach brings a wide array of advanced methods to bear on modeling the linkages from team-level behavior to organizational-level performance over time, again opening an important frontier in the study of teams within organizations.