Jane Chappell is the vice president of Raytheon’s Global Intelligence Solutions.
You check your mailbox and there’s a free subscription to a parenting magazine, a sample of baby formula and coupons for store-brand diapers, but you and your spouse just renewed your AARP membership.
Obviously, something has gone wrong: the algorithm that processed trends and habits to build a unique profile revealed its limitations. Perhaps you shopped for baby clothes for a new grandchild, and the resulting consumer data told the retailer you were likely a parent-to-be rather than a retiree. Mailing samples of baby formula rather than coupons for a wine-of-the-month club is costly to both a store’s reputation and its bottom line. But with a human touch, analytics and machine learning have enormous potential to enhance the effectiveness of any organization.
While recent advancements in artificial intelligence and data analytics are impressive, the results of their collective role in our technological age are still being fine-tuned. Even as we approach a more capable version of artificial intelligence, the human-machine partnership is still critical to ensure we avoid mistakes with potentially greater consequences than a misdirected sales flyer. Artificial and human intelligence are not mutually exclusive concepts. The ultimate goal is a human-machine collaboration, enabling us to make better decisions faster.
When we remove the human from the data equation and operate on machine results alone, we end up with bad data. One example of this is the alarm fatigue that occurs at manufacturing plants when thousands of alerts from sensors and systems across a facility overwhelm operators and disguise the real risk that needs immediate attention. When system users review the alarm data without context and analysis, it’s impossible to prioritize actions and make recommendations for maintenance or shutdown.
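The alarm-fatigue problem is, at bottom, a triage problem: the machine can collapse noise and rank what remains, while the judgment call stays with the operator. A minimal sketch of that filtering step, with sensor names, severities, and thresholds invented purely for illustration:

```python
from collections import Counter

# Hypothetical alert records: (sensor_id, severity 1-5, message).
# These names and values are illustrative, not from any real plant.
alerts = [
    ("pump-7", 2, "vibration above baseline"),
    ("pump-7", 2, "vibration above baseline"),
    ("boiler-1", 5, "pressure exceeds safety limit"),
    ("hvac-3", 1, "filter change due"),
    ("pump-7", 2, "vibration above baseline"),
]

def triage(alerts, min_severity=3):
    """Collapse duplicate alerts and surface only the most severe,
    so an operator sees a short, prioritized list instead of a flood."""
    counts = Counter(alerts)  # identical alerts collapse into one entry
    ranked = sorted(counts.items(), key=lambda kv: kv[0][1], reverse=True)
    return [(alert, n) for alert, n in ranked if alert[1] >= min_severity]

for (sensor, severity, message), n in triage(alerts):
    print(f"[sev {severity}] {sensor}: {message} (x{n})")
```

Even in this toy version, the machine only shortens the list; deciding whether the surviving alert warrants a maintenance order or a shutdown remains a human call.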
The sheer volume of data available today often drives organizations to focus solely on the numbers. But if we put too much weight on statistics alone, we may miss critical insights that require careful analysis of the results. According to Gartner, 27 percent of data from top companies worldwide is flawed. In the U.S. alone, decisions made from bad data cost the economy roughly $3.1 trillion each year.
So how do we mitigate these losses? Here are three ways human oversight can multiply big data efficiency and improve analytics and decisions:
- Bring skepticism: Human analysts create algorithms and approach data sets with a skepticism that machines lack. Critically reviewing the data ensures that flaws and information gaps that could skew results are identified. If input data has been coded incorrectly or fields are missing, for example, the automated results may paint a vastly different picture than what’s actually happening within an organization. A critical eye must guide the analysis before conclusions are drawn.
- Validate results: Beyond creating algorithms, data scientists and analysts validate the copious volumes of results those algorithms generate. Having more than one person involved in the process guards against confirmation bias, which occurs when someone interprets results to fit a preconceived assumption. Innovations are turning artificial intelligence and autonomy into advantages on every front, but the machines aren’t the only ones learning. Like technology, data is a tool, and humans must learn how to validate its resulting analysis for it to be most effective.
- Generate conclusions: The final, most critical component of human involvement in the data process is drawing insightful, actionable conclusions. Humans conduct complex, higher-level analysis that data systems cannot yet achieve. This is especially crucial for military leaders conducting operations around the globe. When lives are on the line, big data alone is not enough. Real value will be found in the human-machine partnership to determine an appropriate, measured response.
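The first two practices above can be sketched in code: an automated audit flags suspect rows for a human to inspect, and a simple agreement check between two independent reviewers surfaces possible confirmation bias. All field names, thresholds, and labels here are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical customer records; field names and the plausibility
# threshold on "age" are invented for illustration.
records = [
    {"id": 1, "age": 34, "state": "VA"},
    {"id": 2, "age": None, "state": "MD"},   # missing field
    {"id": 3, "age": 340, "state": "VA"},    # likely miscoded
]

def audit(records):
    """Flag rows a human analyst should review before they feed a model."""
    problems = []
    for row in records:
        if row["age"] is None:
            problems.append((row["id"], "missing age"))
        elif not 0 <= row["age"] <= 120:
            problems.append((row["id"], "implausible age"))
    return problems

def agreement(labels_a, labels_b):
    """Fraction of items two independent reviewers labeled the same way.
    A low score is a cue to revisit assumptions, not to pick a winner."""
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)

print(audit(records))                    # rows 2 and 3 need a human look
print(agreement(["ok", "risk", "ok"],    # two analysts' independent calls
                ["ok", "risk", "risk"]))
```

Neither check replaces the analyst; each simply points human attention at the records, or the disagreements, most likely to hide a skewed conclusion.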
When effectively managed and validated on an ongoing basis, big data can equate to big opportunities. As technology advances to the point where machines can make better decisions independently, those of us who operate them will have to evolve and focus on ensuring the integrity of the process and the big data chain itself. Advanced machine-learning algorithms help businesses make sense of all the data being captured very quickly and efficiently, but ultimately human input and action are required for accurate processing.
As organizations identify more efficient ways to track customers, equipment, performance and energy spend, they must continue to factor in the objective nature of data to achieve the highest, most prudent level of productivity. With massive amounts of data comes massive responsibility, and organizations still need the human touch to take on new challenges for greater insight and effectiveness.