Unintended Consequences: Oops, There's a Bug!

This week's heightened attention to the disclosure that Carrier IQ's mobile software may have captured personal information, including keystrokes and private SMS messages, is only the latest incident of technology "analytics" having bugs or being put to uses not initially contemplated.

For its part, Carrier IQ claims that a bug in its diagnostic tool "may" have allowed the capture of data that was never meant to be captured. While various federal agencies (law enforcement, the FTC, the FCC) weigh how responsible Carrier IQ is for its technology, and conspiracy theorists speculate about FBI use of the service, a larger issue arises from bugs and their unintended consequences.

As companies increasingly use automated analytic tools to evaluate the data they process, and possibly collect, on their customers, what safeguards -- voluntary or regulatory -- should be in place to counter the "oops" factor?

With technology, adding new capabilities onto existing platforms sometimes sets off a chain reaction. Likewise, as systems, software, and hardware are developed, the geeks behind them -- hardly privacy or legal experts -- may install "cool" tools to assist their efforts, only to forget about them as products roll out.
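To make that failure mode concrete, here is a minimal, entirely hypothetical Java sketch of the pattern: a diagnostic hook wired into every input event, gated only by a flag someone forgot to flip before release. The class, flag, and field names are invented for illustration and do not describe Carrier IQ's actual software.

```java
// Hypothetical illustration of a debug tool left enabled in a shipped product.
// All names here (DiagnosticLogger, DEBUG_CAPTURE) are invented for this sketch.
public class DiagnosticLogger {

    // Intended only for internal testing; a release build should have set
    // this to false, but nothing in the build process enforces that.
    private static final boolean DEBUG_CAPTURE = true;

    // Called on every user input event so developers can reproduce bugs.
    public static void onUserInput(String fieldName, String value) {
        if (DEBUG_CAPTURE) {
            // The "oops": passwords and SMS text flow through the same path
            // as harmless diagnostics and end up in persistent logs.
            System.out.println("[diag] " + fieldName + "=" + value);
        }
    }

    public static void main(String[] args) {
        // Ordinary app activity quietly becomes data collection.
        onUserInput("search_query", "nearest pharmacy");
        onUserInput("password", "hunter2");     // captured unintentionally
        onUserInput("sms_body", "see you at 7"); // captured unintentionally
    }
}
```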

What should companies be required to do with information they accidentally gather this way? To date, some companies have maintained that they never knew they were collecting the information and had neither stored nor used it. Others, as in the Carrier IQ case, built technologies for third parties who then had access to and control of the data, complicating the legal and regulatory picture.

Yes, there are general consumer privacy laws in place that allow the FTC (and possibly the FCC) to take action, and there are criminal statutes -- the Electronic Communications Privacy Act, the wiretap laws, the Computer Fraud and Abuse Act -- that may be relevant. But those may have limited value depending on the nature of the company-customer relationship and the terms of service.

The oops factor, I fear, will only grow as a problem in the push for innovation and analytic capability. Yet, for the sake of both security and privacy, it should be addressed. Companies that have taken the lead in addressing their own oops moments could offer valuable lessons to others and could help create a voluntary framework for handling this issue. If it is addressed, we can then distinguish between bad actors who blatantly violate consumer privacy and security expectations and those who, oops, made very human mistakes with very unbending, inhuman technology.