Has modern brain-wave technology made the Thought Police of George Orwell’s dystopian novel 1984 a possible reality? The idea may seem extraordinary, but Nita Farahany, in her presentation at the 2023 World Economic Forum, demonstrated that using AI-powered technologies for brain transparency is not only possible but already here. While this capability offers fascinating opportunities for medicine and how we interact with technology, it also raises serious ethical questions about privacy and the potential for these capabilities to violate civil rights.
These technologies have advanced far beyond what is commonly understood by the American public. While wearables that measure vital signs, sleep patterns, emotional states, and stress levels are already in popular use, artificial intelligence can now reach much deeper, interpreting the brain’s electrical activity through electroencephalography (EEG). These interpretations can transform thoughts taking place in the brain into tangible images with remarkable accuracy: a face, a shape, or even a bank PIN.
An animated video featured in Farahany’s presentation portrays an employee daydreaming romantic thoughts about a co-worker when she is supposed to be focused on work, implying that initial applications might come in the form of job requirements. While such requirements might be considered intrusive, employment is voluntary, which would make brain transparency, in this case, avoidable.
In contrast, later discussion of how this technology could make highways safer suggests that legally mandated applications tied to basic freedoms may be on the table as well. Tracking brain waves can reveal whether a driver is paying attention to the road or getting sleepy and nodding off. Increased regulation of long-haul truckers in the name of safety has become accepted standard practice over the years, which suggests brain transparency could become an additional requirement in that industry.
Precedents already exist that limit or impose upon bodily autonomy as a condition of driving on public roadways. While mandatory seat belt use may seem a relatively benign mandate, subjecting oneself to the possibility of a forced blood draw is an invasive practice performed on those suspected of driving under the influence. How great a leap would it be from these practices to make the provision of brain data or driver eye tracking a requirement for the average citizen to drive? Mandatory technology requirements for manufacturers of new vehicles, such as kill switches, breathalyzer ignition interlocks, and in-cabin surveillance, could easily expand to include apparatus for collecting brain-wave data.
Technology that uses facial recognition and analysis to identify pedestrians and gauge their emotional states is already deployed in airports, on city streets, and in other public spaces. Individuals can be subjected to law enforcement scrutiny or enhanced interrogation simply for having a bad day. If air travel can require intrusive biometric scans or pat-downs, would a brain analysis to detect intentions of terrorism be that much of a stretch?
There are a number of scenarios in which involuntary, or coerced, relinquishment of brain-wave data might occur. How about requirements for professional licensing? Should teachers and daycare workers be subjected to brain-wave analysis to eliminate those with inclinations to pedophilia? The rights of those who are incarcerated are routinely curtailed in the name of security. Should brain-wave technology be employed to further mitigate violence in prisons?
Several states are proactively seeking to address these crucial issues. California, Colorado, and Montana have already passed neural privacy laws, with 15 bills under consideration in six other states. While addressing the issue before these practices become widespread could ultimately prevent abuse, it is arguable whether these laws go far enough.
These preemptive laws lean heavily toward regulating commercial use of this data, with gaping exceptions often allowed for government and law enforcement use when the data is obtained under a warrant. This raises important questions about whether involuntary collection of this data should be permitted under any circumstances, and whether using individuals’ own thoughts against them would violate Fifth Amendment protections against self-incrimination. While protecting consumers from targeted marketing and security breaches is a worthy goal, how much more important is it to protect them from institutions that have been given life-or-death authority over them? Thoughts, after all, are something over which individuals have limited control.
As law enforcement increasingly coordinates with tech companies to obtain users’ personal data, even information exposed in data breaches, legal limits on that access are becoming more urgent. Before the question of whether law enforcement should have unfettered access to private texts and conversations can be resolved, we have already entered new territory with the collection of biometric and brain-wave data. The issue requires not only preemptively regulating access to data a customer has voluntarily given to a tech company for personal use, but also determining whether an individual can be subjected to involuntary data collection, similar to the forced blood draws mentioned above.
While this type of overreach would likely face many legal challenges, the privacy rights of countless people could be violated before the issue is adjudicated. Any effort to protect neural privacy is encouraging, but more robust legislation is needed to fully protect civil liberties.
In the meantime, consumers can take greater personal responsibility for their neural data by fully understanding what is collected, who can access it, and how it is stored and protected before deciding to share it. Beyond the obvious risk of this information falling into the hands of hackers, buried deep within the terms-of-service agreement of the latest wearable device might be a clause granting permission to share your user data with any government agency that simply requests it.
Elizabeth Melton
Elizabeth Melton is a founding member of Banish Big Brother, focusing on education and strategy against government surveillance—a passion ignited after attending a shocking local smart city meeting. She is also the founder of The Gray Matter Project, producing documentaries on critical issues that deserve greater attention.
