
Are You Sharing Your Brain With Apple? 

Advancements in technology continue to simplify our lives, but with them comes a growing sentiment that our private information has become too accessible. Look around any waiting room or bus stop: how many people are wearing headphones? An estimated 62% of Gen-Z in the United States own AirPods. New concerns about data privacy arise every day, and they become more daunting when the information being collected is our brain waves. Apple, the trillion-dollar tech company, recently applied for a patent that would mean exactly that: attaching electroencephalogram (EEG) electrodes to AirPods to collect brain waves. This development raises ethical issues of privacy that may undermine the potential benefits of such a product. 

As outlined in its patent application, Apple plans to use a “processor configured to identify a subset … of active electrodes for sensing a biosignal.” Put simply, this biosignal is the “electrical activity of a brain of a user.” EEGs can gather a significant amount of data from small inner-ear sensors, like those that would appear on AirPods. Our brain cells communicate by generating electrical activity, and an EEG is produced when metal sensors detect these changes in electrical charge. The waves picked up by the sensors produce a graph like the one shown in Figure 11A. The brain waves being measured are each associated with a different kind of brain activity: alpha waves indicate a relaxed or meditative state, beta waves are active during intense mental activity, delta activity is indicative of deep sleep, gamma activity appears when one is engaged in learning, and theta reflects drowsiness as well as creativity. Traditional polygraph tests actually measure physiological signals such as heart rate and skin conductance rather than brain waves, though EEG-based lie detection has also been explored; either way, brain waves can reveal a surprising amount of information about our concentration, awareness, and stress levels. 
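To make the band distinctions above concrete, here is a minimal sketch (not from Apple's patent) that maps a dominant EEG frequency to its conventional band name. The cutoffs used are the commonly cited approximate ranges in hertz; exact boundaries vary slightly across the literature.

```python
def classify_band(freq_hz: float) -> str:
    """Return the conventional EEG band name for a dominant frequency in Hz.

    Boundaries are the commonly cited approximate ranges, not a standard
    from the patent: delta < 4, theta 4-8, alpha 8-12, beta 12-30, gamma > 30.
    """
    if freq_hz < 4:
        return "delta"   # deep sleep
    elif freq_hz < 8:
        return "theta"   # drowsiness, creativity
    elif freq_hz < 12:
        return "alpha"   # relaxed, meditative state
    elif freq_hz < 30:
        return "beta"    # intense mental activity
    else:
        return "gamma"   # engaged learning

print(classify_band(10))  # alpha
print(classify_band(2))   # delta
```

In practice, an EEG device would first estimate the power in each band from the raw signal (e.g., via a Fourier transform) rather than classify a single frequency, but the band labels themselves are the same.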

AirPods equipped with EEG sensors could revolutionize the wearable tech market much as fitness watches and virtual reality (VR) headsets have in recent years. However, consumers may wonder just how useful this new product would be. In their raw form, EEG waves are used for medical or research purposes. For example, the technology could aid in diagnosing patients: epilepsy and other seizure disorders can be diagnosed from EEG results alone. Regular use of the new AirPods, with results properly transferred to doctors, could hypothetically help solidify a diagnosis without unnecessary tests and wait times. Additionally, the abnormal brain activity that EEGs detect is now being used to assist in the diagnosis and study of degenerative diseases, including Alzheimer’s. 

Furthermore, generating EEGs while the AirPods user enjoys some screen time could be a great way to gather accurate information about attention and how it actively affects our brains. These methods are already being used in some attention-state studies. A common concern in research is that putting people in a lab setting alters the results, since a lab is not a natural environment: being hooked up to machines and asked questions is not the same as leisurely scrolling on your phone. Getting EEG results from natural settings could give us much more accurate representations of brain activity, and researchers are already using in-ear EEGs to this end. For example, researchers can now offer sleep-study patients a more comfortable option using in-ear EEGs, as opposed to current methods that require sensors attached to the scalp and face. Given the usefulness and versatility of in-ear EEGs, having this technology accessible in AirPods could be advantageous. 

While accessible wearable technology has benefits, it also carries the potential for abuse. There is no way to know how Apple intends to use these brain waves, and the largest concern is customer privacy. While some states have laws that protect biometric data, there is no federal regulation. Illinois, notably, has passed the Biometric Information Privacy Act, which sets specific guidelines for businesses on the security, storage, use, and destruction of biometric data. The United States Senate introduced a national version in 2020, but no federal law currently regulates this kind of data collection. 

To explore the issue of privacy, let us return to polygraph tests. Polygraphs are highly controversial in the field of criminal justice, with ongoing debate over whether one should be held accountable for information gathered from lie-detector tests. Though a person accused of a crime cannot be convicted on polygraph results alone, there are arguments that polygraphs violate the right against self-incrimination. Polygraph tests were invented in the 1920s, yet there still are no extensive policies in place protecting the biometric data they produce. Given the state of polygraph policy, as well as international biodata leaks, there could be mass concern globally over Apple’s access to biometric data. Biometric data was recently leaked in the UK: Suprema, a company whose systems handle facial recognition and fingerprints, left its data unprotected, putting information from 1.5 million locations worldwide at stake. Not only has biometric information been stolen internationally, Apple itself has had leaks of facial-recognition data within the last year. Brain waves would be just as vulnerable to these types of data breaches. 

Though all of these disadvantages are hypothetical, it is important to call attention to the potential threat posed if Apple gains access to such sensitive data. As described above in the context of privacy acts, federal protection of biometric data only goes so far. Companies like Facebook and TikTok, among many others, have been embroiled in privacy scandals for mishandling consumer data. If they cannot protect our browsing history from abusive third parties, should we entrust them with our EEG data? Since the information gathered from brain waves is arguably more personal and sensitive than the data at issue in the Facebook and TikTok disputes, one would expect Apple to protect the information being gathered. The ethical concern, however, is rooted in the fact that this protection is not federally guaranteed to thousands of consumers. We must therefore be vigilant about data protection if Apple’s technology is to help people and make expensive testing more accessible. This development could mean great advances, or harmful distrust in corporations worldwide. Ask yourself, before investing in new wearable tech: are you willing to share your brain waves?