The MIND Act: Balancing Innovation and Privacy in Neurotechnology
Yesterday, three US senators announced that they will soon introduce a novel bill in Congress that, if passed, would set in motion a process to address concerns about the rapid advancement of neurotechnologies that can “read and write” to the human mind. Senators Chuck Schumer, Maria Cantwell and Ed Markey plan to introduce the “Management of Individuals’ Neural Data Act of 2025” (MIND Act), which would apply to both implanted brain-computer interfaces (BCIs) and wearable neurotech, such as headbands, earbuds, helmets and wristbands that detect activity from the central or peripheral nervous system.
In many cases, neural data alone is benign and is not used for any purpose that would raise concern. Indeed, neurotechnologies are often used to genuinely help people, including by helping paralyzed people control exoskeletons and helping people who cannot speak to communicate through a BCI. However, neural data can also be used to infer sensitive personal information about a person, such as their feelings about something, whether they are paying attention and, in some research studies, even their inner speech.
These technologies can not only detect brain activity, but also stimulate the brain, and therefore influence it. Brain stimulation is used, for example, to treat depression. But it theoretically could also be used to cause a person to feel an emotion they would not otherwise have felt. In the 1960s, Spanish researcher Dr. José Delgado shocked the research community by using a remote control to stimulate the brain of a bull, causing it to become docile mid-charge as it ran toward Delgado, who was holding a red cape. As soon as Delgado stopped the stimulation, the bull resumed its charge.1 One can imagine how a technology like that could be used in warfare, sports or a host of other contexts.
BCIs promise enormous benefit to those who need them. By recording and stimulating the brain in a targeted manner, BCIs can drastically improve the lives of people affected by injuries and ailments. A man rendered quadriplegic after a car accident was fitted with a set of electrode arrays that could read neural impulses; he can now use a computer, play video games and manipulate a robotic arm.2 A stroke victim who was left paralyzed and unable to talk could once again speak and make facial expressions through a virtual avatar, thanks to an implanted BCI that rapidly decoded her neural data.3 A person with a paralyzed hand achieved typing speeds of 90 characters per minute, thanks to a BCI that decoded their thoughts as they imagined writing by hand.4 These are just a few examples of the transformative potential of BCIs to restore communication, movement and independence.
Based on a careful reading of the MIND Act and accompanying press release, the senators want the Federal Trade Commission (FTC) to explore regulating the use cases that are concerning, not the many beneficial uses. The MIND Act references several specific concerns that the senators want the FTC to explore. These include mind and behavior manipulation, monetization of neural data, neuromarketing, erosion of personal autonomy, discrimination and exploitation, surveillance and access to the minds of US citizens by foreign actors. They are also concerned about misuse of neural data in the employment, healthcare, financial services, housing and education industries, and in law enforcement and the criminal justice system. They specifically ask the FTC to identify any gaps in the protection of children and teens.
The senators also want the FTC to analyze potential security risks associated with neurotechnology. Without cybersecurity measures in place, ultra-sensitive neural data could be compromised and accessed by unauthorized parties and threat actors. Implanted BCI hardware could be directed by a threat actor to stimulate the brain in undesired ways or to trigger unintended physical movements or emotions in the patient. A cyberattack on an implant could render it inoperable, eliminating its benefit to the patient. To address these concerns, lessons can be learned from other kinds of internet-connected devices that have been on the market for years.
- Software updates can be checked for integrity when they are downloaded, when they are transferred from the external device to the implant, and when they are installed on the BCI device (a simple illustration follows this list). Patients can retain the ability to roll back a software update to a prior version if something goes wrong.
- All connections to and from the implanted device can be authenticated with a secure login process, preferably one using multifactor authentication. Patients can be enabled to reset or block logins and previously trusted device connections, as well as to switch off the implanted device’s wireless connectivity when it is not in use.
- Technical safeguards, such as encryption, can be put in place to protect data stored, processed and transmitted by BCI implants.
- Off-device artificial intelligence (AI) used to process data collected by the BCI can be trained to withstand – and to detect – adversarial AI inputs to the BCI device.5
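To make the first safeguard on this list more concrete, below is a minimal sketch, in Python, of how an update file might be checked for integrity and authenticity before being staged for an implant. It is an illustration only: the payload, digest and key names are hypothetical, it relies on the third-party cryptography package, and a real vendor’s pipeline would repeat checks like this at each of the transfer points described above.

```python
# Minimal sketch of integrity/authenticity checking for a BCI software update.
# All names here are hypothetical placeholders; a real vendor pipeline would
# repeat checks like this at download, transfer-to-implant and install time.

import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def verify_update(payload: bytes, expected_sha256: str, signature: bytes,
                  vendor_public_key: Ed25519PublicKey) -> bool:
    """Accept an update only if it matches the published digest AND the vendor's signature."""
    # 1. Integrity: the bytes received match the digest published with the release.
    if hashlib.sha256(payload).hexdigest() != expected_sha256:
        return False
    # 2. Authenticity: the bytes were signed with the vendor's private key.
    try:
        vendor_public_key.verify(signature, payload)
    except InvalidSignature:
        return False
    return True


if __name__ == "__main__":
    # Simulate the vendor side: sign a release and publish its digest.
    vendor_key = Ed25519PrivateKey.generate()
    update = b"firmware image v2.1 (placeholder bytes)"
    published_digest = hashlib.sha256(update).hexdigest()
    signature = vendor_key.sign(update)

    # Device/companion-app side: verify before staging the update for the implant.
    ok = verify_update(update, published_digest, signature, vendor_key.public_key())
    tampered = verify_update(update + b"!", published_digest, signature, vendor_key.public_key())
    print("genuine update accepted:", ok)         # True
    print("tampered update accepted:", tampered)  # False
```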
Throughout the MIND Act, it is clear the senators recognize the many benefits that neurotech is being developed to deliver. They ask the FTC to weigh those benefits against the risks, and to determine ways the risks can be mitigated while still realizing the many groundbreaking benefits of neurotechnologies, which can detect epileptic seizures, enable a person with ALS to communicate and help a paralyzed person use their limbs.
While these applications have benefits, they can also be used in ways that an individual might not want. That is what the MIND Act is asking the FTC to study: What harms could this technology be used for, and is there already a sufficient legal regime in the US to address those harms?
The MIND Act also contemplates, without giving specific examples, that some uses of neurotech might be suitable to be prohibited, regardless of individual consent, because their harms outweigh their benefits.
There are currently four state laws in the US that expressly protect neural data,6 while the federal health privacy law, the Health Insurance Portability and Accountability Act (HIPAA), and federal clinical trial regulations protect neural data only in narrow circumstances.7 But there is no comprehensive legal regime to protect the sensitive personal information that can be collected from our brains using neurotechnologies.
The MIND Act would require the FTC to spend one year conferring with stakeholders to explore whether the existing legal regime adequately addresses neurotechnology, what gaps exist in current laws, and what additional protections should be put in place to protect individuals from uses of neurotechnology that run counter to their interests. At the end of the year, the FTC is to report its findings to Congress and the public and repeat the study annually. The MIND Act would allocate the FTC $10 million for this work.
Among the many noteworthy aspects of the MIND Act, the law would apply not only to data collected from the nervous system, but also to other data that can be used to infer, predict or reveal cognitive, emotional or psychological states or neurological conditions. This could include heart rate variability, eye movement, voice analysis, facial expressions and sleep patterns.
Also, the MIND Act is notably collaborative – it requires the FTC to confer with a wide array of stakeholders as it conducts its yearlong study, including relevant federal agencies, the private sector, academia, civil society, consumer advocacy organizations, labor organizations, patient advocacy organizations and clinical researchers.
An important thing to remember about the MIND Act is that even if it is passed by Congress, it will not require businesses or researchers to do anything. It will merely require the FTC to conduct a study on the topic and report the results to Congress and the public.
The senators hope the MIND Act will incentivize companies to self-regulate – that is, “to help shape responsible standards so innovation can thrive safely.” In the current legal climate, businesses in this space would benefit from adopting self-regulatory standards now, even in the absence of laws. Doing so may not stave off legislation, but it would likely shape the legislation that ultimately emerges.
To encourage neurotech businesses to self-regulate, the MIND Act asks the FTC to explore several financial incentives, such as research and development tax credits, financial support and expedited regulatory pathways for product approvals.
And finally, the MIND Act recognizes the extraordinary risk that today’s experimental BCI participants are taking to help businesses develop this technology for the benefit of future victims of ALS, stroke, paralysis and other ailments. It asks the FTC to explore policies that would provide long-term support for users of BCIs so they can continue to enjoy the benefits of their implants after a study has concluded. Such policies include interoperability standards between various BCI technologies and post-trial BCI maintenance.
One hundred eighty days after the FTC’s report to Congress, the MIND Act would require the director of the Office of Science and Technology Policy, in consultation with the FTC and the Office of Management and Budget (OMB), to develop guidance on federal agencies’ use of neurotechnologies. Sixty days later, the OMB would make that guidance binding on all federal agencies.
As the MIND Act progresses through the legislative process, it is crucial for all stakeholders to stay informed and engaged. That includes neurotech companies, researchers, clinicians and individuals who use, and in some cases rely on, neurotech. We will continue to monitor developments and provide updates on how the MIND Act may impact the neurotech industry.
For a one-stop shop of written, video and audio resources about this important topic, see Cooley’s neural privacy resource center.
Email your thoughts and feedback on this important topic to:
Cooley associate Nate Kim also contributed to this alert.
Notes
- Timothy C. Marzullo, “The Missing Manuscript of Dr. José Delgado’s Radio Controlled Bulls,” Journal of Undergraduate Neuroscience Education, June 15, 2017.
- Emily Mullin, “This Man Set the Record for Wearing a Brain-Computer Interface,” Science, August 17, 2022.
- Pam Belluck, “A Stroke Stole Her Ability to Speak at 30. A.I. Is Helping to Restore It Years Later,” The New York Times, August 23, 2023; S. L. Metzger, et al., “A high-performance neuroprosthesis for speech decoding and avatar control,” Nature 620, 1037–1046 (2023).
- F. R. Willett, et al., “High-performance brain-to-text communication via handwriting,” Nature 593, 249–254 (2021).
- Tyler Schroder, Renee Sirbu, Sohee Park, Jessica Morley, Sam Street and Luciano Floridi, “Cyber Risks to Next-Gen Brain-Computer Interfaces: Analysis and Recommendations,” Neuroethics, July 14, 2025.
- These states are California, Colorado, Connecticut and Montana. Kristen Mathews, “Comparing New Neural Data Privacy Laws In 4 States,” Law360, June 27, 2025.
- Suzanne Smalley, “As scientists show they can read inner speech, brain implant ‘pioneers’ fight for neural data privacy, access rights,” The Record, September 22, 2025.
This content is provided for general informational purposes only, and your access or use of the content does not create an attorney-client relationship between you or your organization and Cooley LLP, Cooley (UK) LLP, or any other affiliated practice or entity (collectively referred to as "Cooley"). By accessing this content, you agree that the information provided does not constitute legal or other professional advice. This content is not a substitute for obtaining legal advice from a qualified attorney licensed in your jurisdiction, and you should not act or refrain from acting based on this content. This content may be changed without notice. It is not guaranteed to be complete, correct or up to date, and it may not reflect the most current legal developments. Prior results do not guarantee a similar outcome. Do not send any confidential information to Cooley, as we do not have any duty to keep any information you provide to us confidential. When advising companies, our attorney-client relationship is with the company, not with any individual. This content may have been generated with the assistance of artificial intelligence (AI) in accordance with our AI Principles, may be considered Attorney Advertising and is subject to our legal notices.