Former CDT intern Sydney Brinker contributed research to this article.
Imagine being forced to disclose sensitive health and personal data in order to access basic activities and services you need to fully participate in society. Then imagine having to disclose that information again every time a new service you want to use moves online. Unfortunately, this scenario is exactly what many people with disabilities experience. Not only is it essentially impossible for anyone to understand what happens to their online data, but people with disabilities in particular are often unable to protect information about their health conditions, because they are forced to choose between accessing the services and technologies they need (standardized tests, rideshares, assistive apps, etc.) and keeping their information private. Of course, this “choice” is hardly a choice at all.
People with disabilities may willingly disclose sensitive and private health information in order to obtain accommodations in services such as ride-sharing or standardized testing. They may also inadvertently reveal their disability status simply by using assistive or adaptive technology. A person with a disability’s “choice” is to disclose that information, accept an inaccessible version of the service, or forgo the service entirely, which undermines that person’s ability to lead a full and independent life. Not only do people with disabilities have little control over whether to share this information in the first place, but they may also have little knowledge of how that data is processed, stored, and shared after the initial disclosure. For these reasons, disability rights and disability justice advocates must prioritize personal and digital privacy as a key issue and organize around solutions.
There are ways to protect the data and personal privacy of people with disabilities, just as there are for everyone else. Data minimization (the idea that companies should collect only the data necessary to provide the essential functionality of a service) and purpose limitation (the idea that companies should use data only for the original purpose for which it was collected) can allow people with disabilities to receive the accommodations and services they need while better protecting their privacy. The inclusion of these protections is one of the many reasons my organization, the Center for Democracy and Technology (CDT), supported the American Data Privacy and Protection Act of 2022 (ADPPA), a comprehensive federal privacy bill that would protect the privacy of “health information,” including disability information.
There are a variety of circumstances in which individuals with disabilities must disclose health or disability information in order to enjoy certain rights and benefits. For example, students with disabilities may have to disclose their disability status (and other sensitive information) to third parties when requesting accommodations for exams such as the ACT, SAT, or AP exams. In theory, some students could choose not to ask for accommodations on these high-stakes exams, but that would jeopardize their educational futures, and for many students with disabilities it is simply not an option. Instead, students with disabilities are forced to disclose their disability status to both schools and separate testing companies, yet they have no say in what happens to that information afterward, and too often it is used in discriminatory ways. In fact, a group of affected students sued ACT in 2019, alleging that prospective students with disabilities were excluded from college recruitment efforts because the company disclosed students’ disability status to colleges to which they applied and sold “personally identifiable data of students, including detailed disability data” to those colleges. ACT settled in 2020 without admitting fault, but agreed to pay $16 million to affected individuals and to stop flagging students’ accommodations on score reports sent to colleges.
People with disabilities may also face pressure to disclose their disability when using transportation. For example, Uber allows people with disabilities to file a complaint if a driver refuses a ride because of a service animal or assistive device, and provides wheelchair-accessible vehicles upon request (if available) through its “WAV” feature. Uber also reimburses people with disabilities for wait-time fees related to their disability and lets them apply for a permanent waiver of those fees by submitting proof of disability. While these are welcome policies and features that make the app more accessible, questions remain about how data disclosed through their use will be handled. For the proof of disability, Uber has stated that it will not use the information for any purpose other than providing fee waivers. For other information, however, such as data generated when users request the WAV service, the privacy policy is not clear about what will be done. The policy informs users that Uber collects some users’ declared and inferred demographic data, transaction data, communication data, and so on, and it is unclear which category disability-related data falls under. For example, disability data could be part of the demographic data inferred from use of the WAV service. Accessible transportation requests could also be classified as transaction data (a request for a wheelchair-accessible vehicle could be considered a “type of service requested”) or communications data (particularly if a rider communicates with a driver or Uber support about a disability-related request or accommodation).
If Uber collects disability-related data, that data is subject to Uber’s privacy policy. The policy generally states that Uber shares and discloses some data (depending on which category the data falls into) with several third parties, including “social media advertising services, ad networks, third-party data providers, and other service providers” to “reach and better understand our users, and measure the effectiveness of our advertising.” Uber also shares some data “with social media companies, such as Facebook and TikTok, in connection with your use of tools on Uber’s apps and websites.” Data shared with these platforms through Uber is subject to each company’s own privacy policy and may be shared further still, for example to facilitate sales or advertising. Because the policy is vague, people with disabilities have no way of knowing whether their sensitive data is being shared more broadly than originally intended, or whether it is being used to facilitate third-party advertising. This could be a serious privacy concern for people with disabilities.
Individuals with disabilities also face privacy concerns when using assistive technologies. In these cases, rather than proactively disclosing information in order to receive an accessible version of a service, the use of the technology itself may unintentionally reveal a person’s disability status or related data.
For example, “smart” medical wearables like hearing aids could pose a significant privacy threat to users with disabilities. One study found that hearing aid wearers are not provided with “information about the full extent of data collection, transmission, storage, and potential purposes,” and that some data, including the wearer’s location, is shared with third parties. The study noted that the companies that manufacture these devices are often not subject to existing privacy laws, such as the Health Insurance Portability and Accountability Act (HIPAA), so users of these devices have limited protections. Similarly, apps for blind and low-vision users, such as Be My Eyes (which allows users to share video and audio with sighted volunteers for identification), inform users that they collect and share users’ personal information, use some of it “to serve and inform targeted advertising,” and store it “as part of the normal functionality of the app,” without mentioning any automatic or standardized deletion. Be My Eyes discourages users from submitting personal information via the app, but it is unclear what safeguards, if any, the company has in place for situations where disclosure is unavoidable, or whether that information is ultimately stored or deleted.
While giving people with disabilities control over their data may help to some extent, the primary responsibility for keeping their information private should not rest solely on the individual. Instead, the companies that collect, monetize, and sell this data should be responsible for protecting it, and lawmakers and companies should treat protecting disability-related data as a priority in protecting individuals online, especially those from marginalized communities. Companies and lawmakers should also think specifically about the harms that people with disabilities face when their sensitive information is disclosed, and prioritize federal privacy legislation that protects these individuals, including but not limited to bills with strong data minimization and purpose limitation provisions. Disability rights and disability justice advocates should likewise prioritize these kinds of solutions.
People with disabilities should not have to choose between access to technology that helps them live fulfilling and independent lives and protecting their personal information. And people with disabilities should be able to benefit from technology without worrying that their health-related data will be used for malicious or unknown purposes. It is critical that leaders in the disability community embrace internet privacy as a disability rights and justice issue and advocate for changes in laws, regulations, and industry policies that help protect the data of all individuals, including those with disabilities.