The Intersection of Technology, Disability Rights, and Worker Rights
By Jennifer M. LaGrow, Santiago F. Orosco, Kehsi Iman Wilson (New Disabled South);
Monika Krol, Lydia X. Z. Brown, Ramonia Rochester (National Disability Institute)
Executive Summary
People with disabilities often face more barriers when looking for work or using workplace technology.
Today, many employers use new tools to manage workers and make decisions. These tools include:
● Artificial intelligence (AI): Computer programs that try to act like humans by learning and making decisions.
● Surveillance technology: Tools used to track what workers are doing, like video cameras or software that records your keystrokes.
● Robotics: Machines that do tasks usually done by people.
These tools can help make jobs more accessible. But they can also cause problems if they are not built or used fairly. For example, they can make it harder for disabled workers to get hired, or they can collect private data without permission.
The National Disability Institute and New Disabled South studied how these technologies affect disabled workers. Their work included:
● Interviews
● Focus groups
● A national survey
● A review of policies and laws
● A meeting with experts and community members
They found several problems:
● Employers use tools that haven’t been tested well.
● The tools often aren’t made with disabled people in mind.
● Automation, like robotics, may take away jobs.
● Some tools collect personal data that shouldn’t be shared.
● There are not enough ways for workers to adjust or opt out of using these tools.
The study also looked at how these problems get worse for people who face more than one kind of unfair treatment—like being both disabled and part of a racial minority.
The report recommends stronger protections for workers and better laws. It also says companies and tech developers should include disabled people when they design tools. Everyone involved in workplace technology—employers, lawmakers, and researchers—should make sure the tools help people instead of hurting them.
Key Research Takeaways
● Technology isn’t good or bad by itself. It depends on how people choose to use it.
● Disabled workers can be affected unfairly when employers use technology without thinking about accessibility or fairness.
● There is no clear or agreed-upon definition of AI. This makes it harder to create rules or judge how it’s being used.
● Tech companies aren’t required to follow shared rules or values. They don’t have to think about ethics or fairness unless the law says so.
● All new technology should be built with disabled people included from the start. Their input is key to making sure tools work for everyone.
● AI tools that are made well can help. For example, they can help with communication, time management, or writing. But badly designed tools can make things worse.
● Surveillance tools can increase pressure at work. These tools often watch disabled workers more closely than others. This can lead to stress or injury.
● Robotics can replace people. In places with few job options, this can leave disabled workers without any employment.
● Workplace adjustments help more than just disabled workers. These are changes to help someone do their job better. When offered to everyone, they can improve morale and reduce turnover.
● Employers should clearly explain what adjustments are available. Workers should know what they can ask for and how.
● Local knowledge matters. The needs of workers change depending on where they live and what kind of community they’re part of.
● Stronger coalitions make change possible. To support disabled workers, different groups—like labor unions, disability advocates, and tech experts—need to work together.
● There isn’t enough research yet. These issues are still being left out of most laws, company policies, and public conversations.
Introduction
In the last 10 years, new technology has changed how people live and work. Some experts call this the "fourth industrial revolution." This means jobs now use tools like:
● Artificial Intelligence (AI): Computer programs that try to think and make decisions like humans
● Autonomous technology: Machines or systems that do tasks without needing people to control them
These tools are now used in many jobs. But not everyone can access or use them the same way.
More than 4 out of 10 disabled people in the U.S. say they have a hard time using technology. Only 1 in 4 say they have high-speed internet. That makes it harder to get and keep jobs—especially as more work depends on digital tools.
Even though workplace technology could help, most research, policy, and company decisions don’t focus on disabled workers. Even when new policies talk about fairness or civil rights, disability is often left out. Many conversations focus on things like the economy or national security—but they don’t think about how technology affects disabled people. This makes it harder to create rules that protect everyone.
We also know:
● In 2022, only 21 out of 100 disabled people had a job
● That same year, 65 out of 100 nondisabled people were working
● Long COVID and other health conditions have caused a big increase in the number of disabled people
There are many reasons why new technology can make things harder:
● Tools might not work with screen readers or assistive devices
● Jobs may not provide options for adjustments
● AI and automation may replace jobs that disabled people rely on
Project Overview
The National Disability Institute and New Disabled South worked together on a year-long project. They looked at how new workplace technology affects disabled workers.
The team:
● Read past research
● Talked to workers
● Held focus groups
● Ran a national survey
● Met with experts and advocates
● Studied laws and company policies
They wanted to learn what’s working, what’s not, and how to make things better.
Most research looks at hiring tools—like how computers scan job applications. But this project looked at what happens after someone gets the job. That’s where the real problems often begin.
They focused on tools used on the job, like AI, surveillance systems, and tracking software.
The research also followed two important values:
● Disability rights: Making policies more fair and inclusive
● Disability justice: Changing how people think about disability, power, and value
A lot of policy talks about tech in terms of money or security. This study focused on fairness, safety, and the rights of disabled workers.
The team made sure disabled people were involved in every part of the project.
They focused on jobs where many disabled people work, like:
● Warehouses
● Delivery jobs
● Retail stores
● Manufacturing
These jobs are often low-paid and high-risk. Workers are monitored closely, pushed to move faster, and not always given the breaks or support they need. This puts disabled workers at higher risk for injury, burnout, and job loss.
The study also looked at how things are harder for people who are both disabled and part of other communities that are treated unfairly—for example, people who are Black, Indigenous, or part of the lesbian, gay, bisexual, transgender, queer or questioning, intersex, and asexual (LGBTQIA+) community.
To make this project stronger, the team also invited experts in labor rights, disability justice, technology, and public policy to guide the research and share advice.
Field Research Summary Analysis
This part of the research looked at how disabled workers feel about using technology at work. The team focused on people who work in:
● Warehouses
● Delivery jobs
● Retail stores
● Manufacturing
They asked questions about safety, stress, job security, and how it feels to be at work when technology is used to track or manage employees.
They also looked at the experiences of people with more than one identity—such as workers who are both disabled and Black, or disabled and low-income. These workers often face even more unfair treatment.
Workers shared that:
● They feel like they are being watched all the time
● They are pushed to move faster than is safe
● They work in buildings that are too hot and don’t have enough places to rest
● They don’t feel safe asking for help
● They’re afraid to report injuries because they might get in trouble or lose their job
One person said working under these rules made them feel hopeless. Another said it caused them pain that still hasn’t gone away.
The research used a broad definition of disability. This includes physical disabilities, mental health conditions, learning disabilities, and conditions that affect movement, speech, memory, or self-care.
The report uses both person-first (like “person with a disability”) and identity-first (like “disabled person”) language to respect different preferences.
This part of the study used both disability rights and disability justice frameworks. That means it focused on fairness in policy and on changing how people think about disability, power, and value. The team made sure disabled people were part of every step.
Technology
This section looks at how different types of technology affect disabled workers.
Sometimes, technology helps. It can:
● Make it easier to work from home
● Provide automatic captions during meetings
● Read text out loud for people who are blind or have low vision
● Help people type using speech or by moving their eyes
These tools can make jobs more accessible. For disabled workers, technology doesn’t just make things easier—it makes things possible. But that only happens when the tools are designed with access in mind.
But technology can also cause problems. For example:
● Many products are not tested with disabled people
● Designers often don’t know how to make things accessible
● Some tools do not follow laws like the Americans with Disabilities Act (ADA) or digital access rules like the Web Content Accessibility Guidelines (WCAG)
New rules about web and app access from the U.S. government mostly apply to state and local agencies. But even private companies can learn from them when designing tools.
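To make this concrete, here is a minimal sketch of one automated WCAG-style check: whether every image has the alt text a screen reader needs. It uses only Python's standard library, and the sample HTML is invented. Real accessibility audits cover far more than this single rule.
```python
# Minimal sketch of one WCAG-style check (missing alt text on images),
# using only the standard library. The sample HTML below is invented.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        # An <img> without an alt attribute is invisible to a screen reader.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1

checker = AltTextChecker()
checker.feed('<img src="chart.png"><img src="logo.png" alt="Company logo">')
print(checker.missing)  # 1: one image a screen reader cannot describe
```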
Some people believe technology can fix everything. But that’s not true. Tools only help when they are designed the right way—and with input from the people who will use them.
Many workers said the tools they use were confusing, stressful, or made their jobs harder. Some also worried that companies use new tools to replace people or control them more, not to make things better.
The people in this study said developers, employers, and lawmakers need to do better. They should:
● Include disabled people in the design process
● Train teams on accessibility
● Build tools that are flexible, safe, and fair
Artificial Intelligence (AI)
Many companies use AI tools to make decisions about workers. These tools can be helpful when built well—but they can also make things worse, especially for disabled workers.
How AI Shows Up At Work
Some companies use AI to:
● Choose who gets interviewed
● Track how fast people work
● Decide who gets trained or promoted
● Watch what people do at their desks or stations
Workers may not know these tools are being used.
They also may not get a say in how they’re judged.
When AI Helps
Some disabled workers said AI helped them:
● Stay organized at work
● Write emails or documents
● Use captions during video calls
● Keep track of time and tasks
● Do things more easily with speech or movement tools
These tools can be helpful when they are made to include all kinds of users.
When AI Causes Problems
Many tools are built without asking disabled people what they need.
That leads to problems like:
● The tool doesn’t work with assistive technology
● It’s hard to understand or use
● It makes decisions that hurt someone’s chances at work
Some workers said they lost jobs or missed out on chances because of decisions made by AI—without any warning or explanation. Some AI tools can give wrong or made-up answers that sound correct. This can cause real problems when workers are being judged by those tools.
Problems With Hiring Tools
AI tools used in hiring often make unfair decisions.
For example:
● They might ignore someone’s job application if there’s a gap in work history
● They might sort people out if they don’t have a driver’s license—even if the job doesn’t require one
● They might treat someone as a problem for needing more breaks, which could be related to a disability
These tools don’t look at the full person.
They just follow patterns based on old data—and that can hurt people who already face unfair treatment.
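Here is a minimal, hypothetical sketch of how pattern-based screening rules like these can encode discrimination. The rules, field names, and thresholds below are invented for illustration; real vendor tools are more complex but can fail in the same ways.
```python
# Hypothetical sketch of a naive automated screener. Rules and field
# names are invented; nothing in either rule measures job skills.

def screen_applicant(app: dict) -> bool:
    """Return True if the applicant advances to a human reviewer."""
    # Rule 1: reject any gap in work history longer than six months.
    # This screens out people who took time off for medical care,
    # even though the gap says nothing about how well they can work.
    if app["longest_gap_months"] > 6:
        return False
    # Rule 2: require a driver's license, even when the job involves
    # no driving. This excludes many people whose disabilities
    # prevent them from driving.
    if not app["has_drivers_license"]:
        return False
    return True

applicant = {"longest_gap_months": 9, "has_drivers_license": False}
print(screen_applicant(applicant))  # False: rejected on criteria
                                    # unrelated to the actual job
```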
Managers Left Out
In some cases, even the managers don’t know how decisions were made.
One worker said they were fired after a mistake, but no one had ever told them anything was wrong.
The AI system had been keeping track, but no person had ever spoken to them.
That kind of system makes workers feel unsafe. They don’t know what’s being watched—or how to fix something before it becomes a problem.
Data And Privacy Worries
AI tools often watch workers closely.
They might record:
● How long someone is sitting or standing
● What they type or say
● How often they move away from their workstation
● Personal health information, even if the worker didn’t agree to share it
Workers said they don’t know where this information goes or how long it’s kept.
Some were afraid the data might be shared or used in ways that could hurt them.
Fear Of Being Replaced
Many workers worry that AI will take over their jobs.
This fear was especially strong in warehouse, retail, delivery, and service jobs—where disabled people are often employed.
Some said the company already started using machines or apps to do work that people used to do.
Others said managers seemed to stop listening—like the AI’s opinion mattered more than theirs.
Mixed Opinions
Some workers said AI made work easier.
Others said it made things more stressful.
Most agreed the tools are being added too fast, without enough testing or feedback. In our study, over half of disabled workers said AI helped with their health, mental health, or job security. But many others said it made things worse—or even got them fired.
What Workers Want To See Change
Disabled workers said:
● Companies should ask disabled people how these tools should work
● Workers should have ways to adjust the tools or use different options
● Managers should stay involved and not leave everything to machines
● Workers should be told what’s being tracked, and be allowed to say no
● Laws should protect people from tools that share or misuse private information
Surveillance Technology
Many companies use tools to watch workers while they’re on the job. These tools are called surveillance technology. They track what people do, how long they do it, and when they stop working.
This section explains how that kind of tracking affects disabled workers—and why it causes stress, injury, and fear.
What Is Surveillance Technology?
Surveillance tools follow workers during the day.
They may:
● Record video
● Track where you go while you work
● Use badges or wristbands to collect data
● Measure how fast you work
● Watch when you step away from your work area
These tools are usually used on people just starting their jobs or working in jobs that involve standing, lifting, or moving a lot—not on managers.
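As one hypothetical illustration, here is a sketch of how a badge-based tracker might score time away from a workstation. The threshold and field names are invented; the point is that the rule records only duration, never context.
```python
# Hypothetical sketch of badge-based "time away" tracking. The limit
# and names are invented; real systems vary by vendor.
from datetime import datetime, timedelta

AWAY_LIMIT = timedelta(minutes=6)  # arbitrary threshold set by the employer

def flag_time_away(badge_out: datetime, badge_in: datetime) -> bool:
    """Flag the worker if they were away from their station too long."""
    return (badge_in - badge_out) > AWAY_LIMIT

# A worker waits in line for the only bathroom serving hundreds of
# people. The system records the duration, not the reason.
left_station = datetime(2024, 5, 1, 10, 0)
returned = datetime(2024, 5, 1, 10, 9)
print(flag_time_away(left_station, returned))  # True: marked against the worker
```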
What Happens to Workers
Workers said they felt like they were being watched all day. Being watched made it harder to speak up or work together. They were afraid to ask for better conditions or support each other, because it might get them in trouble. Even small actions—like using the bathroom or stopping for a few minutes—got recorded.
One person said when they took a bathroom break, the system marked their name. It didn’t matter that there was a line, or only one bathroom for hundreds of workers.
Another said their badge recorded how long they were away from their station—even if they didn’t feel well or needed time to recover.
Why It’s Hard on Disabled Workers
Some workers need more breaks, or need to move slower, to take care of their body or health.
But the tools don’t see that. They treat it like the person isn’t doing their job.
This can lead to:
● Getting in trouble
● Being passed over for a raise
● Losing hours
● Being treated like they’re not trying hard enough
Many workers said the pressure made them feel anxious, tired, or afraid to speak up. In our study, 6 out of 10 workers said they’d been injured on the job—and most said the injury was linked to technology. For many, the pain didn’t go away.
Workers Don’t Always Know
Some people didn’t even know they were being tracked.
They only found out when they got in trouble or were told they weren’t doing well.
One person said, “We feel it. We just don’t know when or how it’s being used.”
Culture and Background Matter
Not everyone feels the same about being watched.
In some communities or countries, it may feel normal. In others, it may feel unsafe or disrespectful.
Many workers in the study were immigrants. Some said they felt extra pressure to follow the rules because they didn’t want to lose their jobs or get in trouble.
Safety Isn’t Always the Goal
Companies often say they use these tools to keep people safe.
But many workers said safety only mattered when it didn’t slow down production.
One person said a deaf coworker was told to drive a forklift, but no safety changes were made—even though they couldn’t hear warning signals.
What This Means for Workers
This kind of tracking creates a workplace where:
● People are afraid to ask for help or speak up when something feels wrong
● Supervisors can make decisions based on numbers—not people
● Workers are more likely to get hurt, quit, or burn out
● Trust between workers and management starts to fall apart
There aren’t strong rules yet to protect people from how companies use tracking tools. Government agencies haven’t made clear safety rules about this, and most companies don’t report injuries caused by tech.
What Workers Want to See Change
Workers said surveillance tools should never be used to punish people for taking care of their health.
They said companies should:
● Be honest about what they’re tracking
● Tell workers how the information will be used
● Let workers ask for changes if a tool is causing harm
● Make sure safety comes before speed
Algorithmic Discrimination and Bias
Many companies use computer systems to help make decisions about workers.
These systems follow rules written by people. But when those rules leave out disabled people—or don’t account for real differences—they can cause harm. That’s called algorithmic discrimination.
It means people are treated unfairly because of how the system was built.
What This Looks Like at Work
Sometimes a tool decides:
● Someone isn’t a good fit because they move, speak, or think differently
● A person shouldn’t be hired because they took time off for medical care
● A worker is less productive because they use a screen reader or need more breaks
These decisions may not come from a person—but they still cause the same kind of harm.
This is still discrimination—even if it comes from a computer.
The People Building These Tools Often Don’t Understand Disability
Most of the people who make workplace technology haven’t been trained on disability, accessibility, or fairness.
They may not know what it’s like to use assistive tech, live with chronic pain, or have a brain or body that works differently.
Some said they didn’t even realize disabled workers would use their tools.
Because of that:
● They don’t build tools that can be adjusted to different needs
● They don’t check to see if the tool could hurt someone
● They only test it on people who are not disabled—and then assume it works for everyone
Some companies say their tools are fair for everyone. But that’s not always true. A computer test might say you’re not a good worker just because you take longer to answer questions, or because you think or move differently. These tools are not always built to understand disabled people—and that can lead to unfair decisions.
Who Gets Left Out
These systems are often built using old information that doesn’t include disabled people at all.
So the tools don’t know how to handle people who work differently or need something changed.
Even when disabled people are in the data, the system may treat their needs like mistakes—something to ignore or fix.
This gets even harder for people who are also Black, Indigenous, or part of the LGBTQIA+ community.
The more someone’s life doesn’t match the pattern the system expects, the more likely they are to be treated unfairly—or not counted at all.
What Workers Experience
Workers said these tools:
● Keep people from getting jobs
● Judge people for things that don’t affect how well they work
● Make choices based on old patterns that were never fair in the first place
● Hide unfair decisions behind computer rules that no one can question
This kind of design hurts workers and makes it easier for companies to ignore real problems.
What Needs to Change
Disabled workers said:
● Every tool should be tested with disabled people before it's used at work
● The people building systems need to learn about access and fairness
● No one should be judged by a system that wasn’t made with them in mind
● Workers should always be told how the system works and be able to ask for changes
● Fairness rules should be public—not hidden in secret software or behind company policies
Robotics and Automation
This section explains how robotics—especially automated machines used at work—affect disabled workers. Some workers said these tools helped them. Others said they caused stress, health risks, and job loss.
What Are Robotics at Work?
Robotics and automation are machines that can do tasks without needing a person to control them all the time. These machines are used to:
● Move products
● Pack boxes
● Deliver items around large buildings
● Replace people in jobs that involve lifting, carrying, or moving items
Companies say robotics make work faster, cheaper, and safer. But not everyone agrees.
Fewer Jobs, More Pressure
Some workers said they were afraid of losing their jobs to machines.
They had seen robots take over tasks that used to belong to people—and those jobs didn’t come back.
One person said a single robot could replace two or three workers, and cost less in the long run. Some workers said they weren’t replaced—but they were expected to move as fast as the machines. That made them feel tired, sore, and more likely to get hurt. Even workers who said the machines helped still worried they might be next.
Health and Safety Risks
Some workers said the machines made their jobs more dangerous.
They had to work near robots that moved fast, didn’t stop when they should have, and didn’t give clear signals before shifting direction.
One worker said they almost got hit because the machine rolled up behind them without warning.
Others said they were never trained on how to stay safe around the equipment.
Mixed Feelings About Robots
Some disabled workers said the machines helped.
About half said robotics made the workplace better or helped them feel more stable in their jobs.
But nearly 4 out of 10 still said the machines made their mental health worse.
Even people who said the robots were helpful still worried—about being replaced, being expected to move at machine speed, or being treated like they didn’t matter.
The Problem with How Robotics Are Explained
Workers said companies often talked about the good parts of robotics but didn’t explain the risks.
Some said they were told the robots were there to help—but later found out the machines were replacing jobs or changing the pace of work.
Companies often say robots help disabled workers—but they don’t always ask what disabled people actually need. There isn’t enough research yet to know how these tools really affect people with different types of disabilities.
What Needs to Change
Workers said:
● Companies need to be honest about what robotics are meant to do
● New machines should be tested to make sure they don’t put workers at risk
● Disabled workers should help design how robots are used
● Mental and physical health must be part of safety planning—not just how fast or how much people are expected to work
Health, Safety, and the Work Environment
Many disabled workers said their jobs were unsafe—especially in warehouses and factories.
One big problem was heat. Most buildings didn’t have air conditioning. Some workers said the heat made them feel faint or sick, and some passed out. A few were allowed to cool down in air-conditioned rooms—but that time was often tracked, and workers worried it would count against them. For people with disabilities like diabetes, or those who take medications that affect body temperature, getting too hot can be dangerous. It can lead to headaches, dizziness, confusion, or even passing out.
Some workers said they didn’t feel safe telling their boss when they got hurt. They were afraid they might lose their job or get fewer hours. One person said they were fired after a drug test showed medicine they got at the doctor’s office. Others said it was hard to get help, even when they were in pain.
A few workers talked about joining a group to try to make things better. These groups are sometimes called unions. They help workers speak up together and ask for safer conditions. But many people said that felt too risky. In some places, the rules make it harder to join a union. And some companies try to stop workers from speaking up at all.
Accommodations
An accommodation is a change at work that helps someone with a disability do their job. This can include changes to schedules, equipment, the work area, or job tasks.
A Difficult Choice for Workers
In the United States, a worker must tell their employer they have a disability if they need changes to help them do their job. This is called disclosing a disability. It can be a hard decision: if they don’t say anything, they can’t get help; if they do, they risk being treated unfairly.
Workers said the process for asking for changes is often tiring, slow, and frustrating. Some employers make workers ask again every month, even if the disability is long-term and will not go away. Many workers don’t even know what changes they can ask for or how to ask for them. This makes people feel like they have to constantly prove they are telling the truth, which creates an unwelcoming workplace.
Some companies take so long to respond that workers feel pushed to quit. When changes are approved, they are sometimes generic solutions that don’t fit the worker’s actual needs. People who work only part of the year are often left out completely.
Costs and Misunderstandings
Many employers think making changes for workers will be costly. However, research shows more than half of the changes cost nothing. When there is a cost, it is usually a one-time amount of around $300.
In our national survey, 61% of workers said they had asked for changes at work. Of those, 27% were fully denied, and 33% said their needs were still not met even when the request was approved. One worker said there weren’t enough chairs for everyone who needed one, so sometimes they had to work without one.
Impact on Keeping Workers
Making the process hard or saying no to needed changes can cause employees to leave. One Human Resources (HR) professional — the part of a company that handles hiring and employee needs — said companies don’t want constant turnover, and making changes for employees is key to keeping them.
Workers shared examples of leaving jobs because they couldn’t get the changes they needed. Reasons included needing breaks for diabetes care, schedule changes for medical appointments, or adjustments to equipment that caused pain.
Changing Workplace Culture
There is hope that more leaders with disabilities will help change company culture so workers feel safer asking for what they need. Workplaces designed with accessibility in mind reduce the need for individual requests.
Changes like captions for meetings, speech-to-text software, or curb cuts help everyone, not just workers with disabilities.
Ideas from the Research
● Have a set budget for making changes that help workers with disabilities do their jobs.
● Have one trained person responsible for handling all requests for changes.
● Work with local disability rights groups to better understand what workers may need.
● Make the process for asking for changes simple, clear, and consistent.
● Think of changes in a broad way so they can help all workers, not just those with disabilities.
How Multiple Identities Affect Workers
In many jobs, especially in warehouses, factories, and stores, disabled workers are not treated as people with their own needs. Employers often act as if every disabled worker needs the same things.
If a worker has a disability and is part of another group that is often treated unfairly, their problems can be even bigger. This can include being a woman, a person of color, LGBTQIA+, from a low-income background, or the first in their family to work in a certain industry.
Some technology makes these problems worse. For example, facial recognition systems are computer programs that scan a person’s face to identify them. These systems can make more mistakes with women of color because they are often built and tested using pictures of white men or certain Asian men.
Tools like artificial intelligence (AI) and workplace monitoring systems are usually designed to focus on speed and productivity. They may not take into account a worker’s health, comfort, or personal needs. This can lead to workers being treated as if they do not matter as individuals, instead of being valued for what they bring to the job.
First Nations, Native, and Indigenous Communities
People from First Nations, Native, and Indigenous communities in the United States belong to their own nations. These nations make their own laws. People from these nations are also U.S. citizens.
The term “Indian country” includes many different cultures and languages. In some communities, there is no word for “disability.” People in these communities may think about disability in different ways. Some see it as something bad. Others see it as a strength because of the skills it can bring.
Not everyone has the same access to technology. Some reservations, especially in rural areas, do not have good internet service. Some also do not have training for how to use computers or other technology. This can make it hard for people to work in jobs that use technology.
Tribal nations do not have to follow the Americans with Disabilities Act (ADA) unless they choose to. Even if they do choose to, it can be hard to make the rules work if they do not have enough money or other resources.
Why This Matters
When companies bring in new technology without thinking about all the different groups of people who work there, some workers get left out.
Workplaces and laws need to include and respect people from every background. This helps make sure everyone has the same chance to get and keep a job, no matter where they are from or what their needs are.
Good Practices for Working Together
This section explains how to bring people together to help workers with disabilities. It gives simple steps for working in a fair way.
Why This Matters
Technology is changing very fast. Rules and laws are not keeping up. Workers with disabilities can be left out. If different people work together, they can make sure these workers are treated fairly.
A strong group should include:
● People who help others in their town or city
● People who work for the rights of workers with disabilities
● People who work for the rights of all workers
● Workers with disabilities
● People who know about technology
● Businesses
● People who study work and jobs
● People who help make laws and rules
These people should be included from the start and stay involved until the work is done.
Problems When Working Together
Groups that support workers with disabilities and groups that support all workers have not always worked well together. Sometimes they disagree about rules. Sometimes people are treated unfairly because of their disability. Some bad experiences from the past have not been fixed.
These problems still affect how people work together today. But the two groups need each other to make real change.
In this project, we listened to both groups. We asked what they needed and how this report could help them. When they work together, they can get more done.
Steps for a Strong Group
People in this project suggested these steps:
Clear leader – Someone should guide the group, set goals, and tell members exactly what to do.
Respect time and energy – Some people have limited time or energy. Move at a pace they can keep up with.
Pay people for their work – Pay for their time, skills, and ideas.
Include different people – Bring in people with many types of experience.
Learn from each other – Take time to explain your own view and listen to others.
Use clear and kind words – Use words about disability that are easy to understand. Say what you mean when you talk about disability.
Summary
Groups that have a clear leader, treat people fairly, and include different members can make real change. Working together and sharing resources can help them succeed. More tips are in Section 5 of this report.
Research Limits
This section explains why it was hard to find workers with disabilities to join this study.
Why It Was Hard to Find People
We wanted to hear from workers with disabilities at big companies in warehouses, stores, factories, delivery, and customer service.
It was hard to find people for these reasons:
● Some companies try to stop workers from joining together to ask for better conditions. This makes workers nervous about talking to researchers.
● Workers with disabilities are more likely to lose their job or be punished.
● Talking about problems at work can feel unsafe.
● Many workers do not trust research groups because of bad treatment in the past.
● Some workers do not want to tell others they have a disability because of how people might treat them.
● One worker said it can feel scary to talk about work in a way the company might not like, because companies sometimes punish people for it.
Other Reasons:
● Attitudes toward disability are different in different parts of the country. In some states with fewer worker protections, it was harder to get people to join.
● We only studied workers in large cities. Workers in small towns or rural areas were not included. These workers may face different problems.
What We Did
We worked to build trust with workers and groups who support them. We focused on respect and giving back to the people we spoke with.
What Should Happen Next
There is not much research about technology, disability, and work. To make sure more workers are included in the future, research groups should:
● Build trust with groups and people who already support workers with disabilities.
● Work directly with unions, local leaders, and workers.
Federal Guidance: AI
This section explains how artificial intelligence (AI) is used in hiring and at work, the problems it can cause for workers with disabilities, and what the government says about it.
No Single Federal Law for AI and Disability
As of 2024, the U.S. does not have one main law to protect people from problems caused by AI. Instead, different rules are spread across several areas, like civil rights, consumer law, and job laws. Disability rights protections still apply, but there are gaps.
AI can be unfair to workers with disabilities in ways that are different from how it can be unfair to people based on race, gender, or age. Disabilities can be very different from one person to another. Even people with the same disability may have different needs. This makes it harder to create fair AI systems.
Why AI Can Be Biased
AI works by:
● Collecting data.
● Selecting which data to use.
● Detecting patterns in that data.
If disabled workers are not included in the data, the AI will not learn how to treat them fairly. This can lead to fewer job offers, fewer promotions, and fewer leadership roles for people with disabilities.
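A minimal, hypothetical sketch of those three steps, with invented numbers, shows how a pattern learned from data that excludes disabled workers becomes the "norm" they are then judged against.
```python
# Hypothetical illustration of the three steps above. All numbers
# are invented for this sketch.

# Steps 1-2: collect and select data. Here, tasks per hour, gathered
# only from workers who never needed extra breaks.
training_speeds = [52, 55, 58, 60, 61, 63]

# Step 3: detect a pattern. The system learns a cutoff from the data
# it was given.
learned_cutoff = min(training_speeds)  # "everyone works at least this fast"

# A disabled worker who paces themselves to manage their health, with
# equal or better work quality, falls below the learned cutoff.
disabled_worker_speed = 45
print(disabled_worker_speed < learned_cutoff)  # True: flagged as
# "underperforming" because the training data never included their pace
```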
Government Focus Areas for AI and Disability
Federal agencies have mostly looked at:
● Hiring discrimination caused by AI tools.
● Fairness in data collection.
● Accessibility – making AI tools usable by people with disabilities, or using AI to make other tools accessible (like better captions).
Advocates are concerned about AI in both hiring and on-the-job situations, such as surveillance or preventing workers from organizing. But federal guidance has paid little attention to how AI affects worker health, safety, or rights to organize.
EEOC Guidance: AI in Hiring
In 2022, the Equal Employment Opportunity Commission (EEOC) and the Department of Justice (DOJ) gave guidance on AI hiring tools and the Americans with Disabilities Act (ADA).
Key points:
● AI tools can screen people out unfairly – for example, by rejecting anyone with gaps in their work history, which can affect people who took time off for disability or medical care.
● Some tools are not accessible – such as chatbots that don’t work for people with vision or communication disabilities.
● Tests and games can be biased – personality tests or “cultural fit” games may exclude people with conditions like depression, trauma, or autism, even if those conditions don’t affect job performance.
● Employers are still responsible – Even if a company buys an AI tool from someone else, they are responsible for discrimination caused by that tool.
● Medical information limits – AI tools should not ask for or reveal medical or disability information unless the law allows it.
The EEOC suggested that employers:
● Talk to developers to make sure the tool was designed with disability in mind.
● Avoid tools that don’t measure job skills directly.
● Tell applicants they can ask for accommodations, and explain how to do so.
● Train staff to respond quickly to accommodation requests.
● Give clear, plain language information about how AI is used and what it measures.
They also gave advice to workers on what to do if they need accommodations or believe they were treated unfairly.
Privacy, Security, and Civil Rights
AI in the workplace can collect:
● Location data.
● Biometric data (fingerprints, facial scans, voice data).
● Activity tracking (movements, keystrokes).
Workers with disabilities may not have agreed to this, and may not know the risks. Employers should:
● Limit how much data they collect.
● Use strong security and encryption.
● Let workers opt in or out when possible.
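As a small illustration of the first two points, here is a hypothetical sketch that keeps an aggregate metric instead of raw keystroke logs and replaces the worker ID with a salted hash. The field names, worker ID, and salt handling are invented and simplified, not a production design.
```python
# Hypothetical sketch of data minimization and pseudonymization using
# only the standard library. All identifiers below are invented.
import hashlib

def pseudonymize(worker_id: str, salt: bytes) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256(salt + worker_id.encode()).hexdigest()[:12]

# Instead of storing every keystroke, keep only the aggregate value
# the stated purpose actually needs.
record = {
    "worker": pseudonymize("W-114", b"example-salt-rotate-in-practice"),
    "metric": "hours_station_active",
    "value": 6,
}
print(record)
```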
AI can also break civil rights laws if used unfairly, including the ADA and Title VII of the Civil Rights Act. Several states have their own laws or proposals on AI and discrimination.
Final Points
If AI is built on biased data, it will continue to exclude people. It is very hard to fix AI after it’s built. The best approach is to:
● Include people with disabilities from the start.
● Make AI fair, accessible, and transparent from the beginning.
● Keep testing and improving it over time.
State Legislation: AI
This section explains how states are making laws about artificial intelligence (AI) and other automated systems, and what these laws mean for workers with disabilities.
Overview
Federal lawmakers have been slower to act on AI. States are moving faster to write rules. Many state proposals aim to:
● Reduce bias in AI systems.
● Protect people from unfair treatment.
● Control how AI is used in jobs, public benefits, housing, and other areas.
However, most bills do not directly address how AI affects people with disabilities. Some list disability as a protected group but don’t include detailed rules for preventing or fixing disability discrimination.
Three Main Areas of Protection
General worker protections – Labor laws, union rights, and civil rights that apply to all workers.
Disability protections – The Americans with Disabilities Act (ADA) and similar laws that protect workers with disabilities from discrimination in hiring and employment.
Technology-specific rules – New laws about AI and automated decision-making, which may or may not mention disability.
The first two already have some legal structures. The third area is still developing and varies from state to state.
Industry-Backed Bills
In at least 9 states, technology industry lobbyists have helped write AI laws. These bills often:
● Look protective on the surface but have major gaps.
● Use “disparate impact” testing, which does not work well for disability discrimination because disabilities vary widely (see the sketch after this list).
● Leave out accessibility and accommodations requirements.
● Delay accessibility by making companies “retrofit” systems later.
● Give enforcement to state Attorneys General instead of civil rights agencies.
● Create a new definition of “algorithmic discrimination” separate from other discrimination laws.
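To show why disparate impact testing falls short, here is a hypothetical sketch of a standard check (the "four-fifths rule" commonly used for race, sex, and age) and how pooling very different disabilities into one group can hide total exclusion of a small subgroup. All counts are invented.
```python
# Hypothetical sketch of the four-fifths rule and why it maps poorly
# onto disability. All counts below are invented.

def selection_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's selection rate."""
    return (selected_a / total_a) / (selected_b / total_b)

# For a large, relatively uniform group, the test can be meaningful:
print(round(selection_ratio(40, 100, 60, 100), 2))  # 0.67 -> below 0.8, flagged

# "Disability" is not one uniform group. Pooling everyone together can
# hide complete exclusion of a small subgroup:
blind_selected, blind_total = 0, 3      # 0% of blind applicants selected
other_selected, other_total = 48, 97    # ~49% of other disabled applicants
pooled = selection_ratio(
    blind_selected + other_selected, blind_total + other_total,
    60, 100,
)
print(round(pooled, 2))  # 0.8 -> passes the four-fifths test even though
                         # every blind applicant was screened out
```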
State Examples
California
Has many AI-related bills. Examples include:
● Worker Rights: Workplace Technology Accountability Act (AB1651) – Worker data protections and review rights.
● California Consumer Privacy Act changes (AB1824) – Requires businesses to respect “opt-out” instructions when personal data changes hands.
● AI Transparency Act (SB942) – Requires AI content detection tools and a state registry.
● High-Risk Automated Decision System Act (AB302) – Requires a list of high-risk AI systems used by the state.
● AI Research Hub Act (SB893) – Would create a state research hub for AI.
● Watermarks for AI Content (AB3050) – Would require watermarks on AI-generated material.
● Generative AI Accountability Act (SB896) – Requires risk reports for certain AI uses.
● AI Technology Act (SB970) – Regulates synthetic media and warns consumers about risks.
● Political Ads AI Act (AB2355) – Requires AI disclosures in political ads.
● Legal Professionals AI Act (AB2811) – Would make lawyers disclose AI use in court filings.
● Universal Basic Income: AI Job Loss (AB3058) – Would give $1,000 per month for a year to people losing jobs to AI.
● Automated Decision Tools Act – Requires impact assessments; has enforcement gaps.
Connecticut
SB2 – Wide-ranging AI bill that includes:
● Development rules for AI.
● An AI Advisory Council.
● Rules for synthetic media and election-related deep fakes.
● Training programs and a public “AI Academy.”
Does not specifically address disability in depth.
Georgia
HB890 – Would ban AI discrimination based on disability and genetic information, but has no enforcement process.
Hawaii
HB2152 – Would require risk checks before the state buys AI tools and create training for fair AI use.
Illinois
Multiple bills, including:
● Automated Decision Tools Act (HB5116) – Requires annual impact assessments. Has size and enforcement limits.
● Commercial Algorithmic Impact Assessments Act (HB5322) – Similar to HB5116 with some differences.
● State Agencies-AI Systems (HB4836) – Requires state AI systems to follow national standards.
● AI Use in Government Contracts (HB5228/HB5099) – Vendors must disclose AI use.
Maryland
HB1255/SB957 – Would ban most private employers from using AI tools to make hiring and pay decisions.
New Jersey
Several bills (A3855, S2964, A3854, A4030, S1588) – Would require bias audits and regulate AI in hiring.
New York
LOADinG Act (A9430/S7543) – Would regulate AI use by state agencies and require public disclosure. Other bills would limit electronic monitoring, require notices to job applicants about AI, and set sale rules for AI hiring tools.
Oklahoma
HB3453 – “AI Bill of Rights” that includes privacy protections and anti-discrimination rules. Did not pass the Senate. Other bills (HB3835, HB3293) include disability as a protected class.
Rhode Island
Two major new laws:
● Data Transparency and Privacy Protection Act (HB7787/SB2500) – Broad data privacy law with limits on targeted ads, profiling, and sensitive data use.
● Automated Decision Tools Act (HB7786/SB2888) – Would require risk management for high-risk AI systems.
Key Issues Across States
● Few bills center disability – Even when listed as a protected group, there’s often no detail about how to prevent or fix AI-related disability discrimination.
● Weak enforcement – Many bills lack strong penalties or give oversight to agencies without civil rights expertise.
● Vague language – Broad terms like “bias” and “discrimination” without clear definitions make it hard to enforce protections.
● Private right of action missing – Many bills prevent individuals from suing for violations.
What Advocates Can Do
Disability and worker advocates can:
● Push for AI laws that treat algorithm-based discrimination the same as other discrimination.
● Support bills that allow private lawsuits and strong public enforcement.
● Require accessibility and accommodations to be built in from the start.
● Connect AI protections to existing worker rights, health and safety, and union protections.
Recommendations for Action
This section gives steps for employers, workers, policymakers, researchers, and the tech industry to make sure new technology in the workplace is fair, safe, and inclusive for workers with disabilities.
Artificial Intelligence (AI)
AI can help make jobs more accessible, but it can also create bias, invade privacy, and cause harm if it is not built and used carefully.
For researchers, non-profits, and organizers:
● Collect accurate, ethical data that includes workers with disabilities.
● Store and share data so all relevant groups can use it.
For companies and employers:
● Use AI to help collect data, not to replace human decision-making.
● Keep humans involved in important job decisions.
● Take responsibility for harm caused by AI decisions, including health, safety, money, and privacy impacts.
● Have strong policies to protect worker safety and privacy and handle personal data responsibly.
For hiring managers:
● Write job descriptions that are realistic and inclusive.
● Be open about how your company uses technology and what happens with collected data.
● Push for transparency in hiring practices.
For disabled workers:
● Share concerns with trusted co-workers, advocates, and unions.
● Work with supportive managers to improve conditions.
For the tech industry and developers:
● Understand how bias works and plan to avoid it.
● Build tools that can be used in different ways so more people can use them.
● Test technology for accessibility before release, especially if it is bought from outside vendors.
● Include people with disabilities in all stages of design and testing.
● Work with experts in ethics, social science, and accessibility.
● Build systems that make it easier—not harder—for qualified disabled people to get jobs.
● Use independent audits to check for bias and access problems before launch.
Surveillance Technology
If used in the workplace, surveillance tools should be used to improve safety and working conditions, not to punish workers.
For companies:
● Be clear about what is being monitored, who sees the information, and how it is used.
● Focus on workplace conditions like temperature and air quality, not on constant worker tracking.
● Avoid systems that judge individual productivity without context. Consider team-based measures over longer time periods.
General Workplace Practices
For all organizations:
● Create privacy and security tools for workers who are organizing, especially smaller groups working online.
● Redefine productivity to keep experienced workers longer and reduce turnover.
For non-profits, organizers, and advocates:
● Provide direct help—financial, legal, technical—to workers and organizers.
● Build partnerships with groups in disability rights, worker rights, and technology ethics.
For researchers:
● Study how inclusive workplaces can benefit companies over time.
● Work with people directly affected by policies and tools.
● Identify where resources are missing and make information easy to find.
● Look for funding from multiple sources, not just disability-specific grants.
Policy Changes
For lawmakers and regulators:
● Write clear rules for AI and automated systems that protect rights, health, safety, and privacy.
● List disability, health conditions, and genetic information as protected categories in law.
● Protect workers from retaliation when they organize or bargain collectively.
● Fund worker support centers and resources.
● Limit harmful workplace demands caused by surveillance or automation.
● Offer tax incentives for hiring and keeping workers with disabilities and for developing accessible technology.
● Allow both public agencies and private individuals to take legal action when their rights are violated.
● Support laws like the Protecting the Right to Organize (PRO) Act to strengthen union rights and penalties for violations.
Research Priorities
More study is needed to understand how technology affects different workers with disabilities in different settings.
Key questions:
● How does technology affect disabled people in gig, seasonal, and contract work?
● How does it affect people who are also part of other marginalized groups?
● How do impacts differ in rural vs. urban areas?
● How can worker centers and unions better protect disabled workers?
● How can education for software developers include ethics and inclusive design?
● How do different Native, First Nations, and Indigenous communities experience these changes?
● How does buying power influence disability inclusion in tech companies?
● What does “inclusion” really mean, and how can we measure it?