
Just Cause for NYC Gig Workers Provides Human Review for Algorithmic Firings

by Andrew Wolf

App workers provide essential services to customers, such as delivering food to the sick and transporting the elderly to appointments, yet they receive minimal benefits and protections. App companies classify their workers as independent contractors, allowing them to avoid responsibility for paying the minimum wage or complying with basic employment standards and benefits, such as unemployment insurance.

Moreover, while workers in the United States can be fired at any time under “at-will” employment, for most, the decision to terminate is still made by a human manager. For app-based workers, however, this is rarely the case. Instead, they face termination by algorithm with little or no human review, let alone recourse.

Because both the app-taxi and app-delivery industries are dominated by two or three employers, losing access to these apps through wrongful deactivation can be devastating for workers.

Two new bills proposed in the New York City Council (Int. No. 276 and Int. No. 1332) would change this for Uber, DoorDash, and other app-based workers in New York City who are wrongfully deactivated from the apps.

These bills would provide drivers with “just cause” protections, allowing them to appeal and arbitrate wrongful terminations. Recent academic research out of the University of Washington suggests a strong need for such protections in the gig economy, where workers are frequently terminated by opaque algorithms and by arbitrary, often discriminatory, reprisals from customers. Passing just cause protections for app workers would ensure human review of algorithmic firings, providing much-needed oversight for this precarious workforce.

The proposed laws seek to address the devastating effects of wrongful deactivation by extending the just cause protections the City first adopted for fast food workers in 2021. They would require companies to clearly articulate their deactivation policies, follow a progressive discipline process, give fourteen days’ notice of deactivation, and provide a clear written explanation of why the worker is being deactivated. These protections would be overseen by the Department of Consumer and Worker Protection’s (DCWP) Office of Labor Standards Enforcement and would include an appeals process before a panel of neutral arbitrators.

The need for just cause protections with human oversight is acute: outsourcing termination decisions to algorithms opens the door to widespread error and discriminatory customer abuse. This was clear in a recent study of 350 drivers conducted by the Asian American Legal Defense and Education Fund (AALDEF), which found that most deactivated drivers were high performers and workers of color who were deactivated over vague allegations of “customer complaints.” The study also found that 90% of drivers never got their jobs back.

In the most recent NYS Empire Poll, conducted by Cornell University’s School of Industrial and Labor Relations, app workers in New York State reported experiencing discrimination on the job at much higher rates (22%) than non-app workers (6%). The same poll found that app workers (38%) had experienced, or likely experienced, verbal or physical mistreatment at twice the rate of non-app workers (19%). Of the app workers who experienced this mistreatment, 47% attributed it to discrimination, compared to 32% of non-app workers.

In preliminary results from a national sample compiled by Andrew Wolf, assistant professor at the Cornell ILR School, 66% of Uber drivers in the United States reported having experienced, or likely experienced, verbal or physical mistreatment on the job. Of those, around two-thirds said the mistreatment was due to discrimination. Wolf’s analysis found that Black, Latino, Muslim, and Jewish drivers were statistically more likely to experience discrimination while working for the apps than White and Christian drivers.

This discrimination leads to deactivation when customers leave bad reviews for drivers. 

Wolf found that Black, Latino, and Muslim drivers were statistically more likely to report receiving unfair customer reviews due to discrimination. Black drivers in particular were also more likely to report being punished by the apps and facing payment issues due to discrimination. They were also more likely to be sexually harassed and to have problems with the apps’ facial recognition software.

The rampant encoding of customers’ biased reviews into algorithmic deactivation decisions highlights the pressing need for app workers to have just cause protections.

Just cause protections, like the one passed in Seattle in 2021, have been shown to work even for independent contractors like app workers. The University of Washington research mentioned above reviewed the first year of implementation of Seattle’s Uber driver law and found that it had a major impact in preventing wrongful deactivations. In 80% of arbitrations the driver’s deactivation was overturned, compared to the 10% reinstatement rate found in the AALDEF study of Uber’s internal system in NYC. The Washington study found that most drivers were deactivated for minor issues, like paperwork, and that the just cause system was especially effective at catching these errors. It also found that drivers of color were reactivated at higher rates than their peers, indicating that these workers faced more discriminatory customer complaints. In addition, 76% of drivers reported that the apps did not verify the passenger complaints that led to their deactivation, and fear of customer accusations led drivers to underreport verbal, physical, and sexual abuse by customers.

The explosion of algorithmic and AI systems used in employment decisions is generating a growing need for oversight. Nowhere is this problem clearer than in the gig economy, where the apps have eliminated human managers altogether, relying instead on algorithmic management and customer reviews to manage workers. The research shows this opens the door to large amounts of discriminatory behavior being encoded in the algorithms’ decisions, a problem likely to worsen as machine learning systems trained on past practices further encode and obscure discriminatory customer inputs. Moreover, the lack of human review is causing many errors and terminations over technical issues, as the University of Washington study shows.

New York City’s proposed just cause laws for app-based workers would go a long way toward rectifying these issues and ensuring that these essential workers do not lose their livelihoods. The laws also point to a path forward for ensuring greater fairness in the new world of algorithmic workplace management.

Photo credit: Hispanolistic

Andrew Wolf

  • Assistant Professor, Global Labor and Work