Contract Workers Allege Unfair Treatment and Organizing Retaliation in the AI Industry

TL;DR:

  • A group of contract workers training Google’s AI chatbot claims they were fired for speaking out about low pay and unreasonable deadlines.
  • The workers filed a complaint against their employer, Appen, with the National Labor Relations Board.
  • Concerns were raised about the exploitation of chatbot raters and the potential risks of a faulty product.
  • Google stated that the workers’ right to unionize or engage in organizing activities is a matter between them and Appen.
  • The workers’ fight has expanded beyond better working conditions to address the societal impact of AI.
  • The AI industry’s rapid expansion has led to increased competition and concerns over biased and false information.
  • Google and other Big Tech companies rely on external contract workers to enhance their AI tools.
  • The workers hope that their NLRB complaint will lead to job reinstatement, back-pay, and a discussion with Google about AI concerns.
  • Contract workers are vital contributors to AI development but often go unrecognized.

Main AI News:

A group of contract workers tasked with training Google’s AI chatbot claims to have been dismissed for raising concerns about inadequate pay and unreasonable deadlines, which they argue prevented them from doing their jobs effectively and ensuring that the bots operate without causing harm.

Six workers have filed a complaint with the National Labor Relations Board, accusing their employer, Appen, a prominent provider of contract workers for leading Big Tech companies, of illegally terminating them in response to their organizing efforts. These workers had spent nearly a year advocating for improved pay and working conditions. Just two weeks after one of the most prominent organizers among them sent a letter to Congress warning about the potential hazards posed by Google’s chatbot, known as Bard, they were fired.

One of the workers, Ed Stackhouse, 49, emphasized in a letter to two senators conducting a congressional hearing on AI risks that those responsible for rating the chatbots often faced insufficient time to evaluate longer responses. He expressed concern that the exploitation of these raters could result in a flawed and ultimately more perilous product.

Stackhouse stated that Appen cited “business conditions” as the reason for their dismissal. However, Appen has not responded to requests for comment on the matter.

Google spokesperson Courtenay Mencini clarified that Appen holds responsibility for the working conditions of their employees, encompassing pay, benefits, employment changes, and assigned tasks. While Google respects the workers’ right to join a union or engage in organizing activities, the spokesperson affirmed that it is ultimately a matter between the workers and their employer, Appen.

By addressing Congress, these workers have joined a growing chorus of voices alarmed by the rapid rollout of AI tools to millions of people. AI researchers, politicians, and technology advocates have raised apprehensions about the infusion of bias into tech products, the facilitation of cybercrime, the displacement of workers, and the potential loss of human control. What initially began as a fight for better working conditions has evolved into a larger conversation about the societal impact of AI, as Stackhouse noted.

The burgeoning interest in AI from both businesses and consumers has incited intense competition between Google and its archrival Microsoft in developing and selling AI tools, integrating the technology into existing products such as Google Search and Microsoft Word. The boom was triggered by OpenAI, a comparatively small company that astounded the world in November by releasing its ChatGPT chatbot. ChatGPT can hold coherent conversations, pass professional exams, and write computer code; its success can be attributed, in part, to the human testers and trainers who refined the bot, making it less offensive and more engaging than previous versions.

The swift pace and cutthroat nature of the AI boom have raised concerns among AI ethics experts who assert that the technology mirrors the racist and sexist biases entrenched in the vast troves of internet data used to train these systems. Furthermore, the bots routinely generate false information and present it as authentic.

In response to these challenges, Google and other major tech companies have turned to external contract workers, part of the extensive workforce amassed over the years, to fulfill various tasks, ranging from running cafeterias to writing computer code. For instance, Appen contractors like Stackhouse have played a vital role in enhancing Google Search by assessing the helpfulness and accuracy of search results.

Google had begun shifting the workload of these contract workers to Bard in February, even before the chatbot’s public release in March, when the company labeled it an “experiment.”

Michelle Curtis, 43, another terminated Appen contractor, described the experience as highly stressful. Curtis highlighted the time constraints imposed on raters when evaluating assigned tasks. With Bard, raters might have had just five minutes to evaluate a detailed response about the origins of the Civil War, for example. Curtis asserted that such tasks were simply impossible to complete within the given timeframe.

Both Curtis and Stackhouse were part of a group of Appen workers who had been striving to organize their colleagues and demand better pay and benefits, with support from the Alphabet Workers Union, a collective of Google employees and contractors affiliated with the Communications Workers of America. Curtis, a mother from Idaho, and Stackhouse, who lives in North Carolina and has a disability, viewed their part-time work-from-home arrangements with Appen as a lifeline for supporting their families.

In 2019, Google announced that its contractors would be required to pay their employees a minimum wage of $15 per hour. However, Stackhouse and Curtis claimed that Appen failed to meet this requirement: while the company eventually agreed to raise wages to $14.50 per hour, employees were not given enough hours to qualify for benefits, according to the workers.

By filing a complaint with the NLRB, these workers aim to escalate their fight and shed light on the low-paid labor underlying cutting-edge AI development. The process, which will likely take several months given the NLRB’s extensive caseload, offers the workers hope of regaining their jobs, receiving back pay, and possibly engaging in a constructive dialogue with Google to address their concerns about AI, as expressed by Curtis and Stackhouse.

Stackhouse aptly referred to contractors as “ghost workers” who often go unrecognized for the immense value they bring to Google. He emphasized, “Without people doing our jobs, there would not be any AI.”

Conclusion:

The allegations made by the contract workers against Appen and their fight for better treatment and recognition highlight the challenges faced in the AI industry. The industry’s rapid growth and competitive nature, coupled with concerns over bias and false information, require careful attention to ensure ethical and responsible AI development. Companies like Google must address these labor issues and engage in meaningful discussions to foster a fair and inclusive market for AI technologies.

Source