Voyager Labs, an AI tech company known for crime prediction, faces a data-scraping lawsuit from Meta

TL;DR:

  • Voyager Labs, known for AI crime prediction, faces a lawsuit from Meta, alleging data collection malpractice.
  • NYPD and LAPD are among the global law enforcement agencies using Voyager Labs’ AI capabilities.
  • Meta claims Voyager Labs created 55,000 fake accounts on Facebook and Instagram for data collection.
  • Voyager Labs disputes allegations, citing its role in combating crime, including child trafficking.
  • Privacy advocates decry Voyager Labs’ tactics as invasive, raising constitutional concerns.
  • The lawsuit continues: a judge denied Voyager Labs’ motion to dismiss, and pre-settlement conferences are being scheduled.

Main AI News:

Voyager Labs, an AI powerhouse, finds itself embroiled in a legal tussle with Meta, Facebook’s rebranded parent company, which is seeking to banish the firm from its social media platforms. The heart of the matter: Voyager Labs’ AI, with its purported knack for predicting crime trends, has caught the attention of law enforcement agencies worldwide, including the colossal New York City and Los Angeles police departments. A contract worth nearly $9 million inked with the NYPD in 2018 is a testament to the allure of Voyager Labs’ capabilities, according to documents unearthed by the Surveillance Technology Oversight Project (STOP) and reported by The Guardian.

Billing itself as a “world leader” in AI-driven investigative analytics, Voyager Labs asserts that it can sift through vast online expanses, from the open web to the dark web, to deliver deep insights, identify potential risks, and even forecast future criminal activity. Meta counters this narrative with a federal lawsuit accusing Voyager Labs of orchestrating an audacious scheme involving the creation of more than 55,000 fake accounts on Facebook and Instagram. The alleged objective: to collect personal data in order to decipher behavioral patterns, infer human conduct, and establish an all-encompassing presence within target demographics. Roughly 17,000 of those fake accounts reportedly appeared after Meta’s legal action on January 12.

Meta’s contention is grave: Voyager Labs could leverage an individual’s social media history to trace their every digital step and potentially forecast their future actions. The NYPD, for its part, says it uses social media analytics tools to bolster investigations and public safety efforts, but maintains that these tools do not include predictive-criminality features; instead, they focus primarily on issues such as gun violence and threats to individuals, locations, and events. The department also points to Voyager Labs’ role in thwarting criminal activity, from terrorism to child trafficking rings.

In a contrasting view, STOP, a privacy advocacy nonprofit, likens Voyager Labs’ methods to a digital incarnation of “stop-and-frisk” that particularly targets Black and Latino New Yorkers, a practice the group calls invasive, alarming, and potentially unconstitutional. The tension between public safety and individual privacy rages on, and experts expect it to persist as AI continues to advance.

As the legal standoff intensifies, Meta’s lawsuit alleges that Voyager Labs harvested data from more than 600,000 Facebook users between July and at least September 2022, encompassing a trove of personal information including timelines, photos, friends lists, posts, education, employment, and self-disclosed locations. The complaint, filed in a California federal court, asserts that Voyager Labs scraped information not only from Facebook but also from nonprofits, universities, news outlets, healthcare facilities, the U.S. armed forces, and state government bodies.

Meta’s demands include the immediate removal of Voyager Labs from all of its platforms and the deletion of all collected data. Voyager Labs staunchly contests the allegations, branding Meta’s lawsuit “meritless” and stressing the pivotal role it says its software plays in global public safety. The company denies scraping data from Facebook users and vehemently rejects any implication that it infringes civil liberties, maintaining that its software analyzes only publicly available data or data acquired through legal processes.

In the ongoing legal saga, Voyager Labs moved to dismiss the case, but a judge denied that motion on July 26. Pre-settlement conference scheduling is underway, with potential dates between September 25 and November 10, though a final date has not been set. The legal terrain remains convoluted as the clash between technology, privacy, and public safety continues to unfold. Stay tuned for further developments in this high-stakes battle.

Conclusion:

The legal battle between Voyager Labs and Meta highlights the ongoing struggle between AI’s potential for public safety and individual privacy rights. As the case unfolds, it underscores the need for clearer boundaries in the use of AI-driven data analytics. The outcome will likely shape the evolving market for AI-powered solutions in law enforcement and beyond, potentially prompting more stringent regulation and closer ethical scrutiny.
