OpenAI Inc., the company behind ChatGPT, has been sued for allegedly harvesting “vast amounts” of personal information to train its artificial intelligence models for profit.
Sixteen plaintiffs, identified only by pseudonyms, recently sued the company and Microsoft, its biggest backer, alleging that ChatGPT-based AI products collected and shared their personal information without their consent.
The complaint, filed in federal court in San Francisco, alleges that the two companies obtained the data used to train their AI models unlawfully.
The Clarkson Law Firm, which brought the case, withheld the plaintiffs’ full names out of fear of retaliation. Based on the millions of people allegedly affected, the firm estimates damages at $3 billion.
The lawsuit against OpenAI
According to the 157-page lawsuit, OpenAI illegally scraped 300 billion words from the internet, including personal information. “Defendants took a different approach: theft,” the complaint states.
The complaint adds that OpenAI did this without notifying the people affected and without registering as a data broker, as the law requires.
The lawsuit alleges that the two companies’ AI products “collect, store, track, share, and disclose” the personal information of millions of people, including product details, account information, names, contact information, login credentials, emails, payment information, transaction records, browser data, social media information, chat logs, usage data, analytics, cookies, searches, and other online activity.
The complaint accuses the companies of risking “civilizational collapse” through the scale of data they collect, store, and use in their AI products.
“With regard to personally identifiable information, defendants don’t do enough to filter it out of the training models, putting millions of people at risk of having that information shared quickly or otherwise with strangers around the world,” the complaint states, citing The Register’s March 18, 2021, special report.
OpenAI developed the text-generating large language models behind ChatGPT, including GPT-2 and GPT-4. Microsoft has embraced the technology, building it into Windows and Azure.
The 157-page lawsuit cites numerous scholarly articles and news reports. It raises broad concerns about AI models and ethics but offers few concrete examples of AI harming people. Neither Microsoft nor OpenAI has responded to the $3 billion claim.
AI doubts
ChatGPT, the popular AI language model, has generated enormous excitement, but it and other generative AI applications are dogged by concerns over privacy and misinformation.
Experts, corporations, organizations, and governments worldwide are restricting its use. As AI products raise concerns for creative industries and fact-checking, the US Congress is weighing their benefits and risks.
A few months ago, OpenAI, the creator of ChatGPT, also called for stronger controls on “super-intelligent” AI to prevent it from destroying the world.
Greg Brockman, Ilya Sutskever, and CEO Sam Altman argue that a body akin to the International Atomic Energy Agency (IAEA), which monitors how atomic and nuclear energy is used around the world, is needed to keep humanity from accidentally building something capable of destroying it.
A New York attorney recently misled a US court by citing fictitious cases as precedents; he had asked ChatGPT for supporting authorities, and the chatbot hallucinated, fabricating many of them.
These and other incidents have deepened doubts about generative AI around the world. Many people worry it will take over their jobs and replace them, and the latest data-scraping allegations against OpenAI only add to that skepticism.
The claimants say OpenAI harvests vast amounts of personal information in order to win an “AI arms race.” According to the lawsuit, its product integrations let the company collect data such as Snapchat images and location information, Spotify listening preferences, Stripe financial records, and private conversations on Slack and Microsoft Teams.
The plaintiffs argue that OpenAI abandoned its founding goal of advancing AI “in the way that is most likely to benefit humanity as a whole” in pursuit of profit. The lawsuit states that ChatGPT is expected to generate $200 million in revenue in 2023.