Wednesday, October 16, 2024

TikTok users paid over privacy violations — Google, Snap could be next


This week, TikTok users across the country who created videos on the app before September 30, 2021, began receiving payments between $27.84 and $167.04 following a $92 million class-action data privacy settlement with the social media platform.

The largest checks went to short- and long-term residents of Illinois, where TikTok was sued for violating the state’s strict biometric privacy law by collecting facial recognition data and feeding it into its algorithms without user consent.

Not everyone who uses TikTok in the U.S. is getting a check, because a comparable federal law doesn’t currently exist. But the lawsuit “asserted a variety of common law and other types of claims” in state and local courts to maximize the number of people who could get a payout, Katrina Carroll, a founding partner at Lynch Carpenter LLP and one of the case’s co-lead counsel, tells CNBC Make It.

Up to 89 million people qualified to submit a claim, according to the settlement.

This isn’t the first time TikTok has come under fire for exploiting user privacy: In 2019, it agreed to pay $5.7 million to settle Federal Trade Commission allegations of illegally collecting and storing personal information of minors.

The social media platform, owned by Chinese company ByteDance, is just the latest tech business to pay out over alleged violations of Illinois’ biometric data law.

In May, 1.4 million current and former Illinois residents received checks and virtual payments of up to $397 from a similar $650 million settlement with Facebook, which allegedly used facial recognition data without consent to prompt users to tag their friends in photos.

More checks from privacy lawsuits are likely on their way. Last month, a judge approved a $100 million settlement against Google, with 420,000 Illinois residents set to receive about $150 each, according to the Chicago Tribune.

In August, some Snapchat users received notice to submit a claim by November 5 to take part in a similar $35 million lawsuit against the company. Sandwich chain Pret A Manger and photography company Shutterfly have also settled similar lawsuits over the past 13 months.

On the surface, facial recognition features on social media seem harmless, if not beneficial to the user experience — but experts say there are underlying consequences.

For instance, New York-based software company Clearview AI claims to have scraped more than 20 billion images from sites like Facebook, YouTube and Venmo to “help law officials accurately and rapidly identify suspects, protect victims and keep communities safe,” according to its website.

In a May court settlement in Illinois, Clearview agreed to stop selling its information to both individuals and private businesses in the U.S. The company is also banned from doing any business in Illinois for the next five years.

That same month, Clearview was fined £7.5 million (currently equivalent to $8.66 million) by a U.K. privacy regulator. France and Italy’s data protection agencies each fined the company 20 million euros ($19.91 million) within the last year, too.

But experts are still concerned about both Clearview and other companies like it. Matthew Kugler, a privacy law professor at Northwestern University, told CNBC Make It in May that such businesses have the potential to eliminate our anonymity.

“As we walk down the street, everyone can see our face, but only some people can link our face to our name,” Kugler said. Easy-access facial recognition data could make it easier for people to harass their local barista, or jeopardize the lives and safety of domestic violence victims, sex workers and people in witness protection programs, he added.

In 2019, a study Kugler authored found that 70% of its participants were uncomfortable with companies using facial recognition data to track individuals’ locations and serve targeted ads.

The Illinois law, along with ones in Texas and Washington, helps limit the collection of that data: Users in those states can’t access Meta’s “face filter services” on Instagram or Facebook, for example.

Similar laws are set to go into effect in California, Colorado, Connecticut, Utah and Virginia next year.

Sign up now: Get smarter about your money and career with our weekly newsletter

Don’t miss:

You should still apply for student loan forgiveness even while it’s on hold

You can now ask Google to remove your personal data from its search results—here’s how
