Why the FTC’s case against Weight Watchers means the death of algorithms


The Federal Trade Commission has struggled over the years to find ways to combat deceptive digital data practices using its limited set of enforcement options. Now it has landed on one that could have a big impact on tech companies: algorithmic destruction. And as the agency gets more aggressive on tech by slowly rolling out this new type of penalty, the third time it has enforced it in a settlement in three years could prove the charm.

In a March 4 settlement order, the agency demanded that WW International — formerly known as Weight Watchers — destroy any algorithms or AI models it built using personal information collected without parental permission from children as young as 8 through its Kurbo healthy-eating app. The agency also fined the company $1.5 million and ordered it to delete the illegally collected data.

When it comes to today’s data-centric business models, algorithmic systems and the data used to create and train them are intellectual property, products central to how many companies operate and generate revenue. While in the past the FTC has required companies to give up ill-gotten monetary gains obtained through deceptive practices, forcing them to destroy algorithmic systems built with ill-gotten data could become a more common approach, one that modernizes FTC enforcement to directly affect the way companies do business.

A slow deployment

The FTC first used this approach in 2019, amid a wave of damning headlines that exposed Facebook’s privacy vulnerabilities and brought down political data and campaign consultancy Cambridge Analytica. The agency ordered Cambridge Analytica to destroy the data it had collected about Facebook users through deceptive means, as well as any “information or work product, including any algorithms or equations” built using that data.

It took another two years for algorithmic disgorgement to return, when the commission settled a case with photo-sharing app company Everalbum. The company was accused of using facial recognition in its Ever app to identify people in images without allowing users to turn it off, and of using photos uploaded through the app to help develop its facial recognition technology.

In this case, the commission asked Everalbum to destroy photos, videos, and facial and biometric data collected from users of the app and to remove products built using it, including “all models or algorithms developed in whole or in part” using such data.

Technically speaking, the term “algorithm” can cover any piece of code that makes a software application perform a set of actions, said Krishna Gade, founder and CEO of AI monitoring software company Fiddler. When it comes to AI specifically, the term usually refers to an AI model or a machine learning model, he said.
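To make the distinction concrete, here is a minimal sketch (not from the article, with entirely hypothetical data and labels) of what “a model built from data” means in practice: the trained model is a separate artifact derived from the training data, and it keeps its value even after the raw data is deleted, which is why disgorgement orders target the model as well as the data.

```python
# Minimal sketch: a model trained on (hypothetical) user data is a distinct artifact.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical records an app might collect: age, weight (kg), daily calories.
X = np.array([[9, 40, 1800], [12, 55, 2200], [35, 80, 2500], [42, 70, 2000]])
y = np.array([0, 0, 1, 1])  # some label the app tries to predict

model = LogisticRegression().fit(X, y)  # the "algorithm" or "model" regulators refer to

# Deleting the raw data does not undo the model: its learned parameters still
# encode patterns from that data, so disgorgement requires destroying it too.
del X, y
print(model.coef_)  # the trained model, and its commercial value, persist
```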

Paving the way for algorithmic destruction within the FTC

It hasn’t always been clear that the FTC could use algorithmic disgorgement more regularly.

“Cambridge Analytica was a good move, but I wasn’t sure it would become a model,” said Pam Dixon, executive director of the World Privacy Forum, referring to the requirement that the company delete its algorithmic models. Now, Dixon said, algorithmic disgorgement will likely become a standard enforcement mechanism, much like monetary fines. “It should now be expected whenever it is applicable or is the right decision,” she said.

The winds inside the FTC seem to be changing. “Commissioners have previously voted to allow violators of data protection law to retain algorithms and technologies that derive much of their value from ill-gotten data,” former FTC Commissioner Rohit Chopra, now director of the Consumer Financial Protection Bureau, wrote in a statement related to the Everalbum case. He said requiring the company to “renounce the fruits of its deception” was “a significant course correction.”

“If Ever meant course correction, Kurbo means full speed,” said Jevan Hutson, partner at Hintze Law, a privacy and data security law firm.

FTC Commissioner Rebecca Slaughter has been a strong proponent of algorithmic destruction as a way to penalize companies for unfair and deceptive data practices. In an article published last year in the Yale Journal of Law and Technology, she and FTC attorneys Janice Kopec and Mohamad Batal pointed to it as a tool the FTC could use to advance economic justice and fairness in the use of algorithms.

“The principle is simple: when companies collect data illegally, they should not be able to profit from the data or any algorithm developed from it,” they wrote. “The power to seek this type of relief derives from the power of the Commission to order a remedy reasonably tailored to the violation of the law. This innovative approach to law enforcement should send a clear message to companies that engage in illicit data collection to train AI models: it’s not worth it.”

Indeed, some believe that the threat to the value of intellectual property and the viability of technology products could cause companies to think twice before using data collected by unscrupulous means. “Big fines are the price of doing business. Algorithmic disgorgement related to illicit data collection/processing is a real deterrent,” said David Carroll, associate professor of media design at The New School’s Parsons School of Design, in a tweet. Carroll sued Cambridge Analytica in Europe to obtain his 2016 voter profile data from the now-defunct company.

Forecasting future privacy enforcement

When people sign up to use the Kurbo healthy-eating app, they can choose a fruit- or vegetable-themed avatar, such as an artichoke, pea pod, or pineapple. In exchange for health coaching and help tracking food intake and exercise, the app requires personal information about its users, such as age, gender, height, weight, and their food and exercise choices, information that feeds the app.

In its complaint against WW, the FTC said that until late 2019, Kurbo users could sign up for the service either by indicating that they were a parent signing up on behalf of their child or that they were over 13 and signing up for themselves. The agency said the company failed to ensure that people signing up were actually parents or adult guardians rather than children pretending to be adults. It also said that from 2014 to 2019, hundreds of users who originally signed up for the app saying they were over 13 later changed their profile birth dates to indicate they were actually under 13, but continued to have access to the app.

The fact that the FTC applied algorithmic disgorgement under one of the country’s only federal privacy laws could be a sign that it will be used again, legal and policy experts said. While the Cambridge Analytica and Everalbum cases accused those companies of violating the FTC Act, the Kurbo case added a significant wrinkle, alleging that WW violated both the FTC Act and the Children’s Online Privacy Protection Act. Both are important pieces of legislation under which the agency can bring consumer protection actions against companies.

“This means that for any organization that has collected data illegally under COPPA, that data is at risk and models built on it are at risk of being disgorged,” Hutson said.

The use of COPPA could set a foundational precedent paving the way for the FTC to require the destruction of algorithmic models under future legislation, such as a federal privacy law. “It stands to reason that it would be leveraged in any other arena where the FTC has enforcement authority under a statute,” Hutson said.

Enforcement of algorithmic disgorgement in the context of COPPA relies on “a clear jurisdiction and trigger for enforcement through a law that exists and explicitly protects children’s data, [so] if there was a corollary law for everyone, it would allow the FTC to enforce it that way for companies that don’t just collect children’s data,” said Ben Winters, an attorney at the Electronic Privacy Information Center.

He added: “It goes to show that it would be really great if we had a privacy law for everyone, not just children.”
