A Cambridge Analytica-style AI scandal is coming


Wojciech Wiewiórowski, the European Union’s data protection watchdog, said the rapid pace of AI development means data protection regulators must be ready for another scandal like Cambridge Analytica.

Wiewiórowski is the European Data Protection Supervisor, and he is a powerful figure. His role is to hold the EU accountable for its own data protection practices, monitor technological developments, and help coordinate enforcement across the Union. I spoke with him about the lessons we should learn from the last decade in tech and what Americans need to understand about the EU’s data protection philosophy. Here’s what he had to say.

What tech companies need to learn: Products should have privacy features built in from the start. However, “it’s not easy to convince companies that they should adopt privacy-by-design models when they need to deliver quickly,” he says. Cambridge Analytica remains the best lesson in what can happen when companies cut corners on data protection, says Wiewiórowski. In what became one of Facebook’s biggest scandals, the company harvested the personal data of tens of millions of Americans from their Facebook accounts in an effort to influence how they voted. It’s only a matter of time before we see another scandal, he adds.

What Americans should understand about the EU’s data protection philosophy: “The European approach is connected to the purpose for which data is used. So when you change the purpose for which the data is being used, and especially when you do it against the information you gave people, you are breaking the law,” he says. Take Cambridge Analytica. The biggest violation wasn’t that the company collected the data, but that it claimed to be collecting it for scientific purposes and quizzes, and then used it for something else, mostly to build political profiles of people. The same issue was raised by Italy’s data protection authority, which temporarily banned ChatGPT: the authority says OpenAI collected the data it used unlawfully and did not tell people how it intended to use it.

Does regulation stifle innovation? That’s a common complaint among technologists. Wiewiórowski says the real question we should be asking is: Are we sure we want to give companies unrestricted access to our personal data? “I don’t think regulations really stop innovation. They are trying to make it civilized,” he says. After all, the GDPR protects not only personal data but also trade and the free flow of data across borders.

Big Tech’s hell on earth? Europe is not the only one playing hardball with tech. As I reported last week, the White House is weighing AI liability rules, and the Federal Trade Commission has even gone as far as demanding that companies delete their algorithms and any data that may have been illegally collected and used, as happened to Weight Watchers. Wiewiórowski says he’s glad to see President Biden calling on tech companies to take more responsibility for the security of their products, and he’s encouraged that American policy thinking is aligning with European efforts to prevent AI-related risks and hold companies accountable for the harms they cause. “One of the big players in the tech market once said, ‘The definition of hell is European legislation with American enforcement,’” he said.

Learn more about ChatGPT

The inside story of how ChatGPT was built, from the people who made it
