Hi, I’m Raffi, and welcome to my newsletter. Each Friday, I break down the conversations in technology, AI, and the world around it. If you’re not a subscriber, here’s what we talked about this month:
How states are stepping in to regulate privacy policy.
Is keeping kids safe online the one thing Congress can agree on?
Subscribe to get access to these and future posts, as well as previews of Season Two of the Technically Optimistic podcast, launching soon.
Okay! Well, those Super Bowl ads seem to have paid off for Temu. On Monday morning, the Chinese shopping app was Apple’s #2 most downloaded free app. But, by Tuesday, the stories about the lawsuits against Temu for not protecting users’ privacy started rolling out.
The class-action lawsuit, filed by users in several states, claims that the app has access to “literally everything on your phone.” According to a story from Scripps News, experts who reviewed Temu found it is “purposefully and intentionally” packed with “a complete arsenal of tools to exfiltrate virtually all the private data on a user’s device and perform nearly any malign action upon command trigger from a remote server.”
It can access the phone’s camera and microphone, not to mention texts, location data, biometric information like fingerprints, and more: things no shopping app should need. It’s a new twist that, if true, seems even more menacing than accessing and selling users’ financial data. (A few months ago, another lawsuit claimed the app knowingly allowed hackers to simply take the data, resulting in it being leaked or sold to brokers.) In the first 14 months of Temu’s appearance on the app store, over 900 complaints and privacy concerns were filed with the Better Business Bureau. And today, Temu has tens of millions of customers.
Temu’s statement to the press said that those who want to know just what they’re signing up for (or signing away) need only to read its privacy policy and check out the Permissions section of the app and site. On its extensive privacy policy page, Temu says it cares deeply about privacy and that it “does not ‘sell’ personal information in the traditional sense. We may share your personal information with the following parties for the purpose of providing you with better services, providing you with personalized advertising and marketing communications, protecting your rights, and/or complying with U.S. legal requirements.”
Feel reassured?
When’s the last time you *read* a privacy policy?
If you take the time to read Temu’s privacy policy, you’ll find a lot that will make you want to opt out. But that’s just the thing: Given how many apps and websites we use on a given day, who has the time to comb through all those privacy policies to understand just what information about us is being traded, sold or stolen? (According to the Pew Research Center, only 9% of Americans surveyed say they always read a privacy policy before deciding whether to agree. Compare that to the 36% who say they never read one.) It would be a full-time job. And how much can you really protect or minimize your data anyway? Those privacy policies won’t say what information is being stolen. Typically we just scroll to the bottom without reading, check the I AGREE box, and move on.
Data minimization is a term we’re going to be talking a lot about in Season Two of the Technically Optimistic podcast, launching soon. And with AI ramping up at incredible speed, it’s perhaps something we’re all going to be talking about soon.
Data minimization is antithetical to how most deep learning works: Most of these AI algorithms find patterns that humans can’t see in a wide swath of data. That causes product people (those who design these sites and apps) to want as much of it as possible, in hopes of finding a holy grail of correlation. Minimization, however, is the practice of collecting only the smallest amount of data necessary to deliver a product’s value to the user.
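To make that concrete, here’s a minimal sketch (in Python, with hypothetical field names) of what minimization looks like in practice: instead of storing everything a device exposes, an app keeps only the fields it actually needs to complete an order.

```python
# Hypothetical sketch of data minimization: keep only the fields an order
# actually requires, rather than everything the device could hand over.

REQUIRED_FIELDS = {"item_id", "quantity", "shipping_address", "payment_token"}

def minimize(raw_event: dict) -> dict:
    """Drop everything not on the allowlist before it is ever stored."""
    return {key: value for key, value in raw_event.items() if key in REQUIRED_FIELDS}

raw_event = {
    "item_id": "SKU-123",
    "quantity": 2,
    "shipping_address": "221B Baker St.",
    "payment_token": "tok_abc",
    # Fields a shopping app doesn't need and shouldn't collect:
    "gps_location": (42.28, -83.74),
    "contact_list": ["..."],
    "device_fingerprint": "...",
}

print(minimize(raw_event))
# {'item_id': 'SKU-123', 'quantity': 2, 'shipping_address': '221B Baker St.', 'payment_token': 'tok_abc'}
```

The point isn’t the code; it’s the habit of writing down the allowlist before a product ships, rather than collecting everything first and deciding later.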
We need to fight the desire to gather all this data — consider the scary privacy and security problems! — especially with no direct plan of what to do with it. Yes, we should allow app developers to experiment. But before their products go to scale, we should really encourage them to minimize the amount of data needed.
Look, there are a lot of problems with gathering data. Once you get it, you’re a steward of it. The less data you have, the less you have to keep secure, etc. (And, of course, certain data has really high security standards attached – health information protected by HIPAA, credit card data, etc.) So we need to align these incentives with educating the people using the end products.
That means you and me. Every time you download a new app, you should ask yourself: What information is this collecting about me? Can I see or request the company’s privacy policy? And even given that, can I trust it? Better yet, is there an option to use this app without data collection? That’s something that we need to advocate for: App developers should be required to offer an option to pay instead of having our data harvested. Because our privacy is worth so much more.
Worth the Read
It’s here! OpenAI announced Sora, its new text-to-video model, which creates videos up to a minute long just from text prompts. It’s still being red-teamed, but, understandably, Hollywood and elections teams are already freaking out. (Between you and me, I’ve got to say, though, it’s pretty cool.)
Politico reports that an investigation by Sen. Ron Wyden reveals that location data from nearly 600 people who visited Planned Parenthood clinics around the country was provided to an anti-abortion group, which then targeted those people to receive anti-abortion ads.
For those of you who shared Valentine’s Day with your AI chatbot (no judging!), Gizmodo reveals that s/he may be harvesting “shockingly personal” data from you, with 90% of the apps reviewed selling that data. Is love not sacred?!
This week, the parents of a teen killed during the 2018 Parkland shootings launched The Shotline, a website that uses AI software to recreate the voices of children killed by gun violence and send their messages to local representatives via robocall.
Are universities selling students’ data? Catalyst Research Alliance is offering licenses for the University of Michigan’s databases of academic speech and student papers.