So Insecure
The AT&T breach affects 100m users. It’s just beginning.
If you’re reading this newsletter, chances are you're an AT&T user. If so, the chances are even higher that this week, you received an email informing you of a little data breach. Make that a massive breach: Over 100 million AT&T customers were affected.
Hackers broke into Snowflake, the third-party cloud platform AT&T uses for storage, and stole the call and text records of more than 100 million customers — including the phone numbers of the people they called or texted — covering a six-month period in 2022, plus a single day in 2023. Location data may also have been exposed via cell tower IDs.
Just to be clear: It doesn’t seem that names, the contents of text messages, or other personal information, such as dates of birth or Social Security numbers, were stolen. Still: Hackers can use these numbers for phishing texts and calls. And they’re patient (and smart) enough to wait months, until we’ve forgotten the news.
Although only AT&T customers were reportedly notified, the impact of this hack could be far more widespread. Even if you're not an AT&T customer, your number could be in this breach because an AT&T customer called or texted you. (Or, less obviously, you may be on a mobile virtual network operator that uses AT&T as its backbone. If you're a Cricket customer, for example, you're actually on the AT&T network.)
Plenty of databases map phone numbers to names, which means a fairly complete picture of call histories, personal networks, and more is now out there.
What your data really means
This is an example both of our not knowing what our data really means — the fact that you texted a friend, a co-worker, an illicit lover, or an ex is now out there, and you probably never thought that “data” on those interactions was being collected — and of our implicit trust in companies like AT&T to hold this potentially highly sensitive information.
This isn't the way technology should work.
But, I mean, this is the way it works, right? Not just with data breaches: If AT&T were subpoenaed, it would have to hand over the data. We don’t just see this in the movies; in the days following Saturday’s assassination attempt, every news report ended with the sentence, “The police are seeking permission to unlock the gunman’s phone” (which they weren’t able to access until Monday). The same thing happens with most of the platforms we talk about these days — X, Facebook, etc. That’s because our phones and computers contain just about every key to our lives.
(I'll note that when I was at Twitter, we prided ourselves on the fact that if law enforcement wanted data, we would alert the account owner, even when we didn't have to. We fought these requests, and we posted them on our transparency page. On top of that, we had lots of conversations about what data we should even gather, because we didn't want to be stewards of it. All credit goes to Alex Macgillivray and his team.)
A truly private network
Sure, now senators are calling for AT&T to explain what went on. But maybe there are solutions so that we don't have to trust companies to get it right?
On the Season Two episode of the podcast about how to save social media, we spoke to Meredith Whittaker, the president of the Signal Foundation, a nonprofit dedicated to open-source privacy technology. The foundation designed the system and network behind its Signal Messenger app so that even it doesn't know who you are talking to or what your messages say. If law enforcement came to them, they would have to say, "Sorry, we literally don't have it." If hackers broke in, there would be nothing for them to steal. That's just the way the system was built.
We need more of these systems in the world. And we need to think more deeply about how we expect technology to work. We wouldn't expect a pencil or pen to be able to tell us who the last user spoke to and what it wrote. Why should our phones be able to betray us?
We all need to demand more of our technology providers.
Got hacked?
Okay, putting that aside for some concrete advice: What should you do if you were in the AT&T hack?
Right now, your entire call and text history is out there. I would have said this before the hack, but now it’s more important: If you get a text from your aunt asking for money in the coming months, don't assume it's really her.
Your number isn’t the only data floating around out there. Thanks to AI, I think this is the year that most biometric identification will start to fail. If you hear somebody's voice on the phone, is it really them? Call them back, or find another way to confirm their identity. Consider having a safe word with your family.
If you haven't done it already, make sure two-factor authentication (2FA) is turned on for all your online accounts (bank, social media, etc.). And don't use SMS-based 2FA — you know, when you get texted an authorization code. It's not secure: attackers can hijack your phone number with a SIM swap and intercept those codes. Use an authenticator app or a hardware key instead.
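For the curious: the reason an authenticator app beats SMS is that the one-time code never travels over the phone network at all. It's computed locally from a shared secret and the current time, per the open TOTP standard (RFC 6238) that apps like Google Authenticator implement. Here's a minimal sketch of that algorithm; the secret below is the public test value from the RFC itself, not anyone's real credential.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, t=None, interval=30, digits=6):
    """Minimal RFC 6238 TOTP (HMAC-SHA1), the scheme most authenticator apps use."""
    # Decode the base32 shared secret (the string inside the QR code you scan)
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor: how many 30-second windows have elapsed since the Unix epoch
    counter = int(time.time() if t is None else t) // interval
    # HMAC the big-endian counter with the shared secret
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    # Dynamic truncation: take 4 bytes at an offset given by the last nibble
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in base32) at test time t=59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59))  # → 287082
```

Because both sides derive the code from the secret and the clock, there's nothing for a SIM-swapper to intercept in transit.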
And honestly, consider using Signal from here on out for text messaging — and probably even calling.
Please let me know what you’re doing to protect yourselves. Comment below, or write to me at us@technicallyoptimistic.com.
Shout-out to Justin Hendrix at Tech Policy Press, who wrote a thoughtful response to my post citing that hot-button essay going around about situational awareness in the decade ahead. Justin is technically optimistic, too: He pointed out that we need to invest in alternative versions of the future, and I agree: We need to zig while others are zagging.
Worth the Read
Let the US-EU tech showdown begin: Meta announced it is holding off on releasing Llama, its next multimodal AI model, in the EU “due to the unpredictable nature of the European regulatory environment.” (Apple made a similar move last month.)
The Andreessen Horowitz podcast backs Trump over Biden through the lens of little tech. “Sorry Mom,” says Horowitz. “I know you’re going to be mad at me, but we had to do it.”
You may have noticed during your summer travel that TSA is now using facial recognition scans. It saves time, yes, but: data. You can, however, opt out. Here’s how.
Michigan is joining the list of states to protect our data. (Rhode Island, too! And don’t forget Texas.) Just imagine if we could have a federal data privacy law….
Toys “R” Us made a promo film entirely using OpenAI’s Sora — a model that can generate videos up to 60 seconds long — to which its agency had early access. Cute? Or terrifying.
The Times reports that an algorithm used by police in Spain to assess the risk of gender violence has led to devastating deaths.