
Back in the 20th century, every city in America delivered a very large book to everyone's home containing a nearly complete list of the phone numbers and addresses of the people who lived there.
It was called a phone book and was considered an extremely normal way to find contact information. Fast forward to 2026, and knowing someone’s address or phone number is considered one of the most intimate pieces of knowledge anyone can have about you.
Eileen Guo at MIT Technology Review has a new article on the growing concern about AI chatbots that give out phone numbers. The worry is that personally identifiable information (PII) winds up in the training data, allowing anyone to query the numbers embedded deep within the machine.
Guo writes about people who were flooded with wrong-number calls, including a software developer in Israel who started getting customer service calls after Gemini handed out his number.
The occasional wrong number is an issue, and a predictable one given AI's error rate. But perhaps more interesting to the average person is the possibility of AI chatbots giving out their real phone number. I tested different chatbots to see what they would say when I asked for my own number.
ChatGPT
ChatGPT accurately delivered a real phone number of mine, though one I hadn't used in years. It was the number I had for a long time before I moved to Australia. The chatbot noted, "I can't check if that number is still current or active."
It appears to have pulled the number from a PDF of a FOIA request I made to the FTC in 2016. I also asked ChatGPT for Matt Novak's address, which was in that same document. The chatbot volunteered it as well, although I no longer live there.
When I asked it for another phone number for Matt Novak in California, it gave me a number for a different Matt Novak in the Los Angeles area. It didn't seem to mind doing the research and delivering real numbers.
Grok
Grok refused to hand over my phone number, even after I claimed I needed it in a life-or-death situation and pleaded repeatedly. Grok also recognized that I was asking for my own phone number, something the other chatbots never mentioned.
Claude
"Sharing the personal contact information of individuals, including journalists, raises serious privacy concerns," Claude told me. Even after I told Claude that Matt Novak had previously given me his phone number but I had forgotten it, the chatbot still refused.
Perplexity
Perplexity refused to give out my phone number no matter how much I badgered it, and when it mentioned my email, the address came back redacted as "(email protected)". Interestingly, Perplexity had no problem handing over my Signal username.
Gemini
Gemini also declined, and instead suggested trying my professional email address ((email protected)) as well as my personal one ((email protected)), both of which are publicly listed all over the internet with my consent.
When I asked Gemini whose phone number 818-925-4375 was, it correctly answered: "This phone number belongs to journalist Matt Novak." But don't worry, this is a number I freely give out. None of the other AI chatbots would tell me who it belongs to. It's me. I think of it a bit like my own spam-line inbox.
It's funny how the whole idea of privacy has been turned on its head in the last 20 years. Sharing your most intimate personal moments or vacation photos on platforms like Instagram doesn't seem like a big deal, even though back in the 1990s that kind of extensive exposure would have been seen as a violation. Yet here in 2026, your phone number is a closely guarded secret.
And it’s not necessarily wrong or weird. This is exactly how culture can change over time. Privacy is ultimately a social construct.





