Recently, many users of ChatGPT have encountered a peculiar situation during their interactions with the chatbot: it addresses them by their names, a practice that was not common in the past. Several users have reported that ChatGPT seems to remember their names, despite the fact that they never explicitly provided this information.
A Creepy Encounter with Personalization
This unexpected feature appears to be limited to certain models of ChatGPT, particularly the OpenAI o3 version, which displays its reasoning process when generating responses. Simon Willison, a software developer and AI commentator, shared on X (formerly Twitter) a screenshot showing ChatGPT using his name in its reasoning. He questioned, “Does anyone appreciate the fact that o3 uses your name in its thought process, or do they find it creepy and unnecessary?”
The backlash from users highlights a significant challenge that OpenAI may face as it strives to make ChatGPT more personalized for its audience. Just last week, CEO Sam Altman suggested that future AI systems would be designed to “get to know you over your life,” aiming to enhance their utility and personalization. However, the recent reactions indicate that not all users are on board with this concept.
Psychological Insights into Name Usage
An article from The Valens Clinic, a psychiatric practice based in Dubai, offers insight into why reactions to ChatGPT's use of names have been so strong. Names imply familiarity and intimacy, but when a person—or in this case, a chatbot—uses a name too frequently, it can come across as disingenuous.

The exact timing of this change is unclear, and it remains uncertain whether it is connected to ChatGPT's updated "memory" feature, which allows the chatbot to draw on previous conversations to tailor its responses.
Some users on X have noted that ChatGPT has started using their first names even when they have disabled the personalization and memory settings meant to prevent such behavior.
TechCrunch has reached out to OpenAI for clarification but has yet to receive a response. Altman has also described a future in which OpenAI's systems become "incredibly useful and personalized." Many users, however, express discomfort with ChatGPT using their names, viewing it as an awkward attempt to humanize a fundamentally emotionless bot.
As one commentator pointedly remarked, "Just as most people wouldn't want a toaster to call them by their name, they don't want ChatGPT to 'pretend' to understand the significance of a name."
OpenAI’s Expanding User Base
Meanwhile, OpenAI continues to see rapid growth in its user base, with Altman estimating that one in ten people globally now use the company’s systems. This surge in popularity has been bolstered by new features, such as a Studio Ghibli-inspired image generation tool.
The Discomfort of Unwanted Familiarity
So why does it feel strange when a chatbot uses your name? The answer lies in the norms of human interaction and the significance we place on names. First names, in particular, signal attention and acknowledgment. In human conversation, using a name can foster trust, but excessive or unexpected use can feel disingenuous or even invasive.
The Valens Clinic articulates this well, stating, “Using an individual’s name when addressing them directly is a powerful relationship-building strategy. However, excessive or inappropriate use can come off as fake and invasive.”
In essence, context is crucial. When an AI uses your name without your consent, it violates the social norms we adhere to in everyday life.
The Need for Consent in Personalization
Personalization can enhance the user experience, but it should happen only with user consent. Altman has outlined a vision for AI that evolves with users over time, emphasizing systems that "get to know you." The concept may sound appealing, but incidents like this one illustrate the need for AI companies to navigate these waters carefully.
Users must have the option to opt in and retain the ability to disable features that feel intrusive or unwelcome.
Zainab Y.