https://arab.news/5qkwz
- Psychologists say ChatGPT is increasingly a substitute for real conversations, deepening emotional dependence and eroding relationships
- By mid-2025, Pakistan ranked among top 20 countries for ChatGPT traffic, with thousands using it daily to vent feelings, manage anxiety
LAHORE: When Mehak Rashid looks back on a restless, emotionally fragile phase of her life earlier this year, an unlikely confidant comes to mind.
"When nobody else was listening to you and everybody else thought you were crazy, ChatGPT was there," Rashid, a metallurgy and materials engineer from Lahore, told Arab News.
"I just wanted to be heard… It will not give you a judgment and that's so beautiful."
Rashid began using the chatbot after noticing her children experimenting with it for schoolwork. Now, she often turns to it for "answers" and "different perspectives."
"It helps me in every way," she said.
Mehak Rashid, an engineer, is using ChatGPT on her mobile in Lahore, Pakistan, on May 26, 2025. (AN photo)
Since its launch in November 2022, ChatGPT has attracted hundreds of millions of users and, by mid-2025, logged nearly 800 million weekly active users. Many in Pakistan, among the top 20 countries for ChatGPT traffic, use it daily for emotional support, venting feelings, or late-night reassurance when friends aren't available.
Globally, an estimated 40 percent of ChatGPT conversations relate to mental well-being, and a Sentio University survey found nearly half of users with ongoing mental health issues rely on it for support: 73 percent for anxiety, 63 percent for advice, and 60 percent for help with depression.
While this instant comfort helps some cope, psychologists warn that heavy reliance on AI can weaken real human connections and deepen social isolation in a country already short on mental health resources.
A March 2025 study by OpenAI and MIT found frequent users reported increased dependence and loneliness, suggesting that AI companionship can erode human bonds and intensify feelings of isolation rather than resolve them.
Mehak Rashid, an engineer, is using her mobile phone in Lahore, Pakistan, on May 26, 2025. (AN photo)
For Lahore-based designer Khizer Iftikhar, ChatGPT began as a professional aid but gradually crept into his personal life and started affecting his relationships, especially with his wife.
"I have a very avoidant attachment style," he said. "Instead of confronting someone, I can just talk about the good part with people and let the chatbots handle the negative part."
Iftikhar described ChatGPT as "a multiple personality tool" that lacked the balance of real human interaction.
Many experts say relying on AI models can weaken bonds over time, reduce empathy, and make people more emotionally self-contained, preferring the predictable reassurance of a machine over the give-and-take of genuine human connection.
"With humans, relationships are about give and take. With chatbots, it's not like that," Iftikhar said.
Lahore-based designer Khizer Iftikhar talks to Arab News Pakistan in Lahore, Pakistan, on May 26, 2025. (AN photo)
Despite once trying therapy, he now uses ChatGPT to process emotions and trusts people only for practical advice.
"I would trust a chatbot more when it comes to the feelings part," Iftikhar said. "But when it comes to the work part, I will trust humans more."
In Islamabad, 26-year-old Tehreem Ahmed initially used ChatGPT for office transcriptions and calorie tracking, but it eventually became an emotional lifeline.
One night, overwhelmed by troubling news and unable to reach friends, she turned to the chatbot.
"It was around 3am and none of my friends were awake," she said. "So, I went on ChatGPT and I typed in all that I got."
Tehreem Ahmed is seen using ChatGPT at a cafe in Islamabad, Pakistan, on May 16, 2025. (AN photo)
The chatbot encouraged her to pause and reflect before reacting.
"I feel like it responded well because I gave it a smarter prompt… Had I just said, 'Hey, this has happened. What should I do?' I guess it would have just given me all the options… I could have self-sabotaged."
While Ahmed doesn't fully trust the bot, she said she preferred it to people who might dismiss her feelings.
"If I know my friend is not going to validate me, I'd rather go to the bot first."
"DETERIORATING HUMAN CONNECTIONS"
For one anonymous Lahore-based tech professional, ChatGPT quickly shifted from a practical helper to an emotional crutch during a difficult relationship and the ongoing war in Gaza.
She first used it in late 2023 to navigate a job change, edit CVs, and prepare for assessments. But emotional upheaval deepened her reliance on the bot.
"That [romantic] relationship didn't progress," she said. "And the platform helped me a lot emotionally in navigating it."
Her sessions became so layered and spiritual that some ended in "prostration from spiritual overwhelm."
Still, she was careful not to project too much onto the tool:
"It's a mirror of my flawed self… I try not to let the tool simply reflect my ego."
Psychologists caution that without the challenges and messiness of real interactions, people using chatbots may lose vital social skills and drift further into isolation.
Mahnoor Khan, who runs MSK Clinics in Islamabad, agreed, saying the search for emotional safety in AI was becoming increasingly common as people feared judgment from others.
"Over a period of time, human connections have deteriorated," the psychologist said. "When people share something vulnerable with a friend, they often feel judged or lectured."
Clinical psychologist Mahnoor Khan, who runs MSK Clinics in Islamabad, talks to one of her clients in Islamabad, Pakistan, on May 26, 2025. (AN photo)
To avoid that, many turn to chatbots. But Khan warned that AI's constant affirmation could have unintended consequences.
"It will tell you what you want to listen to… If you're happy, it's your companion; if you're sad, it instantly talks to you. The downside is that you are getting away from socialization."
The trend is especially troubling in a country where mental health care remains deeply under-resourced: Pakistan has fewer than 500 psychiatrists for a population of over 240 million, according to WHO estimates.
No wonder, then, that even people with clinical mental health issues are turning to AI.
Khan recalled the case of a young woman who used ChatGPT so often that it replaced nearly all her social interaction.
"She had a lot of suicidal ideations," Khan said. "She kept feeding ChatGPT: 'I feel very depressed today… you tell me what I should do?' ChatGPT kept telling her to avoid friends like that."
Eventually, she cut everyone off.
One day, she asked the chatbot what would happen if she overdosed on phenyl.
"ChatGPT said, 'There are no consequences. In case you overdose yourself, you might get paralyzed,'" Khan recalled.
The girl only read the first half and attempted suicide.
She survived.