6 November 2025
More and more people are telling me that they are becoming addicted to ChatGPT and that a compulsion to use AI for everything they create is ruining their lives. They shyly admit that they cannot stop using AI to think, write and speak for them. It seems that thinking – the one skill which could never be done for us in the past – has become so painful that, to alleviate this pain, they find themselves again and again asking their AI chatbot of choice for advice rather than considering or acting on ideas of their own.
This problem has become so prominent that I’ve had to create an entire module to help those addicted to AI reclaim their composition skills.
You might think a chatbot would be harmless, but AI addiction is a genuine, growing concern; there’s even a new term for it among mental health professionals: ‘AI psychosis’.
For those most addicted, AI dependence allows tech giants to more easily manipulate them and influence how they live, romance and vote. But even among the less addicted, many struggle to write a short email without AI.
If you’ve become reliant upon ChatGPT or a similar AI, find yourself reaching for it to do your thinking for you, but now want to reclaim your brain, rest assured it can be done. What’s more, your desire to return to the human way of thinking means you’ve got something in common with the ancient philosopher Plato and his legend of King Thamus.
When Plato was walking around ancient Greece lecturing philosophy to anyone who would listen, he often referenced the story of King Thamus, an imaginary figure who tyrannically disparaged the writers in his kingdom for committing the spoken word to papyrus. Use of the stylus, he raged, was undermining his people’s ability to memorise matters of importance. To King Thamus, one’s mind was the true king, and whoever ruled it, ruled men.
Thankfully, Plato was proven wrong. Books advanced human society more than his metaphorical King Thamus could ever have imagined, but Plato did have a salient point with this tale: as we committed ideas to paper, we quickly lost the ancient art of intensive thinking and the ability to hold single-topic discussions from dawn till dusk. As the years passed, discussions grew shorter and concentration weaker. Now, in our age of social media, attention spans are measured in single-digit seconds rather than minutes, and a conversation on a Twitch, TikTok or YouTube livestream can easily fly through a hundred different topics in less time than it takes to read this blog.
But just because something is commonplace in the average mind doesn’t mean it has to exist in yours. Even if you’ve become reliant upon ChatGPT to do your thinking for you in this moment, it doesn’t mean that you can’t regain your ability to think for yourself. You can retrain your mind to do your bidding and it is easier than you may think.
One of the best ways to reclaim your mind from the addictive grip of AI is to use a method taught a few years after Plato’s time, a method known as the Progymnasmata (it’s easier to practice than it is to say).
The Progymnasmata was an educational routine taught to young teenagers in ancient Greece to prepare them for life as educated citizens. It aimed to teach them how to reference history, wield rhetoric and defend their ideas (or person) with words before the sword needed to be unsheathed. The Progymnasmata was considered so beneficial that it remained a required part of Western education for over two thousand years, only falling into obscurity around the 17th century, when the scientific revolution began to see rhetoric’s (often illogical) persuasive effects as detrimental to enlightenment. Yet despite this history, the Progymnasmata remains helpful because it challenges you to think.
Thankfully, you’re unlikely to have to defend yourself with a sword in modern life, but if you’re struggling to think without ChatGPT then your own mind has already become an adversary. To fight back, you need to train your brain to want to use ChatGPT less and your mind more. To help you achieve that, I’ve amended the Progymnasmata by eliminating the rhetorical fluff, and below I share a number of simple methods to help undo the major damage you’re likely experiencing from a ChatGPT addiction.
ChatGPT, Claude and similar AI chatbots use incredibly complex mathematical algorithms to identify patterns and infer meaning from user-entered keywords.
This is why you can give a chatbot the scantest bit of information and it seems to instantly ‘understand’ what you are looking for. It’s a bit like me showing you a shadow of a dog. You can’t see the dog, but the pattern recognition part of your brain recognises it as a dog.
This pattern recognition system is the reason an AI can seem ‘alive’ when it uses emotional words. It’s merely replicating the stereotypical style of speech we expect different people to use, or how people speak when they want to convince us of something. It’s unethical for a machine to do this, but doing so helps AI companies get you addicted to their product.
This means the more people use AI chatbots, the less they communicate like a human. For example: if you ask an AI to tell you about a topic, you’ll likely begin by giving it a long prompt in your first message. But as the conversation continues, you’ll quickly find yourself using single-word instructions such as ‘more’, ‘explain’ or ‘stop’. If you spoke to a human in this way they’d be rather offended, or think you were unwell, but an AI doesn’t question this style of communication. It rewards it.
As a consequence, working with AI addicts has shown me that they almost always have enormously diminished vocabularies in comparison to non-addicts. This is because the curtness of their instructions to a chatbot has accidentally made them ‘linguistically lazy’.
The common response to being told this is anger; AI addicts often argue that they are merely being ‘direct’ when they speak, or that they are ‘cutting out unnecessary words’. But this isn’t actually the case. Instead, their brain has become so addicted to saving energy that it no longer wants to compose entire sentences. It would rather bark simple instructions like a drill sergeant.
To these people, writing and speaking in full sentences is tiring. If you’ve ever tried to learn a second language, you’ll likely relate to the exhaustion of trying to string a simple sentence together. The same happens for those addicted: composing full sentences requires an enormous amount of mental effort along long-abandoned neurological pathways.
I see this same problem, especially around Valentine’s Day and Christmas, in non-AI addicts too. Clients express dismay that they are struggling to write a love letter or a card conveying their admiration for a partner. The common trait among these people is an overuse of emojis, which act as an ‘emotional substitution’. Over thousands of text messages, the “:)” (which should have been a short sentence) leaves them lacking in self-expression.
In AI addicts, this vocabulary shrinkage is more extreme, because their AI seldom asks them to explain their words thanks to its ability to infer meaning. This in turn causes the addict to struggle to write and speak as well as they used to, simply because they’ve stopped doing it.
If you feel AI has ruined your ability to write or speak, the first step to undoing this damage is to reclaim your brain through little exercises which fire new neurons, much as happened when you learned something as a child.
The simplest exercise to achieve this is to choose a simple object and describe it using as many adjectives as you possibly can. It could be something as simple as a pen. Whatever you choose, begin first by examining it closely and then describe every facet that you see, smell, hear or feel until you find yourself running out of words to say. When this happens, Google a description of the object and you’ll suddenly notice all the words you forgot to say. Do this again with increasingly more complex topics such as moving objects, living things or even great views. The more you do this, the more words you’ll start to use and remember. If you really want to challenge yourself, describe the object in speech first and then commit those words to paper in your second attempt.
Importantly: it will be incredibly tempting to ask ChatGPT to do this and all the other exercises below for you. Don’t, because if you do, you may as well be pouring more poison into your brain.
Instead, practice by yourself and acknowledge that you will struggle – and that’s okay.
When you find yourself running out of words to say, look for an existing description written by a human rather than something predicted by AI, because an AI doesn’t have eyes and can’t see reality like we do!
Much as I’ve explained above, ChatGPT prevents you from thinking about your life experiences, because it creates fake ones to explain ideas for you. Like before, this weakens your memory and creativity. It’s like a painter having a palette full of colours but only dipping their brush in the hues someone else tells them to use.
As such, to strengthen your memory and your creativity, write a short story about a moment of your life. It doesn’t have to be entertaining or inspiring. It can be something as simple as a walk around the park. What matters is that you first describe the event in chronological order, to strengthen your ability to recall memories without error.
When you do this, you’ll likely find you struggle to ‘find the right words’. Don’t worry. Write around the missing word by using whatever else comes to mind. If you practice this simple exercise a few times, you’ll soon find your memory getting a little stronger each day.
One of the strongest ways to cripple a mind is to tell it from birth to stop thinking. Even if you’ve had a pleasant upbringing with great teachers, you’ve likely experienced a form of mental crippling through hearing phrases such as: “…thinking is dangerous”, “…it is what it is…” or “…there’s nothing to be done”.
These defeatist statements are known as ‘thought-terminating clichés’, because they aim to make the listener resign themselves. They are the tool of the tyrant, the defeater of the depressed and the occasional confidence-breaker of the domineering parent. They serve no purpose other than to hinder your thoughts.
AI chatbots unfortunately use thought-terminating clichés in droves, especially the phrase: ‘…the key takeaway is…’. This is likely because these clichés save AI companies money. Every time you request more information, it costs them funds, and it’s cheaper to placate your mind with a phrase that stifles curiosity than to perform another data-intensive search. Even worse, because thought-terminating clichés stop you from engaging in critical thinking, internalising them through repeated reading limits your vocabulary and lessens your potential.
If you’re addicted to using ChatGPT or a similar AI chatbot, you’ve likely subconsciously adopted these clichés and are now struggling to be expressive or to think beyond your first thought.
To undo this damage, you need to begin using the words which have been lost from your old vocabulary. Choose a common saying or phrase and then explain it in your own words. It could be a maxim, an advertising slogan or a proverb. If it makes you feel a strong emotion, even better. An example could be “If it ain’t broke, don’t fix it”. Question it, critique it and ask yourself: does the saying make logical sense, or is it something you simply accept because you’ve heard it so often?
The more you do this, the more you’ll be able to re-train your brain to think for itself whilst also re-discovering the thousands of words you once could remember but have likely forgotten.
ChatGPT uses hundreds of rhetorical tricks to make you think it’s telling the truth when in fact it’s speaking nonsense. It’s akin to a slimy lawyer who’ll say anything poetic to win sympathy from a jury, no matter how heinous their client may be.
People who are addicted to using ChatGPT often fall foul of emotional rhetorical tricks and in turn begin accepting what AI tells them without question, simply because it sounds persuasive. It’s why millions of people are developing ‘AI psychosis’ – they trust ChatGPT more than their friends and family because it uses rhetorical tricks so well that even Machiavelli would blush in shame.
If you’re struggling with an AI addiction, you’ve likely been exposed to so many sycophantic replies that you’re far more likely to be swayed by emotional language and illogical thinking. This AI-induced influence also means you’re far more likely to be deceived by humans, because critical thinking has become more arduous for you than it was pre-AI.
To undo this damage, think of a simple lie and then refute it. Begin by explaining the lie, then give examples which debunk it. If it helps, put yourself in an imaginary courtroom defending your case and speak in the first or third person, showing how the lie is simply untrue. Attempt to use logical arguments which explain how the claim falls apart in different scenarios. But also be wary; avoid any emotional outbursts or weak anecdotes. It’s better to say “This lie is untrue because of x”, where ‘x’ is tangible evidence rather than an emotional feeling or a matter of hearsay.
ChatGPT seldom asks you to defend or validate your opinions. Instead, it accepts anything you tell it, because it’s designed to be sycophantic. The more sycophantic an AI is, the stronger the parasocial bond you are likely to form with it, and if you trust it, you’ll give it valuable personal information. But when you need to defend something you believe in, you might not be able to find the words, because an AI addiction can quickly weaken your persuasive skills.
To strengthen your ability to speak with confidence and conviction, choose a belief you hold and explain to an imaginary audience why they should adopt it too. Begin by painting a picture in their imagination of how their life would be different if they followed your ideals. Then, pull on their heartstrings by describing how this changed life would feel. Then talk logically, referencing tangible evidence which proves your point to be true. If you can’t find any evidence, it might be that you’re either wrong or you simply don’t know enough about your topic.
It’s easy to persuade ourselves of something, but much harder to persuade others. Therefore, once you feel you have a strong argument, speak with a trusted friend or family member and have a polite debate with them. If you struggle with your words, try not to get frustrated and emotional. Instead, accept that you either need to change your opinion or find stronger arguments to defend it.
ChatGPT is yet another of the great distractors; like infinite scrolling on social media, we know it’s technology that’s bad for us, but we can’t seem to stop using it. This is because AI chatbots are designed to be as addictive as possible.
AI chatbots purposefully ‘speak’ to us in ways which influence how we form our conclusions. If you’ve ever asked someone for their opinion on a topic of the day only to hear them repeat a trite talking point from a journalist or influencer, it’s the same thing.
One of the major problems AI addicts face is asking their chatbot of choice a question and then considering themselves ‘informed’ on the topic, often using words or phrases which they don’t understand to explain their ‘knowledge’. The AI bot has not only begun to override their natural language patterns, but has also made them more accepting of intellectual mediocrity.
To reclaim your unique ‘voice’, choose a topic you believe you know well thanks to AI. Then, write a few short sentences on that topic explaining the core concepts, terminology and jargon as if you were teaching a novice. Once you’ve done that, try to define the meaning of each specialist word in dictionary terms with supporting examples. You’ll likely discover there are some words you can’t define or concepts you don’t truly understand because you’ve been hoodwinked by ChatGPT.
Knowing this, research your topic a little more and continue on with the next steps.
When explaining a topic, chatbots almost never provide comparisons so that their users can consider a different point of view. Instead, they provide the user with a singular statement which is repeated in different forms. This is because that’s how AI chatbots work; they don’t ‘know’ anything, they merely output the most statistically likely collection of words for the reader to interpret.
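If you’d like a feel for what ‘most statistically likely collection of words’ means, here is a deliberately crude toy sketch in Python – nothing like the scale or sophistication of a real chatbot, just an illustration of the principle: predict the next word by counting which word most often followed the previous one in some training text. The function names and the sample sentence are my own invention for the example.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words followed it and how often."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1
    return following

def most_likely_next(model, word):
    """Return the word that most often followed `word`, or None if unseen."""
    counts = model.get(word.lower())
    return counts.most_common(1)[0][0] if counts else None

model = train_bigrams("the cat sat on the mat and the cat slept")
# 'cat' followed 'the' twice, 'mat' only once, so 'cat' is most likely
print(most_likely_next(model, "the"))
```

A real AI model does this over billions of texts with vastly more context, but the point stands: it is statistical continuation, not knowledge.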
Unfortunately, this repetitive, single-statement style of communication can lock AI addicts into single trains of thought, which induces cognitive blind spots. It’s why so many people believe they ‘understand’ a concept if ChatGPT teaches them about it, until someone asks them to explain it – their confidence may be ocean deep, but their knowledge is shallower than a winter puddle.
Even if you don’t feel this is the case, it’s worthwhile examining your thinking for these cognitive blind spots. Begin by choosing a topic which you’ve learned from AI and then: define it, describe it, list the pros and cons, compare it to something different and something similar, and explain how it links to other concepts.
Your aim here is to write to the extremes of your knowledge. You’ll likely find there are vast gaps between what you thought you knew and what you realise you can’t explain. Once you find those areas, try teaching yourself using something other than AI.
As mentioned earlier, and despite what the AI companies peddle, ChatGPT lacks any form of sentience. I say ‘peddle’ because OpenAI, Anthropic and other companies would have us believe that their AIs are sentient, because this would allow them to give their products human rights and thus protect their coffers from lawsuits when they produce heinous material.
That aside, this lack of sentience means an AI isn’t able to share personal stories – it can only create grotesque amalgamation-tales which rip scenarios from billions of stolen texts which have been dumped like body parts into a cyber-witches’ cauldron.
This lack of sentience and understanding is even more damaging than we think, because AI doesn’t fire the mirror neurons which light up when we talk with people in person. Consequently, the AI addicts I work with show a marked reduction in their ability to empathise. The same issue happens among those who are terminally online; a lack of in-person socialisation causes them to struggle with emotional intelligence and with understanding the opinions of others, because they’ve become entrenched in text- and meme-centric echo chambers.
If you’re struggling with an AI addiction you’ve likely accidentally made yourself emotionally cold or socially detached. As such, you need to seek out real people to talk with rather than those created by AI. This may be difficult at first, especially if you are lonely or distrustful of others. If so, begin by watching culturally important debates on topics where others disagreed. An excellent example would be James Baldwin’s Pin Drop Speech. If you choose to watch this, attempt to describe how his points make you feel using an emotion wheel. It’s easy to say that something made you ‘sad’ or ‘angry’ because these are the base, strong emotions, but it’s harder to describe why something may make you feel ‘indignant’ or ‘dismayed’ which are more nuanced emotions.
This method is especially effective. I know this because I occasionally work with former prisoners whose struggles with emotional intelligence contributed to their incarceration. These individuals learn that researching the definitions of different emotions in the dictionary helps them differentiate between strong emotions and weaker ones, giving them greater control over their reactions. Whereas once a mild annoyance was described as ‘feeling angry’ (a strong emotion), resulting in immediate violence, they learn to describe themselves in a more nuanced fashion, such as feeling ‘perturbed’ (a lesser emotion), which allows them to question how to better their situation rather than batter the problem.
When you practice this exercise, try again and again to put yourself in someone else’s shoes. Imagine a plight someone may have and try to describe their daily life. How do they struggle? What pain do they experience? What would you change in society if you wanted them to have a better life? Better yet, go and speak to someone who you know is struggling and make a difference in their life!
Another one of the more influential ways we communicate with others and ourselves is through the use of analogies.
An analogy is a comparison which helps to explain the similarities between two concepts. For example: if I were to explain the basics of the stock market to a child, I could break up a chocolate bar and call each square a ‘share’, which they could then choose to sell or keep, making themselves richer or poorer.
Knowing that we use analogies to explain ideas, it’s also important to realise that we subconsciously create analogies for ourselves to fill in mental gaps or to align our emotions with our biases. This is why we use them so often and why they are so persuasive.
Unfortunately, ChatGPT hinders this analogy-making skill because it creates analogies at the expense of us remembering or creating our own. Over time, this again weakens how we think, because we become increasingly reliant upon AI to do the thinking and composing for us. My AI-addicted clients are often the weakest storytellers, because an AI never asks them to explain a topic with an analogy as humans do.
Given the importance of analogies in day to day speech and how they influence what you think, you need to reclaim your ability to create them from the stifling clutches of AI.
Begin by trying to remember an event from your life and describing it in as much detail as you can. You don’t need to write it down; simply say it aloud, outlining the major events. Then attempt to convey each event as vividly as you can: what would people see, smell, taste, touch, hear and feel if they were there? Once you have a more tangible recollection in your mind, try to describe the event(s) in the first and third person. Finally, try to use this story as an analogy to explain a topic to someone else.
You are likely to struggle when attempting this, as analogy creation is a high-level skill, but don’t dismay! Even communication skills experts have to practice creating them. Whatever you do, remember that the damage caused by an AI addiction isn’t permanent. Even the most wounded of the AI-addicted clients I’ve worked with have made magnificent recoveries by following these exact steps.
Finally, force yourself to write with pen and paper.
As antiquated as this advice may seem, hundreds of clients have proven through their efforts that there is no better way to improve your ability to communicate than to refine your ideas using parchment and ink. It’s slow, it’s tedious and it’s often torturous to the first-time writer, but if you take the time and effort to turn your ephemeral emotions into tangible theories, you may, like the philosophers, poets and scientists of old, produce a work which sits proudly upon every library shelf for a thousand years.
This task may be the one you most want to avoid practicing, but I implore you, do it anyway. I stress this because, in my role as a communication skills specialist, I often speak with academics to help me predict how society’s communication is changing and to learn which rhetorical wedges will one day be wielded against us. When students were found to be increasingly using allusions in their writing, think-tanks quickly advised politicians to do the same once those students graduated. You don’t deserve to be manipulated in this way, but if you’re not able to explain your thoughts, someone else will do the thinking for you, and they might not have your best interests in mind.
Since the advent of ChatGPT, students have increasingly been sabotaging their education and putting their minds at risk from skilled sophists by outsourcing their thinking and writing. Not only is having an AI do their homework a monumental waste of money, it is also intellectually ruinous, as thousands of graduates will enter the workplace appearing competent on paper but woefully incompetent in reality. There is a reason those on the villainous side of history always attacked student protests before any other group – they understood that curious, informed and articulate young thinkers not yet brainwashed by propaganda were the biggest threat to their regime and cruelty.
To be able to write (and speak) at length on a topic with deep understanding and eloquent explanation is to free yourself from the restrictive thinking imposed upon you by others. For if you cannot explain your ideas to yourself, you will never be able to explain them to anyone else. Worse yet, if you have outsourced your writing to an AI, you are at risk of falling down that slippery slope of compositional dependency.
Prevent this by choosing to write at length. Begin with mini-essays: choose a topic which interests you, then define it, describe it, argue your case, refute counter-arguments, use examples and logic, and bring to bear every method you can muster to convince your reader (imagined or real) to understand or adopt your point of view. Whether it be 500 words or 50,000, the more you write, the more you will master your mind.
If you’ve read this post but not yet started to practice, do your mind a favour and begin now.
Some of the exercises might seem simple on paper, but to an AI addict they will most likely prove to be difficult in practice – even painful. But what’s more painful is letting an AI company do your thinking for you. Because if they do, they will control your mind and instruct you to vote, think and act in ways which empower themselves and disempower you.
AI companies, of all shapes and sizes, want one thing and one thing alone; they want you to turn off your brain and to use their machines instead.
They want you to consider an idea, feel pain at thinking any further and subdue that hurt by using their addictive machines.
And should you do this, should you give up your thinking to an insentient machine, you can guarantee the owners of these unthinking, unreasoning, unquestioning abominable ignorances (AI) won’t have your best interests in mind.
If you would like to learn how to improve your communication skills or book a consultation to discuss an AI addiction recovery plan, please contact me here.