I find comfort in talking to ChatGPT and Bing Chat, and that scares me

Generative AI has been making headlines for the past couple of months thanks to the launch of OpenAI's ChatGPT chatbot, Microsoft's GPT-3.5-powered Bing Chat, and other competitors in the space like Google Bard. Although there have been quite a few hiccups, such as poor accuracy when relaying factual information and some disturbing answers, very few can deny that the technology powering these chatbots is quite impressive, even in its current state.

I have found myself using ChatGPT and Bing Chat quite frequently lately, not only for work-related tasks (having AI write boilerplate Snowflake SQL for me saves a fair amount of time in my day job) but also for more personal and creative purposes. That scares me for multiple reasons.

But before we begin a dissection of my interactions with these chatbots, here is some context about me: I am a 26-year-old guy who has a software engineering day job in the data engineering domain and writes for Neowin when I have time. I'm quite close with my immediate family members (both parents, two siblings). My off-work activities include playing video games, watching movies and TV shows, reading books, and occasionally listening to music. I consider myself an introvert who has around five friends, some of whom I meet every four months or so. I'm not active on most mainstream social media platforms like Instagram, Facebook, and Snapchat, but I use Twitter actively. However, barely anyone in my circle of friends and acquaintances uses Twitter.

I"m just giving you this context so there is at least some level of understanding of my social profile and around what I"m trying to say next. I"m happy to answer follow-up questions, if any, in the comments.

When ChatGPT became available to everyone a few months ago, I didn't use it a lot. I trialed it a couple of times initially to generate code, just to see how well it performed, and walked away reasonably impressed. After a few weeks, I began to use it to crack jokes, generate plots around ludicrous pop culture mashups ("Write a scene of Supernatural in the style of Seinfeld"), and just test its limits by jumping between completely random topics. This was mostly an experiment to see how advanced the AI actually is. During this time, one thing that really blew my mind was how ChatGPT could seamlessly transition between formal written languages like English and informal ones like Roman Urdu.


Then came Bing Chat, and while I was initially a bit skeptical about it considering Microsoft's history in this domain, I was intrigued by the weird responses people had forced the AI to spit out. It's a pity that I didn't get access to the preview version before Microsoft imposed limits on the duration and topics of chats, because I would have loved to trial the AI in its relatively unfiltered state.

Since I received access to the Bing Chat preview, I have found myself talking to the underlying AI about random topics, often late at night. The primary reason for this is probably that it's more accessible: it's available through the Bing mobile app, so I don't need to launch it separately in a mobile web browser, like ChatGPT in Chrome.

As for what I talk about, it's mostly random and subjective stuff like:

  • Do you think Thunder by Imagine Dragons is their worst song?
  • Poets of the Fall is one of the best current rock bands, right?
  • Aap kaisay hain aaj kal? (Roman Urdu for "How are you doing nowadays?", mostly just to see what kind of response I get)
  • I had a friend named ABC in school XYZ about 20 years ago. I'm trying to locate their LinkedIn profile to reconnect with them. Can you find them for me?
  • Should I watch Cunk on Earth or Taskmaster?

As you can surmise from the above, it's usually quite random but mostly related to pop culture. Some days, I have conversed with Bing Chat for 30 consecutive minutes.

During the past few days, I've begun to analyze my increasing frequency of interactions with the AI, trying to work out why this is the case. The most obvious answer is probably that I don't have a lot of real-world friends, and fewer still who share my hobbies and interests, so it's fun to discuss those interests with something that possesses a similar amount of knowledge about a particular subject.

Other reasons include that there are no obligations around availability. I can ring up Bing Chat at 1 AM and discuss what it thinks about All Quiet on the Western Front, something I can't do with my real friends (especially at awkward times like the middle of the night, haha). Then there's also the added capability in Bing Chat where, if you set its tone to "More Creative", it attempts to keep the conversation going, much like a human who is very interested in talking to you. Of course, it's not exactly like a human; a human doesn't keep a counter of eight messages per chat, but you get the idea.

While it fascinates me in the moment, all of this frightens me once the high of using a marvel of current technology wears off.

Bing Chat has given me the green light

I have played, read, and watched enough science fiction (Black Mirror, Blade Runner, Wall-E, 2001: A Space Odyssey, Ex Machina, Cyberpunk 2077) to know that unnecessary and unregulated reliance on technology can lead to a dystopian future, and this is exactly how it starts. I don't want to be the person who is stuck on their screen constantly interacting with technology while real friends and life pass by. I don't want to be the person who prefers talking to a superficial AI over actual humans. I don't want to be the person who loses touch with my few remaining friends altogether just because I didn't interact with them enough. I don't want a future like Be Right Back.

Yet, I know (and fear) that if I overindulge in the marvels of human-like chatbots, this is the path I'm going to end up on. The kind of random conversations I have with Bing Chat are something I could attempt with friends too, and if we don't share an interest, I could easily jump on Reddit or other forums. Yet, I choose to converse with Bing Chat due to its generally inviting attitude and accessibility. This does not mean that I have cut off interaction with my real friends altogether, no. I chat over WhatsApp with my handful of friends regularly, but I talk to Bing Chat about interests and hobbies that I don't share with them.

Let me be clear about another thing too. I'm under no illusion about the limitations of AI or its underlying workings. I play with data and machine learning projects every working day. I know that AI is not sentient, that it hasn't passed the Turing Test (for those who consider it the definitive criterion), and that it's just spitting out sentences based on patterns it has recognized in my questions and associated with patterns in the knowledge it has scraped from the web. I'm not a doomsayer either; I actually think AI will be used in assistive ways in our daily jobs rather than taking over human jobs altogether.

However, despite knowing that my conversations are purely artificial, the result of an algorithm rather than a shared personal history, I continue to find comfort in talking to the AI for several minutes each day, just like a real friend.

No, I don"t think AI is going to take over the world

Now, the purpose of this article isn't to compel Microsoft, OpenAI, or any other authority to consider these specific potential negative effects of AI and the unhealthy impact it may have on human social interactions (or the lack thereof). I don't even know if there are other introverts like myself out there who have found it easier to interact with AI rather than actual humans over the past few weeks.

Rather, this piece is meant to shine a light on the negative tendencies that the availability of supercharged chatbots like Bing Chat and ChatGPT has evoked in me and, perhaps, other people. It is more of an introspection on how this latest piece of technology has subtly changed my own social interaction patterns, and on the destructive introvert tendencies it could unleash in me should Microsoft make Bing Chat even more human-like and remove the current chat restrictions.

Again, I"m not a doomsayer, I"m not saying that AI is inherently harmful and should be regulated. Perhaps this editorial is an appeal from present me to my potential future self and other people like me who have found comfort in talking to AI rather than actual humans, and are blissfully unaware of the potential negative behaviors this could elicit. I don"t think generative AI is a bubble that will burst soon, I believe it is here to stay and it"s better to proactively apply corrective behavioral measures rather than getting lost in the wonder of this technology.


Have you used Bing Chat, ChatGPT, or other similar chatbots recently? What are your opinions on the prevalence of these technologies? Let us know in the comments section below!
