July 6, 2024
New Study Highlights ChatGPT's Superiority in Providing Personal Advice Compared to Professional Columnists

In a recent study published in the journal Frontiers in Psychology, researchers explored whether ChatGPT, a chatbot powered by a large language model, could provide effective social advice. The findings revealed that later versions of ChatGPT surpassed professional columnists in delivering personal advice.

Within two months of its public release in November 2022, ChatGPT garnered an astounding 100 million monthly active users. The chatbot is powered by one of the largest language models ever created, with the advanced version, GPT-4, estimated to possess a staggering 1.76 trillion parameters. Its capabilities have revolutionized the AI industry.

ChatGPT has been trained on vast amounts of text data, including text scraped from the internet, enabling it to offer advice on a wide range of topics. From law and medicine to history, geography, economics, and more, users can seek answers and guidance from the chatbot, though it is always advisable to fact-check the information provided. ChatGPT can even generate computer code and provide instructions on car maintenance, such as changing brake fluid.

The versatility and conversational style displayed by ChatGPT have astounded both users and AI experts. Consequently, many individuals have turned to the chatbot for personal advice.

Providing personal advice requires a considerable level of empathy or, at the very least, creating an impression of empathy. Research has shown that recipients who feel unheard are less likely to accept the advice given to them and may even feel disconnected or undervalued. Put simply, advice lacking empathy is unlikely to be helpful.

Additionally, personal dilemmas often lack a right answer, necessitating the advisor to demonstrate sound judgment. In such cases, compassion may hold more significance than being factually correct.

However, ChatGPT was not explicitly trained to exhibit empathy, ethical reasoning, or sound judgment. Its training focused on predicting the most probable next word in a sentence. This raises the question of how ChatGPT can make users feel heard.
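To make the training objective mentioned above concrete, here is a toy sketch of next-word prediction. The tiny bigram table is a hypothetical stand-in for a real neural network: given the current word, the model scores candidate next words and greedily picks the most probable one. Real models like GPT-4 do the same thing at vastly larger scale, with learned probabilities rather than a hand-written table.

```python
# Hypothetical toy model: probabilities of the next word given the current word.
# A real language model learns these scores from training data.
bigram_probs = {
    "i":    {"feel": 0.6, "think": 0.4},
    "feel": {"unheard": 0.7, "fine": 0.3},
}

def predict_next(word):
    """Return the most probable next word under the toy model, or None."""
    candidates = bigram_probs.get(word, {})
    return max(candidates, key=candidates.get) if candidates else None

def generate(start, max_words=5):
    """Greedily extend a sentence one most-probable word at a time."""
    words = [start]
    while len(words) < max_words:
        nxt = predict_next(words[-1])
        if nxt is None:
            break
        words.append(nxt)
    return " ".join(words)

print(generate("i"))  # prints "i feel unheard"
```

Nothing in this objective rewards empathy directly; an empathetic-sounding reply emerges only if such replies are the statistically likely continuation of the conversation.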

An earlier iteration of ChatGPT, the GPT-3.5 Turbo model, struggled when providing social advice. The problem was not a lack of understanding of the user's situation; in fact, it often exhibited a better grasp of the context than the users themselves. The issue lay in its failure to adequately address the emotional needs of the users. Like Lucy in the Peanuts comic, it was overly eager to offer advice without attending to the user's emotions, resulting in poor ratings from users.

However, the latest version of ChatGPT, powered by GPT-4, has introduced a feature allowing users to request multiple responses to the same question. Users can then indicate their preferred response, providing valuable feedback to the model. This feedback mechanism has proved instrumental in teaching ChatGPT to generate socially appropriate and empathetic responses, significantly improving its performance as an advice provider.
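The feedback loop described above can be sketched in a few lines. The function and field names below are illustrative, not OpenAI's actual API: the model produces several candidate replies, the user indicates a preference, and the resulting (prompt, chosen, rejected) records become training signal for teaching the model which style of response people prefer.

```python
def collect_preference(prompt, candidates, user_choice_index):
    """Package the user's pick among candidate replies as a preference record
    (the kind of data used to fine-tune models toward preferred responses)."""
    chosen = candidates[user_choice_index]
    rejected = [c for i, c in enumerate(candidates) if i != user_choice_index]
    return {"prompt": prompt, "chosen": chosen, "rejected": rejected}

# Hypothetical candidate replies to the same question:
candidates = [
    "Here is what you should do: ...",         # advice-first reply
    "That sounds really hard. I'm sorry ...",  # empathy-first reply
]
record = collect_preference("My friend stopped talking to me.", candidates, 1)
print(record["chosen"])  # the empathy-first reply the user preferred
```

Aggregated over many users, records like this reward empathetic replies over purely informational ones, which is consistent with the improvement the study observed.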

The study’s results highlight the impressive evolution of ChatGPT in the field of personal advice. Its enhanced ability to respond to users’ emotions and deliver empathetic guidance puts it ahead of professional columnists in certain scenarios. Nevertheless, it is essential for users to exercise critical thinking and verify the information received, as AI models are not infallible.

As ChatGPT continues to evolve, it has the potential to become a trusted source of personal advice, offering valuable insights and perspectives to its ever-growing user base.

*Note:
1. Source: Coherent Market Insights, Public sources, Desk research
2. We have leveraged AI tools to mine information and compile it