ChatGPT made a big splash (and more than a few headlines) when it was released in November 2022, and since then there has been one hell of a lot of speculation as to how large language models, or LLMs, would influence Google. That conversation was brought to the fore, and has become one of the biggest trending digital talking points of 2023, since Microsoft announced that it would be integrating ChatGPT into its Bing search engine.
Will This Be The End of Google?
In a word… NO! Quite apart from the fact that Google has a near-monopoly hold over the global online search market, LLMs are not yet perfect and have a number of problems that would have to be ironed out before any ChatGPT-integrated search engine could seriously threaten Google’s dominance. Add to this the fact that Google also has the technical ability and the finances to ensure that it remains competitive, and it is easy to see that although things are changing (and for the better), it will still be some time before LLMs make their mark.
How Will ChatGPT Affect Google in the Future?
ChatGPT might not give Bing and Google’s other competitors an immediate advantage, but it could very well result in the “unbundling” of online search. And that is where the opportunities for Microsoft and other companies lie, because when ChatGPT is integrated into an actual product itself (and not simply a search engine) it will chip away at the many use cases of Google Search.
The “Unbundling” of Google Search Explained
People use Google to answer queries or solve different problems, and Google provides the answers based on its algorithm parameters. Everything from the best restaurant in a given area to niche online gaming sites such as GambleOnlineAustralia.com is available on Google within milliseconds of the search query being input. ChatGPT (and other LLMs) have been designed to do something similar, except that they can be integrated into specific products or processes to give more precise answers. This can be seen in the way that developers/engineers use GitHub Copilot or OpenAI Codex to help them write code to solve specific problems. All that is needed is a textual description, and the LLM will automatically generate the code for them. This is an example of “unbundling” search, because in the past developers/engineers would simply use Google to search for a coding solution to their problem.
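To make that workflow concrete, here is a sketch of the pattern: the developer writes a plain-English description of what they need, and the assistant fills in the implementation. The function below is illustrative output written by hand for this article, not actual Copilot or Codex output:

```python
# A developer might give an LLM assistant a description like:
#   "Write a function that returns the n-th Fibonacci number."
# The assistant then generates something along these lines,
# saving the developer a trip to Google:

def fibonacci(n: int) -> int:
    """Return the n-th Fibonacci number (0-indexed)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

The point is not the function itself but the interaction: the search step ("how do I compute Fibonacci numbers in Python?") is replaced by a direct description-to-code exchange inside the developer's own tool.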
What are the Current Problems With Search Engine/ChatGPT Integration?
One of the reasons why Google has taken such a conservative approach to ChatGPT integration is that it has already been experimenting with its own proprietary (yet similar) technology for quite some time. It is therefore well aware of the technology's limitations as well as its problems. The most important of these problems with ChatGPT are summarised below:
ChatGPT can “lie”
Although ChatGPT is very good at generating answers that are grammatically cohesive, the information it provides is often factually incorrect. These answers are also difficult to verify, as ChatGPT doesn’t cite its sources.
ChatGPT is expensive to run
All LLMs cost an absolute fortune to run and it has been estimated that a single LLM with a million daily users would cost roughly $100,000 per day to operate.
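Taken at face value, that works out to roughly ten cents per user per day, which sounds small until you scale it to Google-sized traffic. A quick back-of-the-envelope check (the $100,000 figure is the rough estimate quoted above, not an official number):

```python
# Sanity-checking the running-cost estimate quoted above.
estimated_daily_cost = 100_000      # USD per day (rough public estimate)
daily_active_users = 1_000_000

cost_per_user_per_day = estimated_daily_cost / daily_active_users
print(f"${cost_per_user_per_day:.2f} per user per day")
```

A search engine serving the same query from an index costs a tiny fraction of that, which is a big part of why LLM-powered search is a harder business than it first appears.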
ChatGPT is slow compared to modern search engines
A search engine will return millions of results within milliseconds of a search request, while ChatGPT can take several seconds to generate any response. Additionally, all LLMs are slow to incorporate data updates, whereas search engines were specifically developed to take the latest information on the internet into account as they crawl (find) it. A good example is ChatGPT itself, whose training data only runs up to 2021.
Can The Problems With ChatGPT Be Fixed?
Yes, without a doubt the problems mentioned above are fixable, but it will take time, and during that time Google will be formulating its own LLM strategy based on its years of findings. One way these problems could be resolved is by Microsoft utilising its extremely efficient Azure cloud architecture to cut ChatGPT's running costs and speed up its ability to take in and present new information. The “truthfulness” issue is already being addressed by Bing, which will use automated processes to ensure that answers are more factually correct and not hurtful or harmful.
What are your opinions on the integration of ChatGPT into our everyday lives? Please feel free to leave us your thoughts below.