In this blog post, I go through how LLMs glue existing technologies together to revolutionize search with a new paradigm - Conversational Search.
OpenAI (partnered with Microsoft) announced ChatGPT, a chatbot built on top of the GPT-3 language model that uses deep learning to produce human-like text.
Not long after the announcement, ChatGPT got integrated into Bing - Microsoft's web search engine. Currently, only selected individuals are able to test the new technology, but you can check out the interactions shared by the testers, including this one from Linus Tech Tips.
What's amazing about 'searching through ChatGPT' is the chat UX.
When you first enter the website, you might think you're starting a chat, so you expect the engine to be understanding and forgiving of your search queries. You expect it to resolve any ambiguities or unstated assumptions in your query, just as a person would clarify them with context during a conversation.
During your interaction with the engine, the responses you receive are often information-dense, which is much better than just reading a list of search results. And if your initial question was a bit vague, the engine can help break it down into smaller pieces so you don't have to.
Finally, after you're done searching, your search "conversation" is saved in a more accessible format. It's easier to go back and look at a chat history than it is to go back through your browser history.
Large language models like ChatGPT are not truly sentient or smart.
ChatGPT might use language that sounds convincing and corporate, but it doesn't really understand what you're saying. All it's doing is using its vast training data to predict what the next word should be based on the words you've already typed.
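To make 'predicting the next word' concrete, here is a minimal sketch using the small, freely available GPT-2 model from the Hugging Face transformers library as a stand-in. ChatGPT's underlying model is far larger, but the core operation - scoring every candidate next token and picking a likely one - is the same kind of thing:

```python
# Toy version of "predict the next word": score every token in the
# vocabulary given the prompt and print the most likely candidates.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The best way to search the web is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits        # shape: (1, seq_len, vocab_size)

# Turn the scores for the position right after the prompt into probabilities.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)

for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(token_id))!r:>12}  p={prob.item():.3f}")
```

Nothing in that loop checks whether a continuation is true; it only checks whether it is statistically likely.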
Think about it this way: there's a huge difference between your dog recognizing your voice and responding to a command, and ChatGPT (as of early 2023) just producing a bunch of text based on what you typed without actually understanding your intent or context.
This means when you search through ChatGPT, it might seem like it's giving you helpful answers, but it's just spewing out a lot of text that may not actually be true or relevant to what you're looking for. This kind of search experience is not an improvement over traditional search engines; it just looks different.
In addition, ChatGPT (as of early 2023) tends to fall back on 'corporate speech' to sound more convincing and to divert the user away from the original query.
The real magic happens when large language models like ChatGPT are combined with other technologies - Bing + ChatGPT. This experience is shown in the Linus Tech Tips video. Let's call this product 'BingGPT'.
BingGPT brings together existing technologies, like web indexing, reverse image search, and image recognition, behind a user-friendly interaction. Unlike ChatGPT, BingGPT understands your intent and finds the information in a single interaction, rather than making you go through a series of searches as you might on Google.
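Microsoft hasn't published how the glue actually works, so the following is only a hedged sketch of the general pattern - retrieve from an existing web index, then let the LLM answer from the retrieved snippets. Every name below (search_web, ask_llm, conversational_search) is a hypothetical placeholder for illustration, not Bing's real API:

```python
# Sketch of "LLM glued to an existing web index". The two backends are
# deliberately left as stubs; only the control flow is the point here.
from dataclasses import dataclass

@dataclass
class SearchResult:
    title: str
    url: str
    snippet: str

def search_web(query: str) -> list[SearchResult]:
    """Placeholder for a call into an existing web index."""
    raise NotImplementedError("wire this to a real search backend")

def ask_llm(prompt: str) -> str:
    """Placeholder for a call to a large language model."""
    raise NotImplementedError("wire this to a real LLM endpoint")

def conversational_search(user_query: str, history: list[str]) -> str:
    # 1. Let the LLM rewrite the question using the chat history,
    #    resolving ambiguities the way a person would in conversation.
    rewritten = ask_llm(
        "Rewrite this as a self-contained search query.\n"
        f"History: {history}\nQuestion: {user_query}"
    )
    # 2. Hit the ordinary web index with the rewritten query.
    results = search_web(rewritten)
    # 3. Ask the LLM to answer from the retrieved snippets, so the reply
    #    is grounded in indexed pages rather than free-floating text.
    context = "\n".join(f"[{r.url}] {r.snippet}" for r in results[:5])
    return ask_llm(
        f"Answer using only these sources:\n{context}\nQuestion: {user_query}"
    )
```

The point of the sketch is the division of labour: the index does the finding, and the language model does the conversation.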
In short, BingGPT seems to deliver exactly what conversational search should be all about as outlined above.
That said, there are challenges ahead, and some of them are general to Conversational Search rather than specific to BingGPT.
Finding the right way to show you personalized search results, just like Google does, is a challenge in Conversational Search. It may be harder for BingGPT, since Microsoft does not have the same amount of historical information about you as Google does.
Serving cost is going to remain a challenge for the foreseeable future. Hundred-billion-parameter models are extremely expensive to serve. It is unlikely that even placing ads in each dialogue would cover the cost.
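To see why, here is a back-of-the-envelope calculation. Every number in it is an assumption chosen only to show the order of magnitude - roughly 175B parameters, 16-bit weights, 80 GB GPUs, an assumed cloud price per GPU-hour, and an assumed throughput - not Microsoft's actual costs:

```python
# Back-of-the-envelope serving cost; all numbers are illustrative assumptions.
params = 175e9            # assume a ~175B-parameter model
bytes_per_param = 2       # 16-bit weights
weight_memory_gb = params * bytes_per_param / 1e9
print(f"weights alone: ~{weight_memory_gb:.0f} GB")            # ~350 GB

gpu_memory_gb = 80        # one high-end datacenter GPU (e.g. 80 GB class)
gpus_needed = -(-weight_memory_gb // gpu_memory_gb)            # ceiling division
print(f"GPUs just to hold the weights: {gpus_needed:.0f}")     # ~5

gpu_hour_usd = 3.0        # assumed cloud price per GPU-hour
queries_per_hour = 3600   # assume ~1 query/second per serving replica
cost_per_query = gpus_needed * gpu_hour_usd / queries_per_hour
print(f"rough cost per query: ${cost_per_query:.4f}")          # fractions of a cent
```

Even under these generous assumptions, a multi-turn dialogue adds up to a few cents to serve, which is in the same ballpark as common estimates of ad revenue per search.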
Microsoft probably needs a new ad format, and likely a revenue model beyond ads altogether. That said, Microsoft is a company that has been investing in gaming for a decade despite long-term losses, so they may be willing to lose billions on BingGPT for the next decade.
When search results are presented in a more user-friendly way by Conversational Search, it's important to be extra careful about misinformation and bias, primarily because the natural tone and friendliness of Conversational Search make answers sound more trustworthy than they may be.
Microsoft should work on making sure search results are trustworthy and accurate, and should also partner with educational organizations to prioritize accurate and factual results. This may be impossible in the US political climate.
Not everyone may be ready for the changes Conversational Search brings.
An obvious example: AdBlock may have trouble adapting to the new technology. How would it even remove 'ad' spots in the dialogue without losing critical information in the sentence?
Your job may be under threat: a lot of desk jobs are essentially variations of summarizing information for a reporting chain. Are you ready to move on to a job that Conversational Search can't do?
Tech giants will need to carefully and thoughtfully introduce Conversational Search to the world so that everyone has time to adjust.
Google is a tech company that innovates by solving engineering problems, rather than real-world user issues. Even when their products start out addressing real-world problems, they often shift their focus to solving technical challenges.
I believe this attitude ultimately led to the current 'panic mode'.
Despite having all the necessary technology for years (search, ads, image recognition, etc.), they missed the opportunity to glue everything together using LaMDA (or something else). The Google search box is literally UX from the '90s, and it still remains.
The next generation of young people should be taught how to break down their goals and desires into smaller, achievable steps. They must be even more cautious when it comes to the results they find online.
Universities will become more focused on research and providing accurate information to society. The university that can effectively partner with Conversational Search and make unbiased, fact-based information available will be at the forefront of education. The main challenge will be ensuring impartiality in their research.
Trades and physical work are already becoming more expensive around the world. This trend will accelerate.
I wrote this blog post with assistance from ChatGPT.