Is a ChatGPT-style search engine a good idea? The stock market certainly seems to think so, wiping $100 billion off Google’s market value after the company’s underwhelming AI search event. Actually turning a chatbot into a profitable business will be a challenge, though. Google has had a chat search interface — the Google Assistant — for seven years now, and the world’s largest advertising company has failed to monetize it. Now a new Reuters report points to another financial problem with generating a chat session for every search: it will cost far more than a traditional search does.
Today, Google Search works by building a huge index of the web; when you search for something, those index entries are scanned, ranked, and categorized, with the most relevant entries displayed in your results. Google’s results page even tells you how long all this takes, and it’s usually under a second. A ChatGPT-style search engine, by contrast, would fire up a huge neural network modeled on the human brain every time you run a search, generate a pile of text, and probably also query that big search index for factual information. The back-and-forth nature of ChatGPT also means you’ll likely be interacting with it for much longer than a split second.
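To make the contrast concrete, here is a toy sketch of the precomputed-index model described above. This is an illustration of the general technique (an inverted index mapping terms to documents), not Google’s actual implementation; the documents and ranking rule are made up for the example.

```python
from collections import defaultdict

# Hypothetical mini-corpus standing in for "the web."
docs = {
    1: "how to make coffee at home",
    2: "best coffee beans for espresso",
    3: "home espresso machine reviews",
}

# Build the inverted index once, ahead of time: term -> set of doc IDs.
# This is the expensive step; queries later just look things up.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query):
    """Rank documents by how many query terms they contain (a crude score)."""
    scores = defaultdict(int)
    for term in query.split():
        for doc_id in index.get(term, set()):
            scores[doc_id] += 1
    return [d for d, _ in sorted(scores.items(), key=lambda kv: -kv[1])]

results = search("home espresso")  # doc 3 ranks first: it matches both terms
```

Each query is a handful of cheap hash lookups over precomputed data, which is why traditional search finishes in a fraction of a second. An LLM, by contrast, has to run billions of multiply-accumulate operations per generated word, with nothing precomputed per query.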
All that extra processing is going to cost a lot more money. After speaking with Alphabet chairman John Hennessy (Alphabet is Google’s parent company) and several analysts, Reuters writes that an exchange using AI known as a large language model “is likely to cost 10 times more than a standard keyword search,” which could add up to “several billion dollars in additional costs.”
Exactly how many billions of Google’s $60 billion annual net income would be eaten up by a chatbot is up for debate. One estimate in the Reuters report comes from Morgan Stanley, which calculates that if a “ChatGPT-like AI were to handle half of the queries it received with 50-word responses, it would cost Google $6 billion annually.” Another estimate, from consulting firm SemiAnalysis, puts the figure at $3 billion.
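A quick back-of-envelope calculation shows what the Morgan Stanley figure implies per query. The inputs are all from this article (8.5 billion searches per day, half handled by the AI, $6 billion per year); the per-query cost that falls out is my arithmetic, not a figure from the report.

```python
# All input figures come from the article; the result is derived, not quoted.
searches_per_day = 8.5e9   # Google's daily search volume
ai_share = 0.5             # Morgan Stanley assumes half of queries go to the AI
annual_cost = 6e9          # USD, Morgan Stanley's annual estimate

ai_queries_per_year = searches_per_day * ai_share * 365
cost_per_ai_query = annual_cost / ai_queries_per_year
print(f"~${cost_per_ai_query:.4f} per AI-handled query")  # roughly 0.4 cents
```

Four-tenths of a cent sounds trivial until you multiply it back out across trillions of queries a year, which is exactly the problem the report describes.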
Google hinted that server time was an issue in its first post about its “Bard” chatbot, saying it would start with a “lightweight model version” of Google’s language model and that “this much smaller model requires significantly less computing power, which allows us to scale to more users and allow for more feedback.” It’s odd to hear Google being cautious about scaling. Google is Google — it’s already operating at a scale that dwarfs most companies and can handle any workload you want to throw at it. “Scale” is just a matter of what Google is willing to pay for.
Search costs are definitely a bigger concern for Google than for Microsoft. One reason Microsoft is so eager to rock the search boat is that most market-share estimates put Bing at only about 3 percent of the global search market, while Google accounts for about 93 percent. Search is a core business for Google in a way it isn’t for Microsoft, and since Google has to process 8.5 billion searches every day, even small per-search costs add up very quickly.
Alphabet’s Hennessy told Reuters that Google is working to cut costs, calling it “a problem for a couple of years at worst.” Google has addressed such issues in the past, like when it bought YouTube and was able to cut costs enough to become a moneymaker, and continues to do so today with innovations like building its own video transcoding chips. The company also builds custom machine learning server chips called tensor processing units. However, with Google engaged in a cost-cutting bloodbath in recent months, suddenly finding that its main consumer product will have skyrocketing costs for “a couple of years” isn’t ideal.
It’s not yet clear how much money anyone will make from chatbots designed to provide direct answers. Google’s and Amazon’s voice assistants have both failed to turn a profit after years of this “we’ll figure out monetization later” thinking, and they’re both just more limited chatbots. OpenAI, the creator of ChatGPT, charges money per word generated, which doesn’t work for search engines (it’s also riding a wave of hype and investor interest that can sustain it for years). According to another Reuters report, Microsoft has already met with advertisers to discuss its plan to “insert [ads] in responses generated by the Bing chatbot,” but it’s unclear how intrusive that would be or how consumers will react when a chatbot suddenly goes into an ad break.
For Google, it again comes down to comparing this new breed of chat search engine to the old one, and it’s unclear whether a chatbot interface would bring in more or less ad revenue. You can imagine a future where getting a good answer instantly means you spend less time on Google than you would digging through a list of 10 blue links. If that’s true, none of the money math behind these new search engines looks good.