Doesn't matter how they push it: a GPT-style search would cut into their current ad-revenue model the same way voice search does.
With a voice search you can't put up 12 promoted things in front of the result they're looking for, and you can't generate as much data about it. With LLMs being a black box, and costly to rein in for product use, it's just not as profitable.
If someone else does it, Google potentially gets eclipsed, so they can't really avoid it.
AMEN. And that's the Real Issue. They can't give a first page of garbage! How much will it impact profits to use AI to reduce the amount of garbage returned in a search?
From my perspective, it would increase the time users spend on each search engine, as it becomes more interactive and you can get answers without leaving Google or Bing.
So, yes, more compute, but I would expect higher user engagement and more time spent on the platform, so potentially more revenue.
I don't understand how monetizing it would be a problem. Can't they just devote a section of the page to ads relevant to the search/chat like they do with current search results?
I suppose I don't understand enough about marketing or I'd understand why such a simple solution wouldn't work well.
Ron Amadeo said:
"You could imagine a future where instantly getting a good answer would result in less time on Google compared to having to dig through a list of 10 blue links. If that's true, then none of the money math on these new search engines really looks good."

I read that BingChat gives you an answer and a source link.
The crypto analogies don't end with the excessive hype. This also probably means greater energy use and more CO2 emissions.
But right now, ChatGPT is PURE Search.
Pure Interactive Search in search of a good answer to your question.
That's not Google Search.
That's where they're losing money. How do you inject wasteful bull* content to take advertiser money while continuing to destroy the effectiveness of search?
That's Google's real problem.
"They'll probably solve this the same way they solved the YouTube problem, with bespoke hardware. Neurally-designed silicon is a thing and Google can afford to spin up a fab."

That reduces but does not eliminate the extra cost. It still requires unique processing per user. And in YouTube's case, the extra costs are handled by adding extra advertising, and it's not clear how that would work here.
"[...] Google hasn't added a single innovation in a decade or more. What are those tens of thousands of engineers even doing?"

So that's why Google has been stagnating.
I realized Google was going nowhere before, but this ChatGPT saga made me realize even more that Google hasn't added a single innovation in a decade or more. What are those tens of thousands of engineers even doing? Tweaking the algorithms and increasing ad revenue shouldn't need that many people, no?
I think search is one of the least interesting use cases for LLM. Because the input/output via chat has higher throughput than via voice, it opens up whole new classes of use cases that weren't possible before. For example, I told ChatGPT to generate 30 random names, with life stages, and to group them. Yes, I could have done it via a script, but it was so much faster, as well as being accessible to those without a coding background. The next step is adding more APIs, I think.

Text chats do have the advantage that you can design them to display multiple profit-generating links at once in the result. Voice chat is limited by its medium to presenting one result at a time.
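For comparison, the "script" route the comment mentions could look something like this minimal Python sketch. The name pools and life-stage categories here are invented for illustration, not taken from the thread:

```python
import random
from collections import defaultdict

# Invented sample pools -- a real script would use larger name lists.
FIRST_NAMES = ["Mira", "Tomas", "Asha", "Felix", "Nadia", "Oren", "Lena", "Kofi"]
LAST_NAMES = ["Varga", "Okafor", "Lindqvist", "Moreau", "Tanaka", "Silva"]
LIFE_STAGES = ["child", "adult", "elder"]

def generate_names(n, seed=None):
    """Generate n random full names, each with a life stage, grouped by stage."""
    rng = random.Random(seed)  # seedable for reproducible output
    grouped = defaultdict(list)
    for _ in range(n):
        name = f"{rng.choice(FIRST_NAMES)} {rng.choice(LAST_NAMES)}"
        grouped[rng.choice(LIFE_STAGES)].append(name)
    return dict(grouped)

groups = generate_names(30, seed=42)
print(sum(len(v) for v in groups.values()))  # 30 names in total
```

Which is the comment's point in miniature: the chat version needs no code at all, and that accessibility is the argument being made.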
Still, once the novelty wears off, are people really going to be that much more enthusiastic about chatting at a computer than speaking to one? I'm not sure. I think the headline-grabbing tech is not going to be at the forefront of things but relegated to the background; that is, improving things (e.g. customer-service bots, code-completion tools) that already exist and have narrower, more easily defined use cases. I will (again) warn that one of the best use cases for this tech as it exists today is propaganda, which only requires content that sounds plausible, a lower bar than all the other use cases, which require results that are also true.
Having said that, the image bots seem to be very useful as a concept-phase tool as-is too. This is all going to be very complicated and will affect different industries at different rates.
"Unless Bard is very different from ChatGPT, running it on a consumer-grade machine, even a high-end gaming PC, does not sound practical."

Oh, it's too expensive, Google?
Then just open-source the model so people can run it locally. We'll save you a ton of money; you can thank us later ;-)
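A rough back-of-the-envelope sketch of why "just run it locally" is hard for a GPT-3-class model: merely holding the weights in half precision dwarfs consumer GPU memory. The 175-billion-parameter figure is GPT-3's published size; the 24 GB VRAM figure is an assumed high-end consumer card, and activations, KV cache, and framework overhead are ignored entirely:

```python
def model_memory_gb(n_params, bytes_per_param=2):
    """Memory just to hold the weights (fp16 = 2 bytes per parameter),
    ignoring activations, KV cache, and framework overhead."""
    return n_params * bytes_per_param / 1024**3

gpt3_scale = 175e9      # GPT-3-class parameter count
consumer_vram_gb = 24   # assumed high-end consumer GPU

needed = model_memory_gb(gpt3_scale)  # ~326 GB for fp16 weights alone
print(f"{needed:.0f} GB of weights vs {consumer_vram_gb} GB of VRAM "
      f"({needed / consumer_vram_gb:.0f}x over budget)")
```

Much smaller open models change this math, which is presumably why the "just open-source it" argument keeps coming up anyway.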
"I think search is one of the least interesting use cases for LLM. [...] I told chatgpt to generate 30 random names [...] the next step is adding more APIs, i think."

Chat isn't a very good match for search, especially because getting it right matters. But search is something that a very large number of people do. Generating a list of random names etc. is a perfect use, but it's also a niche case. Are there really enough GMs and writers to make a business case for something as expensive as the training of these systems? I doubt it.
"How much will it impact profits to use AI to reduce the amount of garbage returned in a search?"

It's hard to see how one could ever trust an AI to discern garbage from useful, accurate information.
"I think search is one of the least interesting use cases for LLM. [...]"

Yeah, it's weird that the headline has a "Does anyone think this is a good idea?" above it, but the article never mentions the elephant in the room that is Bing's... emotional issues. You got some of the best of ChatGPT married to a search engine that sometimes lies or berates you (and after the recent neutering, it has the best of neither).
This is a huge and very strange assumption to make right off the bat:

"Is a ChatGPT-style search engine a good idea? The stock market certainly seems to think so, with it erasing $100 billion from Google's market value after the company's poor showing at its recent AI search event."

You're implying that they lost $100 billion of market cap purely because they executed a demo poorly, and not the much more reasonable assumption that the stock market thinks the whole thing (large-language-model-based chat search) is a bad idea. Google has spent 20 years honing their search model, figuring out how to execute their core product well. It makes much more sense that investors want them to stick with what they're good at (even if running a good business around search is independent of making the product good for the user).
"How much will it impact profits to use AI to reduce the amount of garbage returned in a search?"

Exactly what I was thinking. AI has the potential to destroy SEO as we know it and show us the information we're actually looking for, and I can see that having a very negative impact on Google's profits, especially since sites pay Google for preferential treatment.
"Exactly what I was thinking. AI has the potential to destroy SEO as we know it and show us the information that we're actually looking for. [...]"

Yeah, until people figure out a whole new set of tricks to get higher placement inside the chatbots, like writing really convincing English prose, I guess.
Bard may be a major step forward for customer usability, but Bard2 will be a shameless cash-whore.
"Exactly what I was thinking. AI has the potential to destroy SEO as we know it [...]"

I think the safer assumption is that LLMs may trigger a shift in SEO. There is an inherent arms race between search tools and the tools that clog results with bullshit, with constant financial incentives for each side to improve its techniques. New tech doesn't make that go away; it just incentivizes the other side to up its game.