ChatGPT competitor comes out swinging with Super Bowl ad mocking AI product pitches.
See full article...
"...nor will Claude's responses be influenced by advertisers or include third-party product placements our users did not ask for."

Isn't that literally what SEO or AIEO (bingo was his name) does already? So they are influenced by third-party product placement; they just don't currently profit from it.
"I guess when you're a trillion dollars in the hole, a little ad revenue won't hurt before your bubble pops."

Won't hurt THEM, at least... the amount of power used to generate and insert them will hurt others, though.
"I would be willing to bet they go back on their word or find a loophole at some point. What they say and what they do rarely line up."

They are shooting for the professional market.

Also:

"Isn't that literally what SEO or AIEO (bingo was his name) does already? So they are influenced by third-party product placement; they just don't currently profit from it."
"I'm calling it now: when the AI bubble finally pops, OpenAI will be the one that dies. Anthropic, etc., will probably kill the free tier and ramp up subscription prices. Google will backpedal on AI and make it more professionally focused."

Personally, I think anyone who isn't already making tons of money from other avenues will go under. So Google, Meta, X, and Microsoft will stay afloat just because they can afford it. I do agree that free plans will go and prices will hike. I can see that killing the vibe-coding market as well, with a further knock-on effect on smaller companies like Cursor. We will also get to see how many of these AI companies are actually just wrappers around other LLMs.
"Based on their published Claude constitution, I'm inclined to give them the benefit of the doubt / take them at their word on this. Would be nice to see some reference to that constitution in this article, as it drives a lot of what the company will do on the Claude roadmap."

Do you believe that document to be in any way legally binding, Miss Whittier?
I would be willing to bet they go back on their word or find a loophole at some point.
What they say and what they do rarely line up.
This will be an ad we look back at in five years and laugh at, either because AI died in the crib or because Anthropic runs ads by then. Either way, if there is a there there, ads will follow.
"Like, no kidding the dongle life was a downgrade, but Apple did it to make money selling high-margin Bluetooth earbuds... same reason Samsung ate crow and removed the headphone jack shortly thereafter."

Meanwhile, my $200 (free and clear, unlocked) Moto G84 has a big battery, plenty of grunt to run everything except the latest 3D games, and still comes with a headphone jack.
There are many good places for advertising.
"Personally I think anyone who isn't already making tons of money from other avenues will go under. So Google, Meta, X, Microsoft will stay afloat just because they can afford it. I do agree that free plans will go and prices will hike. I can see that killing the vibe-coding market as well, with a further knock-on effect on smaller companies like Cursor. We will also get to see how many of these AI companies are actually just wrappers around other LLMs."

I think that Google, with its years of investment in TPUs, will pull ahead in terms of cost per inference. Others are already attempting to do the same and/or licensing Google TPUs. The race will be among those capable of achieving cost-performant inference more than anything else.
Given how many companies Anthropic is already selling to, they must be nearing saturation while apparently still not profitable. Unless they can outlast the others dying off and eat their share, I don't see how they survive investment drying up.
Once true pricing comes out at a level where they're profitable, I suspect there will be a lot less interest in the products on offer, barring a breakthrough that reduces resource requirements.
Hopefully it becomes a big boon for local models though!
Reminds me of that Samsung ad making fun of Apple for removing the headphone jack.
View: https://www.youtube.com/watch?v=wQ4ys-R1B_8
Like, no kidding the dongle life was a downgrade, but Apple did it to make money selling high-margin Bluetooth earbuds... same reason Samsung ate crow and removed the headphone jack shortly thereafter.
"This will age as poorly as: (View attachment 127692)"

I was gonna say, "hey, it's this millennium's 'Don't Be Evil'."
Should AI chatbots have ads?
"This will age as poorly as: (View attachment 127692)"

No, no, that isn't what it appears. It means famed musician and actress Courtney Love is sharing a password with... her brother. Yes. Probably for an Office360 subscription.
"This will age as poorly as: (View attachment 127692)"

It is now Netflix's policy to annihilate all love in the world by any means necessary.
"Oh please, every online thing gets enshittified with ads at some point."

100% they're just talking their book. They don't have as many free users, so they can't monetize meaningfully with ads anyway.
"I think that Google, with its years of investment in TPUs, will pull ahead in terms of cost per inference. Others are already attempting to do the same and/or licensing Google TPUs. The race will be among those capable of achieving cost-performant inference more than anything else."

If NVIDIA's new Rubin platform pulls off its 10x reduction in inference costs, you will see huge price/cash-flow changes in the next year. Google's target for its TPUs is a 100-1000x reduction in inference costs.
The question isn't whether useful work can be done, or who has the most useful model; it's who will be able to do it at a sane operating cost in 3-4 years, or whenever the circular investment begins to fall apart. That race is already on and showing great progress. Nvidia just acqui-hired a company working on exactly that.
Google has the biggest moat. Meta is quietly building their own. OpenAI is scrambling to do so with Broadcom. Anthropic is licensing TPUs from Google with the largest TPU deal in Google's history.
"I think that Google, with its years of investment in TPUs, will pull ahead in terms of cost per inference. Others are already attempting to do the same and/or licensing Google TPUs. The race will be among those capable of achieving cost-performant inference more than anything else."

There is only so much that can be done to make dedicated FP16/FP8 math more efficient. Google's TPUs are hilariously efficient at doing exactly that, but it's not like there are orders of magnitude to be had there.