An AI coding bot took down Amazon Web Services

Uncivil Servant

Ars Scholae Palatinae
4,667
Subscriptor
This paper's been fascinating, not least because it's doing what it sets out to do, sharpening my rhetoric.

Just an example, when arguing against using AI to replace artists, I will now START by pointing out that it won't save them any money and may in fact cost them more. LLM AI's primary "value" is aimed at highly paid staff, not the small time writers and artists getting paid like... writers and artists. Once that's established, that they may even end up paying MORE to use the service, THEN I can go into how the art and writing it produces isn't very good, or very accurate, and is all derivative opening them up to potential lawsuits, and that that's what they're paying for, what they're INSISTING their staff uses. A more costly inferior product. And, since we're talking about rhetoric, it's worth adding in that with the public at large turning on AI, they'd get more value out of simply not buying into it and making that a selling point of their company. Outside of immediate profit, this also attracts potential artists and writers TO your company, giving you the pick of the litter of those who want job security so you get the best talent, which will then draw in more customers. That last bit is old news and how things worked before this bubble, but it's still worth repeating.

You hear that Condo Nast?

Oh, there's a much shorter and simpler argument: sign a contract, pay an artist, and your IP rights are generally ironclad. Can't put Darth Vader's face on something without paying Disney, and George Lucas before that.

It's worth remembering that my entire childhood took place during the period between Return of the Jedi and The Phantom Menace. During that 16 year period during which George Lucas produced zero Star Wars films, how much did he earn in IP royalties?

The difference between artificially generated art and human generated art is going to be the earnings difference between a single summer blockbuster vs a major movie franchise like Star Wars or the Marvel Cinematic Standard Model*. That might not sound so bad at first, some summer blockbusters make a lot of money. But a lot of them go bust and lose money. This is why we have so many sequels and cinematic universes and reboots, because nobody wants to invest in a film without a guaranteed audience.

You need recognizable IP to guarantee the audience, and no artificially generated art will give you that.


*Seriously I think the Standard Model needs a shorter introduction than the MCU at this point
 
Upvote
7 (7 / 0)

Uncivil Servant

Ars Scholae Palatinae
4,667
Subscriptor
The "solution" the Agent provided was to create a variable that contained the status of docker being available or not. In every test it then checked this variable. If docker wasn't available it would simply skip the test. Hence the problem according to the agent was "solved".

All I know about coding I learned from law classes but even the dumbest failsons I had to bear in those classes wouldn't have been this lazy in setting and following rules, and some of those gentlemen had permanent tan lines from their popped Polo collars.
 
Upvote
6 (6 / 0)
Remember, every single white collar job will be gone within the next year apparently.
Actually according to Sam Altman AGI will have arrived in 2025, so we've all already been replaced by chatbots who just don't realize they aren't humans em dash scary thought isn't it my fellow human not chatbots?
 
Upvote
10 (10 / 0)

stux

Ars Scholae Palatinae
812
"Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them." - Reverend Mother Gaius Helen Mohiam, Dune

Ahem, it’s about who owns the means of production, kids!
I thought it was about who controls The Spice?
 
Upvote
0 (0 / 0)
Just wondering what the point of this snippet is. Aluding to AI involvement in that incident? Pointing out that humans make worse mistakes? I'm honestly not sure why that line was even in there. The October incident isn't referenced again, and just serves as a point of 'we had an outage' that is unrelated to the other outages.
I think it's there because otherwise folks might assume this article was about that failure. It was a huge deal and around the same time.

Certainly when I heard "AI caused AWS outage", I immediately assumed they meant the earth-stopping outage, not this random thing. This random thing is certainly a foreboding sign about AI development, but it's not "AI crashed the internet".
 
Last edited:
Upvote
2 (2 / 0)

Derecho Imminent

Ars Legatus Legionis
16,255
Subscriptor
Now witness the power of our fully operational AI agents!
1771866436687.png

oops our deathstar blew up
 
Last edited:
Upvote
0 (0 / 0)
This is a really interesting risk of AI systems.

When an experimental Uber autonomous vehicle killed Elaine Herzberg in 2018, one of the things NTSB cited was "automation complacency", where the human safety driver sort of checked out after hours of running the test track repeatedly with no previous errors.

This is why I think most high-consequence automated systems are held to a way higher standard than for humans, because if something fails 1 in 1000 times it can actually be more dangerous than failing 1 in 10 times, because the humans get complacent and don't mitigate the failures on the one-in-a-thousand system.

Generative AI is based on randomness and has no guarantees of performance (yet). As generative AI gets better it might ironically lead to more dangerous failures.

I like this comparison.

First, I wouldn't say Uber's car was anywhere near the standard for humans. It was at least a hundred times worse. Humans can go like 100M miles without killing someone. Uber went like 100k miles and killed someone. It was a complete failure.

Second, I object to the general statement "that humans get complacent," as though it's something that could happen to anybody. The Uber driver wasn't doing her job. She was paid to watch the road, and instead she was watching TV. That's not complacency, that's neglect.

And yes, it is up to the business/technology people need to account for that. They need to know that some folks just can't be trusted to do their job.

I see AI risk as pretty similar. People who eff it up are the same people who eff up other things. AI just allows it to happen somewhat faster? But the solution is arguably the same? Which is don't give those people the access/power to break things.
 
Last edited:
Upvote
2 (2 / 0)
Oh, there's a much shorter and simpler argument: sign a contract, pay an artist, and your IP rights are generally ironclad. Can't put Darth Vader's face on something without paying Disney, and George Lucas before that.

It's worth remembering that my entire childhood took place during the period between Return of the Jedi and The Phantom Menace. During that 16 year period during which George Lucas produced zero Star Wars films, how much did he earn in IP royalties?

The difference between artificially generated art and human generated art is going to be the earnings difference between a single summer blockbuster vs a major movie franchise like Star Wars or the Marvel Cinematic Standard Model*. That might not sound so bad at first, some summer blockbusters make a lot of money. But a lot of them go bust and lose money. This is why we have so many sequels and cinematic universes and reboots, because nobody wants to invest in a film without a guaranteed audience.

You need recognizable IP to guarantee the audience, and no artificially generated art will give you that.


*Seriously I think the Standard Model needs a shorter introduction than the MCU at this point
Especially since a weirder version of the Standard Model plays into some of those movies, and can be considered a subset of the MCU.
 
Upvote
0 (0 / 0)