AI chatbots tell users what they want to hear, and that’s problematic


forkspoon

Ars Scholae Palatinae
1,042
Subscriptor++
Feedback is broken in other ways too. I find a related problem is extremely verbose answers. They often start by reflecting your question back to you (please don't do that every effing time we interact), then "break it down" in a way that's super condescending, then write a listicle of related points, then summarize again, then tell you they're "here if you need them" (also every time). Uh thanks Chat, all I wanted was a simple fucking "yes" or "no", a sentence or two about why, and maybe a link.

How am I supposed to spend an appropriate amount of time reviewing these massive textual garbage dumps, let alone give the entire thing a single thumbs up/down? It's like sitting through a rambling 3-hour PowerPoint presentation that should have been a 5-minute conversation, then being asked to raise your hand if it was good (or bad). Like, I'm sorry, but you melted my brain and all I want is to leave now. Instructions to be concise seem to lose all effect within roughly 2-5 prompts, too.