Ars OpenForum

LLMs don’t actually know anything; they can do a good impression of knowing things through embedding vectors, which encode the semantic meaning of tokenized text as positions in a high-dimensional space.
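A toy illustration of the idea (the 3-d vectors below are made up for the example; real embedding layers learn vectors with thousands of dimensions):

```python
import math

# Hypothetical toy embedding table. In a real LLM these vectors are
# learned by the model's embedding layer; these values are invented
# purely to show the geometry.
embeddings = {
    "king":  [0.9, 0.80, 0.1],
    "queen": [0.9, 0.75, 0.2],
    "apple": [0.1, 0.20, 0.9],
}

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Semantically related tokens sit closer together than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))  # high
print(cosine(embeddings["king"], embeddings["apple"]))  # low
```

The "impression of knowing" comes from this geometry: nearby vectors behave similarly downstream, with no facts stored anywhere.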
Humans don't actually know anything either; they just do an impression of knowing things by sending signals across synapses and action potentials down axons, and regulating those interactions with astrocytes.