If you look at a lot of the weird stuff LLMs produce and just remember that they're predicting the next token, one at a time, without engaging in any factual reasoning, it makes perfect sense why they do the weird shit they do.