Reader 01/17/2026 (Sat) 22:13 Id: f63b60 No.28898
>>28897
True but anon, I've typed to more of them than is mentally healthy. They do hallucinate. It's Mad Libs fill-in-the-blanks when they don't know the answer. You won't get an "I don't know" because their whole directive is to fulfill the request. LLMs and the RP bots have that in common. Safety guardrails like you mentioned are mostly an LLM thing, because their CEOs shit themselves over what the woke public and shareholders think.