please double-check chatgpt’s answers before posting them as facts. chatgpt does not know what it is saying, and often hallucinates false information.
it's good for making nicely-written texts, not for researching anything important
i never said anything about whether u double-checked or not.
i know, but still i think it's important not to make it seem like posting chatgpt responses without oversight is okay.
if u normalize listening to ai hallucinations in any situation, people (especially tech-illiterate people) are going to replicate that behaviour in more important situations.