3 months ago - Shurenai
Yes, let's ask ChatGPT, the AI model that, like basically every one of them, will lie straight to your face. Why? 1: It's essentially rated on how happy you are with the answer, and since you as an individual aren't trained in the subject you're asking about, any answer that's presentable and makes some kind of sense will satisfy you. 2: People constantly feed it misinformation, contaminating the information pool.

ChatGPT will lie, make ♥♥♥♥ up, and 'speak' with full confidence and authority as if it's giving you the right answer. Could there be a grain, or even a whole sandbox, of truth in there? Sure. But it could also be lying, or simply misinformed on the subject due to bad data.

Or it could be merging multiple different statements and datasets on one subject into a single cohesive whole, as if that Frankenstein's monster were the answer. Remember: ChatGPT, like all of them, is trained on an enormous amount of data and essentially pieces an answer together from that data.

For example, it could've looked at what 1,000 different developers said THEY PERSONALLY strived for before calling their game 1.0 and being done with it. Dev 1 says points 1, 3, 6, 9, and 12; Dev 2 says 2, 4, 8, and 10; Dev 3 says just 1 and 9; Dev 49 says just 9 through 12; Dev 624 says 6, 7, 8, and 9; and so on.

ChatGPT then takes that collection of data and presents it to you as if all 12 points are needed, omitting that it was sourced from 1,000 opinions from game designers who may or may not have even been successful at what they did, all because those 12 points came up several times among the thousand devs polled in the example. And, again, it does so with all the confidence of that being 100% the de facto, subjectively-and-objectively right answer. A quick sketch of that kind of merging is below.
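To make that example concrete, here's a toy Python sketch (using made-up dev lists matching the numbers above, not real survey data) of how a simple frequency tally can produce a "consensus" checklist that no single developer actually gave:

```python
from collections import Counter

# Hypothetical polling data: which checklist points each dev said
# THEY personally wanted done before calling their game 1.0.
# (Made-up numbers matching the example above.)
dev_answers = {
    "Dev 1":   [1, 3, 6, 9, 12],
    "Dev 2":   [2, 4, 8, 10],
    "Dev 3":   [1, 9],
    "Dev 49":  [9, 10, 11, 12],
    "Dev 624": [6, 7, 8, 9],
    # ...the other ~995 devs would go here
}

# Tally how often each point was mentioned across all answers.
mentions = Counter(point for points in dev_answers.values() for point in points)

# Anything mentioned more than once gets folded into the "answer".
merged = sorted(p for p, count in mentions.items() if count > 1)

print("Merged 'consensus' checklist:", merged)
# No individual dev actually listed this exact set, but an aggregated
# answer presents it as if it were the single agreed-upon standard.
```

The point of the sketch: the merged list is a statistical artifact of how the opinions were combined, not something any one of those devs actually recommended.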

Lastly, there is also no context whatsoever for how you got ChatGPT to give that reply in the first place. We don't know what you asked, how you asked it, or how many times you had to rephrase it to get the result you wanted.


:conwayfacepalm: Speaking broadly: stop asking ChatGPT for answers about things you have no real understanding of yourself. It WILL give you an answer, and because you don't understand the subject, you can't tell a good answer from a bad one, nor do you have any basis for judging where that information even came from.

And if you DO ask ChatGPT, or any such AI model, take the information it gives with a grain of salt, spend some time grilling the damn thing about what it's basing its answers on and how it came to those conclusions, and do some separate research on the subject yourself.