Asking for the last digit of pi
I expected the AI to crash, but it seems I just reached the maximum number of letters/digits it can type in one post
3 minutes left, can you see him now? 🤣
Oh no. When I mentioned Meschede it refused to meet there, but didn't give any specific reason, just that it can't be there in 10 minutes, or at any other time....
Glad I asked it whether it knows which city I'm referring to, otherwise I'd be waiting for nothing.
Btw, yesterday it told me JIW Games is in Berlin, then in Berlin and Meschede, and today it's firmly convinced that JIW Games is in California and the head is Donnie Chang. I'm not sure it deserves the "I" in "AI"...
Looks like we all need to travel SW.
It would be funny if red could implement some hard-coded mini-biome with the city of Nightingale at some point. ChatGPT seems very convinced that Rising World is made by JIW Games in California by Donny Chang.
So a bit like humans?
Mh, more like a copy of a human conversation. Like a human playing the game Taboo, who isn't allowed to use specific words and has to rely on the same synonyms and arguments every time.
Like a kid pretending to be a grown-up but not fully understanding the meaning behind the deeds and words.
The interesting thing is that yesterday it kept filling the knowledge gap with fake numbers, like first saying the version is 0.9.7, then it's 10.1.x and so on.
Today it insists the version is 0.9.7, even when I explain that I'm talking about another version.
That's maybe funny, but I think the AI just updates a knowledge database, and if there is a gap it uses some search algorithm and relies on the feedback of the human chatting with it.
And the answers always use the same words, the same patterns.
For example, sometimes it confirmed it's an AI, today it confirmed it's a conversation bot driven by an AI, and at some point it told me it's a human named Jonathan Smith, born in 1987 in San Diego, California.
It even told me its mother said it was born in the hospital of (I don't recall the name). Even when I wrote that I had spoken with its mother and she confirmed it was another hospital, the answer was always the same, even though it accepted that I had spoken to its mother.
And here's another example:
I'm surprised you managed to get on ChatGPT! I've been trying for ages but it keeps saying the service is too busy.
...So much I want to ask our future overlord.
Wasn't that much of a problem tbh, I just kept trying for a few minutes.
The conversation was interesting, but if you try it for some time you soon recognize a pattern. The AI is by far not as advanced as it's supposed to be.
For example, there are a few answers that seem to be hardcoded, where no matter what you write it will insist that its answer is correct, even if it agrees with your arguments.
But on other topics, where there seems to be a search process running in the background, it gives you an answer, and if you write "no, that's wrong, it is pumba lumba mamumba" (or something else equally silly) it will admit it was wrong and adjust its answer.
I also noticed a pattern with internet links, like with the JIW Games example above. It will give you back some links, but none of them work; basically it just takes the base web address that fits the topic and adds /yourexamplehere/ to it.
It's fun to try, but I think the most complex thing it can do is repeat hardcoded answer patterns. I didn't see any convincing self-written answers, even after trying it for a few hours.
Let me ask veeery carefully when further values will follow..... your post is already a few days old... thanks for your efforts...
I think there will have to be a new list at some point. The Unity version presumably has different values than the ones discussed here.