It just makes things up. And if you catch the lie, instead of making you keep trying, it suddenly finds the real data.
Yesterday I received an email from an artist who wrote that ChatGPT lies and is a bad advisor.
It advised her to change her painting: softer edges. She did, and when she showed it again, the advice was sharper edges. It always seems to find something, but never the right thing.
She received a long list of things that had to be changed. She changed nothing, showed the same picture, and said she had made the changes. All of a sudden ChatGPT said it was perfect.
I noticed the same thing: it keeps repeating the same instructions no matter how often you tell it they already worked.
I had it with several things I searched for: something on the car, different types of paint and how to use them, and next the Base app X zora.co.
The links are old, and it cannot find anything new, no way out, unless you tell it the link is not true, not there, a dead end.
Also funny: it can't tell the difference between left and right. Once it got mad and replied: fine, we'll do it your way, the driver's seat is on the right if you stand in front of the car. Then the next instructions were given as if the driver's seat, and my battery, were on the left.
There are so many mistakes that it is annoying to ask it anything. It does not even work as a search engine. Better not to ask what duck.ai comes up with. It is as if you are 'talking' with a brainwashed person with a low IQ.
Some have been more reliant on AI than others. I think it's good for speeding things up, but maybe it is not meant to replace a whole thought process, or something of that sort.
But I think it is obvious by now that it will still try to generate an answer even if it does not have all the relevant or necessary information about what was asked.
It's true: when the AI doesn't want to answer, it just says it can't access the site, but if you tell it that's not right, the AI apologizes and does what it hadn't done before.
Anyway, I prefer Claude to Copilot, but I have never tried ChatGPT.