Why ChatGPT Is Actually Getting Dumber Over Time

The rise of ChatGPT and AI has raised a lot of questions about their place in society. Marvel's "Secret Invasion" used an AI-generated intro, a decision that drew plenty of backlash for the studio. There's also the concern that ChatGPT and other AI tools could be coming for writing jobs. However, it seems those fears might have been overblown, as ChatGPT is actually getting dumber.


That doesn't mean it can't potentially become a real threat down the road, but it does mean we don't have to hit the panic button just yet. ChatGPT is suffering from what's being called AI drift, which is causing the program to perform worse over time. A study revealed that GPT-4, designed to be more reliable than its predecessor GPT-3.5, performed better in March 2023 than it did in July 2023. On paper, the model should improve as time goes on, since it is constantly learning new things. In practice, the opposite appears to be happening. What's most alarming is that this degradation took place over the course of just a few months.

What does this mean for the future of AI?

Seeing AI tools like this become less reliable over time will certainly cause a lot of people to pump the brakes on using them. We've already seen these problems play out in the real world. Gizmodo's AI-generated "Star Wars" article was riddled with errors, despite being what would seem like an easy task for AI. When drifting like this starts to occur, it means a human touch is needed to get things back on the right track.


As for why this is happening, it's a combination of factors. As the AI learns more, its behavior can begin to shift. Over time, that can cause its predictions to stray from its original purpose and, in turn, introduce mistakes. These can range from outdated answers to incorrect assumptions, making it an unreliable tool for the average person. Drift happening to ChatGPT is one thing, but if it happens with other AI-automated activities, such as self-driving cars, it could have disastrous results.

There are ways to rein in the issue, and they start with keeping a closer eye on how the AI is developing. That means monitoring the model for shifts in behavior, making sure the data it is consuming is accurate, and continually seeking feedback from the people using the tool, whether that's ChatGPT or something else, as the rough sketch below illustrates. The drifting is troubling, but it's something that can be fixed if it's caught early.
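
To make "monitoring for shifts" a little more concrete, here is a minimal sketch of one way drift can be caught: re-running the same fixed set of benchmark prompts at regular intervals and comparing the model's accuracy against an earlier baseline. The prompt, expected answers, tolerance value, and the `accuracy`/`check_for_drift` helpers below are purely illustrative assumptions, not part of the study or any official tooling.

```python
# Minimal sketch: detect drift by re-scoring a fixed benchmark over time.
# All prompts, answers, and thresholds here are illustrative assumptions.

def accuracy(answers: dict[str, str], expected: dict[str, str]) -> float:
    """Fraction of benchmark prompts the model answered correctly."""
    correct = sum(1 for prompt, truth in expected.items()
                  if answers.get(prompt, "").strip().lower() == truth.lower())
    return correct / len(expected)

def check_for_drift(baseline: dict[str, str], latest: dict[str, str],
                    expected: dict[str, str], tolerance: float = 0.05) -> bool:
    """Flag drift if accuracy drops by more than `tolerance` since baseline."""
    drop = accuracy(baseline, expected) - accuracy(latest, expected)
    return drop > tolerance

# Hypothetical benchmark: answers recorded from the same model months apart.
expected = {"Is 17077 a prime number?": "yes"}
march_answers = {"Is 17077 a prime number?": "yes"}
july_answers = {"Is 17077 a prime number?": "no"}

if check_for_drift(march_answers, july_answers, expected):
    print("Accuracy dropped - time for a human to step in.")
```

The idea is simply that the benchmark never changes, so any drop in the score points to the model, not the test, which is when human review and retraining come into play.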
