Anyone else get the feeling that GPT-3.5 is becoming dumber?
I made an app for myself to chat with GPT, and it also had some extra features that ChatGPT didn’t have at the time (but now does). I didn’t use it for a while (only Bing AI sometimes), and now I wanted to use it again. I had to fix some API calls because the OpenAI module jumped to 1.0.0, but that didn’t affect any prompts (this is important: it’s my own app, not ChatGPT, so a prompt change can’t possibly be the cause, since I changed nothing there), and I didn’t change which model it used.
Once everything was fixed, I started using it again, and it was noticeably dumber than before. It made things up, misspelled the name of a place, and other things like that.
This could be intentional, to push people to buy ChatGPT Plus and use GPT-4. At least GPT-4 is cheaper through the API, and it’s not a subscription.
I’ve noticed that too. I recall seeing an article about it giving detailed instructions for building a nuclear reactor, David Hahn style. I don’t doubt they’re making it dumber now to get people to buy Plus.
At least keep it on topic. Make it a thread about a sexy Spiderman who wanted to fight crime using Rumblr but accidentally downloaded Grindr and is now wanted for hate crimes!
It’s cornmeal, so not the typical fried batter. It has more of a slightly sweet taste that, imo, complements the hot dog well. Not for everyone, but I wouldn’t count them out without trying one at least once.
Once you get used to the pain and learn to control it like Wolverine, they’re actually pretty useful. Lights are one example: you’re already sitting down with your snacks and popcorn, and then it’s zip, ping, solved.
lemmyshitpost