I like how ChatGPT explains that a site failing a certificate check can be spoofed by a different site with a failing certificate, which is hilariously confused. Not least because a site spoofing the one being sought can itself have a completely valid certificate, and in practice it probably will: the check only confirms that the certificate matches the hostname you connected to, not that the hostname is the one you actually meant to visit.
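A minimal sketch of the point, using Python's standard ssl module (the look-alike hostname is hypothetical): default TLS verification checks the chain of trust and the hostname match, so a spoof site that holds a valid certificate for its own name passes the check just fine.

```python
import socket
import ssl

def check_cert(hostname: str, port: int = 443) -> None:
    # Default context verifies the CA chain and that the cert matches `hostname`.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            print(hostname, "presented a valid certificate:", cert["subject"])

# Both calls succeed if each site holds a certificate valid for its own name.
# Validity says nothing about whether the name is the one the user intended.
check_cert("example.com")
check_cert("examp1e-login.example")  # hypothetical look-alike phishing domain
```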
GPTs increase the effort required of the helpers. GPTs routinely generate incorrect info, and sometimes dangerously incorrect info. Not only do we get to answer the question, we now also get to explain where the GPT went wrong.
A GPT generates text containing words that resemble an answer. It is not a reference source.