> It is so with GPT. Both systems are "trying to give you the best answer."
No, based on your responses you do not understand how a language model works. Google searches an index using keywords and rankings; ChatGPT predicts plausible words without searching anything anywhere.
What you argue is like saying there are these two guys in a library, and you ask them to find you something that may or may not exist. Both have read all the books. One (Google) has created an index of all the words from the books and goes through it to answer you; the other (ChatGPT) does not use any index, but relies on his memory of compressed statistics between words and will answer by predicting anything that fits those statistics. In many cases he will basically lie to you, and you will have no clue that you were lied to.
There is a distinction between indexing human knowledge about a topic, where most of the top results are correct (Google), and building a statistical model of word co-occurrence that makes up things that never existed and are wrong (ChatGPT).
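The two mechanisms described above can be sketched as a toy comparison. This is not how Google or ChatGPT actually work internally; the tiny corpus, the `search` function, and the `predict` function are all made up here purely to illustrate the difference between looking words up in an inverted index and predicting a plausible next word from co-occurrence statistics.

```python
from collections import defaultdict

# A tiny made-up corpus standing in for "all the books in the library".
docs = {
    1: "the cat sat on the mat",
    2: "the dog chased the cat",
}

# "Google": an inverted index mapping each word to the documents containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.split():
        index[word].add(doc_id)

def search(word):
    # Can only ever return pointers to documents that actually exist.
    return sorted(index.get(word, set()))

# "ChatGPT" (very crudely): bigram counts, predicting the most frequent follower.
bigrams = defaultdict(lambda: defaultdict(int))
for text in docs.values():
    words = text.split()
    for a, b in zip(words, words[1:]):
        bigrams[a][b] += 1

def predict(word):
    # Returns a statistically plausible next word, not a source document.
    followers = bigrams.get(word)
    if not followers:
        return None
    return max(followers, key=followers.get)

print(search("cat"))   # [1, 2] -- references you can go and check
print(predict("the"))  # "cat"  -- plausible, but cites nothing
```

The contrast is the point: `search` can only hand back documents that exist, while `predict` hands back a word chosen by statistics, with no reference attached.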
> "Google searches an index using keywords and rankings; ChatGPT predicts plausible words without searching anything anywhere."
Expand your scope to both Google and the ecosystem of SEO pages that Google incentivizes. They are the same, in totality. Google doesn't just index -- it also funds the creation of landing pages.
> "There is a distinction between indexing human knowledge ... and building a statistical model of words that makes things up that never existed and are wrong"
It's a false distinction. Google is more than a search engine; it is also an advertising company that incentivizes original content creation with the express intent of providing answers to queries.
> It's a false distinction. Google is more than a search engine; it is also an advertising company that incentivizes original content creation.
Obvious straw man argument. Replace the word Google with search engine.
> Expand your scope to both Google and the ecosystem of SEO pages that Google incentivizes. They are the same, in totality. Google doesn't just index -- it also funds the creation of landing pages.
This doesn't matter; you are confusing the dataset with the model. A search engine will not return things that were not in its dataset, and it gives you many results that you can judge against many points of reference. A language model returns one answer, which could be a correct result that exists in the dataset or could be totally false yet plausible, and you will have no point of reference to check it unless you use a real search engine.
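The dataset-vs-model point can be shown with a toy example: even a crude bigram model, trained on a tiny corpus, will happily generate a sentence that appears nowhere in that corpus. Again, this is a deliberately simplified sketch, not a claim about ChatGPT's architecture; the corpus and the `generate` function are invented for illustration.

```python
from collections import defaultdict

# The entire "dataset" -- every sentence the model has ever seen.
corpus = [
    "the cat sat on the mat",
    "the dog chased the cat",
]

# Build bigram counts from the corpus.
bigrams = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        bigrams[a][b] += 1

def generate(word, length=8):
    # Greedily chain the most frequent follower of the last word.
    out = [word]
    for _ in range(length):
        followers = bigrams.get(out[-1])
        if not followers:
            break
        out.append(max(followers, key=followers.get))
    return " ".join(out)

sentence = generate("dog")
print(sentence)            # "dog chased the cat sat on the cat sat"
print(sentence in corpus)  # False -- plausible words, but this sentence never existed
```

Every individual word transition is statistically grounded in the dataset, yet the output as a whole was never in it, and nothing in the output tells you that.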