I was playing with Bing. While it manages to write proper text, which is amazing, the actual answers so far are highly unimpressive. So far nothing that can replace simple search. My opinion, of course.
There is a lot of work on systems that use transformer models that will get closer to “the truth”. For instance, you could run a query through a normal search engine, then use a much more expensive cross-encoder to screen the top results, then have the transformer summarize the documents. In general, transformers could be joined to other specialist systems to answer questions in domains like math.
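As a rough illustration of that retrieve-then-rerank-then-summarize idea (not anything Bing or Google actually runs), here's a minimal Python sketch; `web_search` is a hypothetical stand-in for whatever search API you have, and the model names are just common public checkpoints:

```python
# Sketch of a search -> cross-encoder rerank -> summarize pipeline.
# `web_search` is hypothetical; swap in any search API that returns documents.
from sentence_transformers import CrossEncoder
from transformers import pipeline

def web_search(query: str) -> list[str]:
    """Placeholder: return candidate documents from an ordinary search engine."""
    raise NotImplementedError

def answer(query: str, top_k: int = 3) -> str:
    docs = web_search(query)

    # The cross-encoder scores each (query, document) pair jointly; much more
    # expensive than the search engine's own ranking, but more accurate.
    reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
    scores = reranker.predict([(query, d) for d in docs])
    top_docs = [d for _, d in sorted(zip(scores, docs), reverse=True)[:top_k]]

    # Finally, a transformer condenses the screened documents into one answer.
    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
    return summarizer(" ".join(top_docs), max_length=200, min_length=40)[0]["summary_text"]
```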
Truth, however, is the most problematic idea in philosophy, and simply introducing the concept of ‘truth’ erodes the truth. (E.g. it makes possible the statement “I am lying now” for the logician, and lets The X-Files assert “the truth is out there”.)
The most interesting question, though, is how people have reacted to ChatGPT. I mean, Wolfram Alpha is a great question-answering system but it didn’t cause a panic for Google and Bing. In our strange times people might rather have a bullshit engine than a search engine, and under the sign of “the innovator’s dilemma”, search engines might rush to disrupt their own business as a result.
My patterns of search have changed due to ChatGPT. For example, when coding, instead of routinely looking up tutorials and Stack Overflow, I'll have ChatGPT open and run through issues step by step, digging down for background when I don't understand a concept. It's a different type of interaction than a Google search, but it effectively replaces a lot of searches.
Where it's not a replacement for search is up-to-date information, or finding specific domains when I forget the exact URL. For the former I usually have to add "hacker news" or "reddit" or "forum" to get quality info anyway.
Honestly I think it's overdelivered; it's shocking how useful it is. And crazy to think of what you could do with it right now, let alone as it grows and training options expand.
Around a week or two ago I would have said the Google-search-competition thing was bull; not now. Not that Google is in a bad place, but I don't think search will stay the same for long.
Depends on your expectations. ChatGPT (GPT-3) is pretty good. These are just public tests (previews); the best is yet to come. Bing "GPT" chat is still a demo. We might get a surprise.
Have you played with GPT-3 / ChatGPT? It’s much better than the Bing option right now. I think they rushed it to get ahead; the speed at which Microsoft is adopting all this stuff is really fast, but it’s very rough right now, and they did a lot that I feel hurts it compared to GPT/ChatGPT in its raw form.
I wanted to, but it seems they do not like me. I am stuck in an infinite loop of "Verify your email. We sent an email to: blah, blah, blah ..., resend email". The email never comes, even though I tried a few different email addresses.
P.S. OK, I finally managed to try it indirectly; frankly it is even worse. It was insanely over-patronizing with little actual substance. But I admit it makes one hell of a clueless manager who knows nothing but buzzwords. In this case it achieves human-like quality.
It is a little generic. What really helps is if you feed it more information about what you want: you have to teach it, tell it how you want it to behave and respond, and what input sources you want it to take notice of.
Ask it to present you with options, ask it to put the data in a table, ask it to ask you questions before answering, ask it to take on a certain persona, etc.
For example, when making goals this year I asked it to break them down into monthly increments, say where I needed to be at each point in the year, and put that into a table; it was quite helpful.
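If you're working through the API rather than the web UI, the same priming translates directly into a system message. A minimal sketch, assuming the pre-1.0 `openai` Python SDK; the prompt wording and goal are made up for illustration:

```python
# Prime the model with a persona, a response format (a table), and an
# instruction to ask clarifying questions before answering.
import openai  # assumes OPENAI_API_KEY is set in the environment

messages = [
    {"role": "system", "content": (
        "You are a pragmatic planning coach. Before answering, ask me up to "
        "three clarifying questions. Present the final plan as a table with "
        "columns: Month, Milestone, How to measure it."
    )},
    {"role": "user", "content": "Break my yearly goal into monthly increments."},
]

reply = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
print(reply.choices[0].message.content)
```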
I’ve not used Bing yet, but ChatGPT is all about feeding it the right data first, before telling it what you want, or else your answers will be generic and awful. I found this article about the Bing chat bot that might help:
That's not my experience. In fact it seems GPT is especially adept at metaphors and analogies, because they mirror the layout of its training data. It's very good at identifying related concepts; but try to ask it for a concept that has no relation to anything else, and it will stall out.
Q: Which kitchen metaphor does the large scale structure of our universe resemble?
> ...some scientists have suggested that the large scale structure of the universe resembles a sponge...
Q: What's an example of a recursive analogy with a layer of irony to it?
> An example of a recursive analogy with a layer of irony to it could be the statement "the world is a stage, and we are all just actors playing our parts." This analogy uses the concept of the world being a stage to describe the world itself, and then applies the same concept to individuals within the world, suggesting that their actions and experiences are just part of a larger performance. The ironic layer comes from the fact that, while the analogy may be true in some ways, it also implies that individuals do not have agency or control over their own actions and experiences, which is not necessarily the case.
Try asking if A is the same as B (even when they're not much related): what you get is an attempt to approximate the sameness somehow, which is confusing. I expect something like: A is not the same as B; instead, A is the same as C.
I think that's a really interesting point. Essentially, ChatGPT shows the loss of meaning. It can connect words to words, but it doesn't know what anything means. It has no categories to put ideas in. In fact, it doesn't even have ideas. All it has is words.
So A is the same as B just as much as A is the same as C, because A, B, and C are just words with no meaning to ChatGPT.
ChatGPT seems like the endpoint of certain lines of poststructuralist philosophy. There is no meaning to the text, only words. Words relate to other words, and that is all.
That's basically the conclusion Wolfram made in this excessively long article [0] he wrote (which is nonetheless worth a read):
> The specific engineering of ChatGPT has made it quite compelling. But ultimately (at least until it can use outside tools) ChatGPT is “merely” pulling out some “coherent thread of text” from the “statistics of conventional wisdom” that it’s accumulated. But it’s amazing how human-like the results are. And as I’ve discussed, this suggests something that’s at least scientifically very important: that human language (and the patterns of thinking behind it) are somehow simpler and more “law like” in their structure than we thought. ChatGPT has implicitly discovered it. But we can potentially explicitly expose it, with semantic grammar, computational language, etc.
I’m eagerly watching this unfold. Google is already magic: I type what I want and it’s usually on the first page.
The question extraction that’s already built in is fantastic.
Well, no, because I want to know where the information comes from. Even if it’s “some dude from the Interwebs”, I don’t want a bot presenting Stack Overflow posts as if they were facts.
Man, it can summarize a lot of content. For example, for a YouTube video you haven't played yet, it will first tell you the approximate content of the video; for a Google search, it will summarize your search results, combined with ChatGPT. If you are interested, you can try this plug-in I developed. It's free:
http://glarity.app/