DALL·E I can see obvious uses for, and GPT tends to be similarly impressive, but I don't understand whether it's 'just' interesting research, a 'seeing what we can do' sort of thing, or whether people actually see real-world use cases for it?
The closest to it was perhaps that code-generating demo here a day or two ago, but who wants to be a 'GPT programmer', writing code as 'write a Python program that computes fizzbuzz replacing the arguments $fizz$ and $buzz$, ...' instead of just the 'actual' code? It just seems like a more clever AppleScript to me: pseudocode. And I don't think anybody has ever seriously pursued a flexible, keyword-based pseudocode-like language as a goal in itself; it has only appeared as a demo of more general models.
Generating template/outline text I suppose? (Like that essay-writing helper here a few days ago.)
To answer you in a very literal sense, GPT-3 is currently powering GitHub Copilot. It's an actual launched product for $10/month. That's going to be booster rockets for the on-ramp to becoming a coder, and there is evidence it can help all coders be more productive.
As to what else future language models could power: based on my own use, I think fine-tuned models could probably handle most customer support, accelerate the creation of most web content, take over quite a bit of paralegal grunt work, and power highly interactive game NPCs, as in AI Dungeon, another launched and paid product based on GPT-3.
I also think there are some companies that use GPT-3 for analyzing text, like reviews or posts (for analytics: e.g., do people talk about a product in a positive manner?).
To add, GitHub Copilot is a really clever autocomplete that makes some mundane tasks much quicker. Things that are too small for a library but still come up fairly often can be "typed" more quickly.
I actually have a production use case that GPT-3 solves: an NSFW filter for chat messages. Although there are open-source filters out there for Node.js, all the ones I tested fail in various ways. For example, they miss well-known emoji combos that indicate sexual activity, and semantically sexual text that seems inoffensive enough based on the words alone. For example, "I bet you have the tastiest hot pocket". Unless the chat is about cheap grocery food, that is probably a sexual reference.
I'm assuming there are existing solutions: paid services, or Python libraries undoubtedly. But it is a lot easier for me to take 10 minutes of my time to put together a prompt and add the API endpoint to my Node.js app. As far as cost goes, our needs are low-volume so it doesn't really matter.
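The "10 minutes" version of that filter might look something like the sketch below. It assumes the OpenAI completions endpoint and an `OPENAI_API_KEY` environment variable; the few-shot examples and prompt wording are illustrative, not the commenter's actual prompt.

```javascript
// Minimal sketch of a GPT-3-backed NSFW classifier for chat messages.
// Prompt wording and examples are made up for illustration.

function buildPrompt(message) {
  return [
    'Classify each chat message as SAFE or NSFW.',
    'Treat innuendo and suggestive emoji combinations as NSFW.',
    'Message: "Want to grab lunch later?" -> SAFE',
    'Message: "I bet you have the tastiest hot pocket" -> NSFW',
    `Message: ${JSON.stringify(message)} ->`,
  ].join('\n');
}

async function isNsfw(message) {
  // Assumes Node 18+ (global fetch) and a valid API key in the environment.
  const res = await fetch('https://api.openai.com/v1/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'text-davinci-003',
      prompt: buildPrompt(message),
      max_tokens: 5,
      temperature: 0,
    }),
  });
  const data = await res.json();
  return data.choices[0].text.trim() === 'NSFW';
}
```

Temperature 0 keeps the label deterministic, and the few-shot examples are where the "hot pocket" style edge cases get encoded.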
I see GPT-3's great utility here as a replacement for Mechanical Turk-type tasks. MT is a real headache to set up and manage programmatically; GPT-3 is pretty simple once you integrate it into your system. And the fascinating thing is that whole new realms of functionality are opened up to product ideation.
I haven't tried to do anything beyond low-volume tasks that are Mechanical-Turkable, but with GPT-4 I think we'll see cost drop and performance rise to the point where these things can be done at scale. At that point, software engineers would be foolish not to make GPT-4 just another standard library at their disposal, kind of like how jQuery opened up web development by providing a generic interface to DOM manipulation across all browsers.
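One way to picture the "just another standard library" idea: Mechanical-Turk-style labeling tasks become prompt templates behind a single helper. The task wording below is made up, and `complete` stands in for whatever GPT API client the app already uses.

```javascript
// Hypothetical sketch: MT-style tasks as prompt templates behind one helper,
// so GPT calls look like ordinary library calls.

const tasks = {
  sentiment: (text) =>
    `Label the sentiment of this review as POSITIVE, NEGATIVE or MIXED.\nReview: ${text}\nLabel:`,
  categorize: (text) =>
    `Name the product category this post is about, in one word.\nPost: ${text}\nCategory:`,
};

// `complete` is a stand-in for an async GPT client: prompt in, text out.
async function runTask(complete, name, text) {
  const raw = await complete(tasks[name](text));
  return raw.trim();
}
```

With a real client plugged in, `runTask(client, 'sentiment', review)` replaces what would otherwise be a batch of Mechanical Turk HITs, and adding a new "task" is one more template.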
It is just as hard for us to imagine what kind of development GPT-4 will enable as it was for people to imagine an internet of web apps before jQuery. But it certainly will.
Imagine you enter a bunch of raw facts, like a bullet-point list, and the tool converts them into beautiful prose, producing different output for different audiences.
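As a concrete sketch of that idea: the same fact list becomes two different prompts, one per audience. The prompt wording, audience labels, and example facts are all made up for illustration.

```javascript
// Hypothetical prompt builder: raw facts in, audience-specific prompt out.
// Feeding each prompt to a language model would yield the tailored prose.

function draftPrompt(facts, audience) {
  return [
    `Rewrite the following facts as polished prose for ${audience}.`,
    'Facts:',
    ...facts.map((f) => `- ${f}`),
    'Prose:',
  ].join('\n');
}

// Usage: the same facts, two different target audiences.
const facts = ['Q3 revenue grew 12%', 'Two new offices opened'];
const forExecs = draftPrompt(facts, 'an executive summary');
const forBlog = draftPrompt(facts, 'a casual company blog post');
```

The bullet list is the single source of truth; only the audience line changes between the two prompts.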