App and layout tools
- HTML layout generator
- Creating app design from a description
- React todo list
- React component based on description
- React component based on variable name alone
- GPT-3 generating color scales from a color name or emoji
- Website generation in Figma from a description
Search and data analysis
- Question answering and search engine
- Augmenting information in tables
- Creating charts from a description
- Natural-language interface to spreadsheet by generating code
- Generating and iteratively updating graphs
- Guessing the movie/TV show from a description
Program generation and analysis
- Translating natural language into shell commands
- Reading code and responding to questions about it
- Generating LaTeX from a description
- Generating SQL code 1
- Generating SQL code 2
- Coding interview
- Generating Python
- Generating database-specific SQL code
- AI Inception: GPT-3 generating machine learning code
- Most Recommended Books: GPT-3 based book recommendations
- Translating into several languages
- Write this like an attorney
- Automatically generating Request for Admissions
- Writing full emails from key points
- Simplifying legal language
- Iteratively drafted non-literal poetry translation with annotations
- Rephrasing sentences to be more polite
- Summarizing famous people's thoughts
- Priming GPT-3 to Speak like Any Big Five Personality
- Content creation for marketing
- Generating memes
- Writing Google ads
- Generating presentations
- Food recipe maker
- "How to recruit board members"
- Shakespeare-style poetry generation
- Generate a quiz on any topic and evaluate students' answers
- Generating history questions, with answers
- Text completion and style rewriting
What is OpenAI GPT-3?
GPT-3 is a neural-network-powered language model. A language model predicts how likely a given sentence is to exist in the world. For instance, a language model can identify the sentence "I take my dog for a walk" as more probable to appear (say, on the Internet) than the sentence "I take my banana for a walk." This holds true not only for sentences but also for phrases and, more generally, any sequence of characters.
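To make the idea concrete, here is a toy sketch of a language model: a smoothed bigram model trained on a tiny hand-written corpus (a stand-in for "text on the Internet"). It is nothing like GPT-3 internally, but it shows how a model can score the "dog" sentence as more plausible than the "banana" one.

```python
import math
from collections import Counter

# A tiny toy corpus standing in for "text on the Internet".
corpus = (
    "i take my dog for a walk . "
    "she takes her dog for a walk . "
    "we take the dog for a walk every day ."
).split()

# Count adjacent word pairs (bigrams) to estimate P(next word | previous word).
pairs = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def score(sentence, alpha=0.1):
    """Smoothed bigram log-probability: higher means 'more plausible'."""
    words = sentence.lower().split()
    total = 0.0
    for prev, word in zip(words, words[1:]):
        # Additive smoothing so unseen pairs get a small nonzero probability.
        prob = (pairs[(prev, word)] + alpha) / (unigrams[prev] + alpha * len(unigrams))
        total += math.log(prob)
    return total

print(score("i take my dog for a walk") > score("i take my banana for a walk"))
# On this corpus, the bigram model prefers the "dog" sentence.
```

GPT-3 plays the same scoring game, just with a vastly larger corpus and 175 billion parameters instead of a bigram table.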
Like most language models, GPT-3 is trained on an unlabeled text dataset (in this case, the training data includes, among other sources, Common Crawl and Wikipedia). Words or phrases are randomly removed from the text, and the model must learn to fill them in using only the surrounding words as context. It's a simple training task that produces a powerful and generalizable model.
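The self-supervised setup described above can be sketched in a few lines: take any raw sentence, hide one word, and the (context, hidden word) pair becomes a free training example. This is a toy illustration of the objective, not GPT-3's actual data pipeline.

```python
import random

def make_training_example(text, rng):
    """Hide one word from a sentence; the model's job is to predict it
    from the surrounding context. No human labels are needed."""
    words = text.split()
    i = rng.randrange(len(words))
    target = words[i]
    context = words[:i] + ["____"] + words[i + 1:]
    return " ".join(context), target

rng = random.Random(0)  # seeded for reproducibility
context, target = make_training_example(
    "the quick brown fox jumps over the lazy dog", rng
)
print(context, "->", target)
```

Because any text yields training pairs this way, the dataset is effectively as large as the corpus itself.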
The GPT-3 model architecture itself is a transformer-based neural network. This architecture became popular around two to three years ago and is the basis for the popular NLP model BERT and for GPT-3's predecessor, GPT-2. From an architecture perspective, GPT-3 is not actually very novel! So what makes it so special and magical?
IT'S REALLY BIG. I mean really big. With 175 billion parameters, it's the largest language model ever created (an order of magnitude larger than its nearest competitor!), and it was trained on the largest dataset of any language model. This, it seems, is the main reason GPT-3 sounds so impressively "smart" and human.
But here's the really magical part. As a result of its humongous size, GPT-3 can do what no other model can do (well): perform specific tasks without any special tuning. You can ask GPT-3 to be a translator, a programmer, a poet, or a famous author, and it can do it with its user (you) supplying fewer than 10 training examples. Damn.
This is what makes GPT-3 so exciting to machine learning practitioners. Other language models (like BERT) require an elaborate fine-tuning step in which you gather thousands of examples of (say) French-English sentence pairs to teach the model how to translate. To adapt BERT to a specific task (like translation, summarization, spam detection, and so on), you have to go out and find a large training dataset (on the order of thousands or tens of thousands of examples), which can be cumbersome or sometimes impossible, depending on the task. With GPT-3, you don't need that fine-tuning step. This is the heart of it. This is what gets people excited about GPT-3: custom language tasks without training data.
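In practice, those "fewer than 10 training examples" are simply written into the prompt itself. Here is a generic sketch of assembling such a few-shot prompt for translation (the formatting conventions here are illustrative, not an official OpenAI template); the resulting string would be sent to the model, which completes the final line.

```python
def few_shot_prompt(task_description, examples, query):
    """Build a few-shot prompt: a task description, a handful of
    input -> output examples, then a new input left for the model to
    complete. No gradient updates, no fine-tuning -- the 'training
    examples' live entirely in the prompt text."""
    lines = [task_description, ""]
    for source, translation in examples:
        lines.append(f"English: {source}")
        lines.append(f"French: {translation}")
        lines.append("")
    lines.append(f"English: {query}")
    lines.append("French:")  # the model fills in what comes next
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Translate English to French.",
    [("Hello, how are you?", "Bonjour, comment allez-vous ?"),
     ("I love cheese.", "J'adore le fromage.")],
    "Where is the library?",
)
print(prompt)
```

Swapping the examples swaps the task: the same mechanism turns GPT-3 into a summarizer, a code generator, or a poet.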
Today, GPT-3 is in private beta, but boy, I can't wait to get my hands on it.