
GPT-3

July 27, 2020 · 2 min read

The biggest hype in AI right now is GPT-3, which many are hailing as a major breakthrough in deep learning.

GPT-3 is a text-generation AI that mimics human writing with seemingly human-level comprehension.

I'm not an expert in this domain, so consider this a summary of the interesting discussions I have read.

Here's what GPT-3 can do:

  • Write poems, emails, and music
  • Generate code
  • Write articles
  • Answer comprehension questions about a book
  • Generate summaries and insights from industry reports

What GPT-3 is:

GPT-3 is the latest iteration of the language model from OpenAI, the lab co-founded by Elon Musk and Sam Altman. It is trained with unsupervised learning, a type of machine learning that needs no labelled data.

GPT stands for Generative Pretrained Transformer:

  • Generative: text generation
  • Pretrained: trained on text from across the Internet, including digitised books, Wikipedia, and coding tutorials
  • Transformer: a neural network architecture introduced by Google in 2017 (a minimal sketch of its core operation follows this list)
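To give a rough feel for what a Transformer actually computes, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of the architecture. Real models like GPT-3 add learned projections, multiple attention heads, and many stacked layers on top of this.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core Transformer operation (Vaswani et al., 2017): each output row
    is a weighted average of the value vectors V, weighted by how well
    the corresponding query matches each key."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V

# Toy example: 4 tokens with 8-dimensional embeddings attending to each other
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
print(out.shape)                             # (4, 8)
```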

The killer feature here is few-shot learning (a type of meta-learning in machine learning): given a few examples of what you want to produce, the model can draw on the vast knowledge it has already learned and generalise it to perform your specific task.

Think of GPT-3 as hiring an intern with multiple PhDs who can quickly learn anything.
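As a minimal sketch of what few-shot prompting looks like in practice, here is an illustrative call to GPT-3 through the original (pre-1.0) openai Python client. The translation task, engine name, and parameter values are assumptions for the example, not a prescription.

```python
import openai  # the original pre-1.0 client: pip install "openai<1.0"

openai.api_key = "sk-..."  # placeholder; use your own API key

# Few-shot prompt: two worked examples teach the model the task format,
# then it is asked to complete a third in the same pattern.
prompt = """English: Good morning
French: Bonjour

English: Thank you very much
French: Merci beaucoup

English: Where is the train station?
French:"""

response = openai.Completion.create(
    engine="davinci",  # GPT-3's base engine at launch; names have since changed
    prompt=prompt,
    max_tokens=20,
    temperature=0.3,
    stop="\n",         # stop once the model finishes the answer line
)
print(response.choices[0].text.strip())  # e.g. "Où est la gare ?"
```

Note that nothing was fine-tuned here: the two examples in the prompt are the entire "training" for the task.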

GPT-3 is an unprecedentedly massive model:

  • it has 175 billion parameters. Parameters are the individual weights and biases in the node connections of a neural network, which the model learns and optimises during training. For comparison, its predecessor GPT-2 has 1.5 billion parameters. Size matters in language models (see the parameter-counting sketch after this list).
  • it has 96 layers in the neural network
  • it reportedly cost about $12 million to train
  • it was evaluated on a number of NLP benchmarks and achieves state-of-the-art performance
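To make the parameter count concrete, here is a back-of-the-envelope sketch using the dimensions reported in the GPT-3 paper (96 layers, model width 12288). It counts only the feed-forward blocks, which alone account for roughly two thirds of the 175 billion total.

```python
def dense_params(n_in, n_out):
    """A fully connected layer: one weight per connection, one bias per output."""
    return n_in * n_out + n_out

d_model, n_layers = 12288, 96  # dimensions reported in the GPT-3 paper

# Each Transformer layer contains a feed-forward block that expands the
# width to 4 * d_model and then projects it back down.
ffn_per_layer = dense_params(d_model, 4 * d_model) + dense_params(4 * d_model, d_model)

print(f"feed-forward params per layer: {ffn_per_layer:,}")        # ~1.2 billion
print(f"across {n_layers} layers: {ffn_per_layer * n_layers:,}")  # ~116 billion

# Attention projections and embeddings make up most of the remaining
# ~59 billion, bringing the total to roughly 175 billion.
```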

What GPT-3 is not:

1 - artificial general intelligence (AGI). 

There are two types of AI: general AI and narrow AI. Narrow AI does only one specific task very well, while general AI is more sophisticated and can perform any generalised task like a human.

So far, all AI is narrow AI, and we still have some way to go to reach AGI.

While GPT-3 is great at generalising text-based tasks, its magic comes from text prediction and pattern recognition, not from coherent comprehension and reasoning.

It also has not passed the Turing Test, the benchmark for whether a machine achieves human-level intelligence. Google futurist Ray Kurzweil predicted that AI will pass the Turing Test in 2029.

2 - a technological breakthrough

AI experts do not consider GPT-3 a revolutionary advancement in AI. The technology behind GPT-3 is similar to that of its predecessors and other leading models; the main difference is its massive scale and training.

What is interesting about GPT-3 is that it is an unprecedented experiment in testing the scaling laws of neural networks: if a good neural network is made 10 times bigger, will it be 10 times smarter? And is there a limit or a point of diminishing returns to scaling?
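As a sketch of what such a scaling law looks like, the snippet below evaluates the power-law fit from OpenAI's own scaling-laws paper (Kaplan et al., 2020). The constants are that paper's fitted values, and the prediction is for language-modelling loss, not "smartness".

```python
N_C = 8.8e13     # fitted constant from Kaplan et al. (2020), non-embedding params
ALPHA_N = 0.076  # fitted exponent

def predicted_loss(n_params):
    """Predicted test loss (in nats) for a model with n_params parameters."""
    return (N_C / n_params) ** ALPHA_N

for n in (1.5e9, 15e9, 175e9):  # GPT-2, a hypothetical 10x model, GPT-3
    print(f"N = {n:.1e}: predicted loss ≈ {predicted_loss(n):.2f}")

# Each 10x increase in parameters shrinks the loss by a constant factor of
# 10 ** -0.076, i.e. only about 16%: better, but far from "10x smarter".
```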

3 - taking over the jobs of writers and programmers

While GPT-3 can potentially serve as a great tool for writing - especially automating marketing copy - it is no replacement for human intent, quirks, and ingenuity.

As for programmers: in a way, describing "a button that looks like a watermelon" to GPT-3 is a form of declarative coding. The high-level programming languages used these days are already closer semantically to human language than to machine language.

GPT-3 can help with syntax, but we still need programmers to solve problems, architect solutions, and give clear and coherent instructions to the computer.



Thank you for reading




