
Transfer Learning for Natural Language Processing

by Paul Azunre
4/5 on Goodreads
ISBN 9781617297267
Published in 2021
272 pages

Description

Build custom NLP models in record time by adapting pre-trained machine learning models to solve specialized problems.

In Transfer Learning for Natural Language Processing you will learn:

  • Fine-tuning pretrained models with new domain data
  • Picking the right model to reduce resource usage
  • Transfer learning for neural network architectures
  • Generating text with generative pretrained transformers
  • Cross-lingual transfer learning with BERT
  • Foundations for exploring NLP academic literature

Training deep learning NLP models from scratch is costly, time-consuming, and requires massive amounts of data. In Transfer Learning for Natural Language Processing, DARPA researcher Paul Azunre reveals cutting-edge transfer learning techniques that apply customizable pretrained models to your own NLP architectures. You’ll learn how to use transfer learning to deliver state-of-the-art results for language comprehension, even when working with limited labeled data. Best of all, you’ll save on training time and computational costs.
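
As a taste of the book's first topic, here is a minimal sketch of fine-tuning a pretrained model on new domain data using the Hugging Face transformers and datasets libraries; the model, dataset, and hyperparameters are illustrative assumptions, not examples taken from the book.

    # Illustrative sketch (not from the book): fine-tune a pretrained encoder
    # on a small labeled dataset with Hugging Face transformers.
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    model_name = "distilbert-base-uncased"   # assumed model choice
    dataset = load_dataset("imdb")           # assumed stand-in for "new domain data"

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True,
                         padding="max_length", max_length=256)

    tokenized = dataset.map(tokenize, batched=True)
    # Small subsets keep the run short; the same pattern scales to full data.
    train_ds = tokenized["train"].shuffle(seed=42).select(range(2000))
    eval_ds = tokenized["test"].shuffle(seed=42).select(range(500))

    args = TrainingArguments(output_dir="finetuned-distilbert",
                             num_train_epochs=1,
                             per_device_train_batch_size=16,
                             learning_rate=2e-5)

    Trainer(model=model, args=args,
            train_dataset=train_ds, eval_dataset=eval_ds).train()

Because the encoder arrives already pretrained, only a brief fine-tuning pass on the new domain is needed, which is the training-time and compute saving the description refers to.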