/dev/reading
Category: Natural Language Processing

9 books
Deep Learning for Natural Language Processing
by Stephan Raaijmakers

Explore the most challenging issues of natural language processing, and learn how to solve them with cutting-edge deep learning!

Inside Deep Learning for Natural Language Processing you’ll find a wealth of NLP insights, including:

  • An overview of NLP and deep learning
  • One-hot text representations
  • Word embeddings
  • Models for textual similarity
  • Sequential NLP
  • Semantic role labeling
  • Deep memory-based NLP
  • Linguistic structure
  • Hyperparameters for deep NLP
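
Two of the representations listed above, one-hot vectors and word embeddings, can be contrasted in a few lines of Python. The sketch below is my own illustration, not code from the book; the tiny vocabulary and the random 4-dimensional embeddings stand in for vectors that would normally be learned from data.

    # Illustrative only: one-hot vectors vs. dense word embeddings (assumes numpy).
    import numpy as np

    vocab = ["cat", "dog", "car"]
    index = {word: i for i, word in enumerate(vocab)}

    # One-hot: a sparse vector with a single 1 at the word's vocabulary index.
    one_hot = np.eye(len(vocab))
    print(one_hot[index["cat"]])                      # [1. 0. 0.]

    # Embeddings: each word maps to a small dense vector; in a trained model,
    # similar words end up close together. Random values here, for shape only.
    rng = np.random.default_rng(0)
    embeddings = rng.normal(size=(len(vocab), 4))

    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(embeddings[index["cat"]], embeddings[index["dog"]]))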

Deep learning has advanced natural language processing to exciting new levels and powerful new applications! For the first time, computer systems can achieve "human" levels of summarizing, making connections, and other tasks that require comprehension and context.

Deep Learning for Natural Language Processing reveals the groundbreaking techniques that make these innovations possible. Stephan Raaijmakers distills his extensive knowledge into useful best practices, real-world applications, and the inner workings of top NLP algorithms.

Getting Started with Natural Language Processing
by Ekaterina Kochmar

Hit the ground running with this in-depth introduction to the NLP skills and techniques that allow your computers to speak human.

In Getting Started with Natural Language Processing you’ll learn about:

  • Fundamental concepts and algorithms of NLP
  • Useful Python libraries for NLP
  • Building a search algorithm
  • Extracting information from raw text
  • Predicting sentiment of an input text
  • Author profiling
  • Topic labeling
  • Named entity recognition

Getting Started with Natural Language Processing is an enjoyable and understandable guide that helps you engineer your first NLP algorithms. Your tutor is Dr. Ekaterina Kochmar, lecturer at the University of Bath, who has helped thousands of students take their first steps with NLP. Full of Python code and hands-on projects, each chapter provides a concrete example with practical techniques that you can apply right away. If you’re new to NLP and want to upgrade your applications with functions and features like information extraction, user profiling, and automatic topic labeling, this is the book for you.
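
As a taste of the hands-on style described above, here is a minimal sentiment-prediction sketch. It uses scikit-learn with an invented four-sentence training set purely for illustration; it is not code from the book.

    # Minimal sentiment-prediction sketch (assumes scikit-learn is installed).
    # The toy training data is invented for illustration only.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_texts = [
        "I loved this film",
        "great acting and a clever plot",
        "terrible, a complete waste of time",
        "boring and predictable",
    ]
    train_labels = ["positive", "positive", "negative", "negative"]

    # Bag-of-words features feeding a logistic-regression classifier.
    model = make_pipeline(CountVectorizer(), LogisticRegression())
    model.fit(train_texts, train_labels)

    print(model.predict(["what a great movie"]))      # expected: ['positive']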

Natural Language Processing for Hackers
Learn to build apps that can understand people
by George-Bogdan Ivanov

Natural Language Processing (NLP) is a collection of techniques to analyze, interpret, and create human-understandable text and speech. Advances in machine learning have pushed NLP to new levels of accuracy and uncanny realism.

Natural Language Processing for Hackers lays out everything you need to crawl, clean, build, fine-tune, and deploy natural language models from scratch—all with easy-to-read Python code.

Natural Language Processing with Python and spaCy
A Practical Introduction
by Yuli Vasiliev

Natural Language Processing with Python and spaCy will show you how to create NLP applications like chatbots, text-condensing scripts, and order-processing tools quickly and easily. You’ll learn how to leverage the spaCy library to extract meaning from text intelligently; determine the relationships between words in a sentence (syntactic dependency parsing); identify nouns, verbs, and other parts of speech (part-of-speech tagging); and sort proper nouns into categories like people, organizations, and locations (named entity recognition). You’ll even learn how to transform statements into questions to keep a conversation going.
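
The core tasks named above can be sketched in a few lines of spaCy. This is an illustration rather than code from the book, and it assumes spaCy and its small English model (en_core_web_sm) are installed.

    # Minimal spaCy sketch: part-of-speech tagging, dependency parsing, and NER.
    # Assumes: pip install spacy && python -m spacy download en_core_web_sm
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("Apple is opening a new office in Berlin next year.")

    for token in doc:
        # Part of speech and the syntactic relation to the token's head word.
        print(token.text, token.pos_, token.dep_, token.head.text)

    for ent in doc.ents:
        # Named entities with their predicted labels (ORG, GPE, DATE, ...).
        print(ent.text, ent.label_)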

You’ll also learn how to:

  • Work with word vectors to mathematically find words with similar meanings (Chapter 5)
  • Identify patterns within data using spaCy's built-in displaCy visualizer (Chapter 7)
  • Automatically extract keywords from user input and store them in a relational database (Chapter 9)
  • Deploy a chatbot app to interact with users over the internet (Chapter 11)

“Try This” sections in each chapter encourage you to practice what you’ve learned by expanding the book’s example scripts to handle a wider range of inputs, add error handling, and build professional-quality applications.

By the end of the book, you’ll be creating your own NLP applications with Python and spaCy.

Natural Language Processing with Transformers
Building Language Applications with Hugging Face
by Lewis Tunstall, Leandro von Werra and Thomas Wolf

Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book, now revised in full color, shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.

Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, who are among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them into your applications. You'll quickly learn a variety of tasks they can help you solve.

  • Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
  • Learn how transformers can be used for cross-lingual transfer learning
  • Apply transformers in real-world scenarios where labeled data is scarce
  • Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
  • Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
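
For a flavor of the library the book is built around, the sketch below runs two of the tasks listed above (text classification and named entity recognition) through the Hugging Face pipeline API. The default checkpoints it downloads are the library's choices, not the book's, and a PyTorch or TensorFlow backend is assumed.

    # Minimal Hugging Face Transformers sketch (assumes: pip install transformers
    # plus PyTorch or TensorFlow). Default models are downloaded on first use.
    from transformers import pipeline

    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformers make state-of-the-art NLP surprisingly accessible."))

    ner = pipeline("ner", aggregation_strategy="simple")
    print(ner("Hugging Face was founded in New York."))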

Practical Natural Language Processing
A Comprehensive Guide to Building Real-World NLP Systems
by Sowmya Vajjala, Bodhisattwa Majumder, Anuj Gupta and Harshit Surana

Many books and courses tackle natural language processing (NLP) problems with toy use cases and well-defined datasets. But if you want to build, iterate, and scale NLP systems in a business setting and tailor them for particular industry verticals, this is your guide. Software engineers and data scientists will learn how to navigate the maze of options available at each step of the journey.

Through the course of the book, authors Sowmya Vajjala, Bodhisattwa Majumder, Anuj Gupta, and Harshit Surana will guide you through the process of building real-world NLP solutions embedded in larger product setups. You’ll learn how to adapt your solutions for different industry verticals such as healthcare, social media, and retail.

With this book, you’ll:

  • Understand the wide spectrum of problem statements, tasks, and solution approaches within NLP
  • Implement and evaluate different NLP applications using machine learning and deep learning methods
  • Fine-tune your NLP solution based on your business problem and industry vertical
  • Evaluate various algorithms and approaches for NLP product tasks, datasets, and stages
  • Produce software solutions following best practices around release, deployment, and DevOps for NLP systems
  • Understand best practices, opportunities, and the roadmap for NLP from a business and product leader’s perspective

Real-World Natural Language Processing
Practical applications with deep learning
by Masato Hagiwara

In Real-world Natural Language Processing you will learn how to:

  • Design, develop, and deploy useful NLP applications
  • Create named entity taggers
  • Build machine translation systems
  • Construct language generation systems and chatbots
  • Use advanced NLP concepts such as attention and transfer learning

Real-world Natural Language Processing teaches you how to create practical NLP applications without getting bogged down in complex language theory and the mathematics of deep learning. In this engaging book, you’ll explore the core tools and techniques required to build a huge range of powerful NLP apps, including chatbots, language detectors, and text classifiers.

Transfer Learning for Natural Language Processing
by Paul Azunre

Build custom NLP models in record time by adapting pre-trained machine learning models to solve specialized problems.

In Transfer Learning for Natural Language Processing you will learn:

  • Fine-tuning pretrained models with new domain data
  • Picking the right model to reduce resource usage
  • Transfer learning for neural network architectures
  • Generating text with generative pretrained transformers
  • Cross-lingual transfer learning with BERT
  • Foundations for exploring NLP academic literature

Training deep learning NLP models from scratch is costly, time-consuming, and requires massive amounts of data. In Transfer Learning for Natural Language Processing, DARPA researcher Paul Azunre reveals cutting-edge transfer learning techniques that apply customizable pretrained models to your own NLP architectures. You’ll learn how to use transfer learning to deliver state-of-the-art results for language comprehension, even when working with limited labeled data. Best of all, you’ll save on training time and computational costs.
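
The general fine-tuning workflow described above can be compressed into a short sketch. The version below uses the Hugging Face transformers and datasets libraries as stand-ins (an assumption on my part, not the book's own code), with a small DistilBERT checkpoint and a subsample of the public IMDb sentiment dataset.

    # Minimal transfer-learning sketch: fine-tune a pretrained model on new data.
    # Assumes: pip install transformers datasets (and PyTorch).
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    checkpoint = "distilbert-base-uncased"            # small pretrained model
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

    dataset = load_dataset("imdb")                    # swap in your own domain data

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

    encoded = dataset.map(tokenize, batched=True)

    args = TrainingArguments(output_dir="out", num_train_epochs=1,
                             per_device_train_batch_size=16)
    trainer = Trainer(model=model, args=args,
                      train_dataset=encoded["train"].shuffle(seed=42).select(range(2000)),
                      eval_dataset=encoded["test"].select(range(500)))
    trainer.train()
    print(trainer.evaluate())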

Transformers for Natural Language Processing and Computer Vision
Explore Generative AI and Large Language Models with Hugging Face, ChatGPT, GPT-4V, and DALL-E 3
by Denis Rothman

Transformers for Natural Language Processing and Computer Vision, Third Edition, explores Large Language Model (LLM) architectures, applications, and various platforms (Hugging Face, OpenAI, and Google Vertex AI) used for Natural Language Processing (NLP) and Computer Vision (CV).

The book guides you through different transformer architectures to the latest Foundation Models and Generative AI. You’ll pretrain and fine-tune LLMs and work through different use cases, from summarization to implementing question-answering systems with embedding-based search techniques. You will also learn the risks of LLMs, from hallucinations and memorization to privacy, and how to mitigate such risks using moderation models with rule and knowledge bases. You’ll implement Retrieval Augmented Generation (RAG) with LLMs to improve the accuracy of your models and gain greater control over LLM outputs.
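
The retrieval half of RAG fits in a short sketch. The snippet below is my own illustration, not the book's code: it embeds a few invented documents with sentence-transformers, retrieves the passage closest to a query, and builds a grounded prompt, leaving the actual LLM call to whichever platform you use.

    # Minimal RAG retrieval sketch (assumes: pip install sentence-transformers).
    # Documents and query are invented; the LLM generation step is omitted.
    from sentence_transformers import SentenceTransformer, util

    docs = [
        "The warranty covers manufacturing defects for two years.",
        "Returns are accepted within 30 days with the original receipt.",
        "Shipping to EU countries takes three to five business days.",
    ]
    model = SentenceTransformer("all-MiniLM-L6-v2")
    doc_vecs = model.encode(docs, convert_to_tensor=True)

    query = "How long do I have to return an item?"
    query_vec = model.encode(query, convert_to_tensor=True)

    scores = util.cos_sim(query_vec, doc_vecs)[0]     # cosine similarity per doc
    best = docs[int(scores.argmax())]

    # Ground the generation step in the retrieved passage (send to your LLM).
    prompt = f"Answer using only this context:\n{best}\n\nQuestion: {query}"
    print(prompt)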

Dive into generative vision transformers and multimodal model architectures and build applications, such as image and video-to-text classifiers. Go further by combining different models and platforms and learning about AI agent replication.

This book provides you with an understanding of transformer architectures, pretraining, fine-tuning, LLM use cases, and best practices.

What you will learn

  • Learn how to pretrain and fine-tune LLMs
  • Learn how to work with multiple platforms, such as Hugging Face, OpenAI, and Google Vertex AI
  • Learn about different tokenizers and the best practices for preprocessing language data
  • Implement Retrieval Augmented Generation and rule bases to mitigate hallucinations
  • Visualize transformer model activity for deeper insights using BertViz, LIME, and SHAP
  • Create and implement cross-platform chained models, such as HuggingGPT
  • Go in-depth into vision transformers with CLIP, DALL-E 2, DALL-E 3, and GPT-4V

Who this book is for

This book is ideal for NLP and CV engineers, software developers, data scientists, machine learning engineers, and technical leaders looking to advance their LLM and generative AI skills or explore the latest trends in the field. Knowledge of Python and machine learning concepts is required to fully understand the use cases and code examples. However, with examples using LLM user interfaces, prompt engineering, and no-code model building, this book is great for anyone curious about the AI revolution.