
Elements of Dimensionality Reduction and Manifold Learning [electronic resource] /

By:
Contributor(s):
Material type: Text
Publisher: Cham : Springer International Publishing : Imprint: Springer, 2023
Edition: 1st ed. 2023
Description: XXVIII, 606 p. 59 illus., 32 illus. in color. online resource
Content type:
  • text
Media type:
  • computer
Carrier type:
  • online resource
ISBN:
  • 9783031106026
Subject(s):
Additional physical formats: Printed edition: No title; Printed edition: No title; Printed edition: No title
DDC classification:
  • 006.3 23
LOC classification:
  • Q334-342
  • TA347.A78
Online resources:
Contents:
Chapter 1: Introduction -- Part 1: Preliminaries and Background -- Chapter 2: Background on Linear Algebra -- Chapter 3: Background on Kernels -- Chapter 4: Background on Optimization -- Part 2: Spectral Dimensionality Reduction -- Chapter 5: Principal Component Analysis -- Chapter 6: Fisher Discriminant Analysis -- Chapter 7: Multidimensional Scaling, Sammon Mapping, and Isomap -- Chapter 8: Locally Linear Embedding -- Chapter 9: Laplacian-based Dimensionality Reduction -- Chapter 10: Unified Spectral Framework and Maximum Variance Unfolding -- Chapter 11: Spectral Metric Learning -- Part 3: Probabilistic Dimensionality Reduction -- Chapter 12: Factor Analysis and Probabilistic Principal Component Analysis -- Chapter 13: Probabilistic Metric Learning -- Chapter 14: Random Projection -- Chapter 15: Sufficient Dimension Reduction and Kernel Dimension Reduction -- Chapter 16: Stochastic Neighbour Embedding -- Chapter 17: Uniform Manifold Approximation and Projection (UMAP) -- Part 4: Neural Network-based Dimensionality Reduction -- Chapter 18: Restricted Boltzmann Machine and Deep Belief Network -- Chapter 19: Deep Metric Learning -- Chapter 20: Variational Autoencoders -- Chapter 21: Adversarial Autoencoders.
In: Springer Nature eBook
Summary: Dimensionality reduction, also known as manifold learning, is an area of machine learning used for extracting informative features from data, for better representation of the data or for separation between classes. This book presents a cohesive review of linear and nonlinear dimensionality reduction and manifold learning. Three main aspects of dimensionality reduction are covered: spectral, probabilistic, and neural network-based dimensionality reduction, which take geometric, probabilistic, and information-theoretic points of view, respectively. The necessary background and preliminaries on linear algebra, optimization, and kernels are also explained to ensure a comprehensive understanding of the algorithms. The tools introduced in this book can be applied to various applications involving feature extraction, image processing, computer vision, and signal processing. The book is aimed at a wide audience who would like to acquire a deep understanding of the various ways to extract, transform, and understand the structure of data: academics, students, and industry professionals. Academic researchers and students can use this book as a textbook for machine learning and dimensionality reduction; data scientists, machine learning scientists, computer vision scientists, and computer scientists can use it as a reference. It can also be helpful to statisticians in the field of statistical learning and to applied mathematicians in the fields of manifolds and subspace analysis. Industry professionals, including applied engineers, data engineers, and engineers in various fields of science dealing with machine learning, can use it as a guidebook for feature extraction from their data, as raw data in industry often require preprocessing.
The book is grounded in theory but provides thorough explanations and diverse examples to improve the reader's comprehension of the advanced topics. Advanced methods are explained step by step so that readers of all levels can follow the reasoning and come to a deep understanding of the concepts. The book does not assume an advanced theoretical background in machine learning and provides the necessary background, although an undergraduate-level background in linear algebra and calculus is recommended.
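As a flavor of the spectral methods the book surveys (Part 2), the simplest case is principal component analysis: center the data, eigendecompose the sample covariance, and project onto the leading eigenvectors. The sketch below is an illustration in plain NumPy, not code from the book; the toy data and function name are invented for the example.

```python
import numpy as np

def pca(X, k):
    """Project an (n x d) data matrix X onto its top-k principal components."""
    Xc = X - X.mean(axis=0)               # center the data
    C = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix (d x d)
    _, eigvecs = np.linalg.eigh(C)        # eigenvectors, ascending eigenvalue order
    W = eigvecs[:, ::-1][:, :k]           # top-k eigenvectors as columns
    return Xc @ W                         # (n x k) low-dimensional embedding

# Toy data: 200 points lying near a 1-D line embedded in 3-D space.
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
X = t @ np.array([[1.0, 2.0, 3.0]]) + 0.01 * rng.normal(size=(200, 3))
Z = pca(X, 1)
print(Z.shape)  # (200, 1)
```

Because the toy data are essentially one-dimensional, the single retained component captures nearly all of the variance; the nonlinear methods in the book generalize this idea to data lying near curved manifolds.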
No physical items for this record



© 2024 IIIT-Delhi, library@iiitd.ac.in