---
product_id: 148137089
title: "Deep Learning (The MIT Press Essential Knowledge series)"
price: "€ 26.40"
currency: EUR
in_stock: true
reviews_count: 13
url: https://www.desertcart.be/products/148137089-deep-learning-the-mit-press-essential-knowledge-series
store_origin: BE
region: Belgium
---

# Deep Learning (The MIT Press Essential Knowledge series)

**Price:** € 26.40
**Availability:** ✅ In Stock

## Quick Answers

- **What is this?** Deep Learning (The MIT Press Essential Knowledge series)
- **How much does it cost?** € 26.40 with free shipping
- **Is it available?** Yes, in stock and ready to ship
- **Where can I buy it?** [www.desertcart.be](https://www.desertcart.be/products/148137089-deep-learning-the-mit-press-essential-knowledge-series)

## Best For

- Customers looking for quality international products

## Why This Product

- Free international shipping included
- Worldwide delivery with tracking
- 15-day hassle-free returns

## Description

*Deep Learning* (The MIT Press Essential Knowledge series) by John D. Kelleher. A concise introduction to deep learning published by The MIT Press (September 10, 2019; 296 pages). Free shipping on qualifying offers.


## Technical Specifications

| Specification | Value |
|---------------|-------|
| Best Sellers Rank | #125,713 in Books; #13 in Computer Vision & Pattern Recognition; #33 in Computer Neural Networks; #269 in Artificial Intelligence & Semantics |
| Customer Reviews | 4.5 out of 5 stars (481 ratings) |
| Dimensions | 5.06 x 0.75 x 7 inches |
| Edition | Illustrated |
| ISBN-10 | 0262537559 |
| ISBN-13 | 978-0262537551 |
| Item Weight | 8 ounces |
| Language | English |
| Part of series | MIT Press Essential Knowledge |
| Print length | 296 pages |
| Publication date | September 10, 2019 |
| Publisher | The MIT Press |

## Images

![Deep Learning (The MIT Press Essential Knowledge series) - Image 1](https://m.media-amazon.com/images/I/61+NtYUWq4L.jpg)

## Customer Reviews

### ⭐⭐⭐⭐⭐ Excellent but not so easy.
*by A***S on February 8, 2020*

The back cover indicates "An accessible introduction to AI ..." OK, it is accessible if you have a pretty good background in calculus (including partial derivatives and the chain rule), regression, matrix algebra operations, advanced geometry, etc. You get the picture. But that is not the author's fault. This is the cognitive entry gate to understanding DNNs. You need a foundation going in. I have read several books on DNNs, and I taught myself how to develop such DNN models. Many of the books I had read before invariably combined some contextual material with some software code to get you going. Although many of these books were between good and very good, it was refreshing to pick up a book solely concentrated on making you understand the underlying math of DNNs. Be warned, the author does not leave a single stone unturned. If you are just after a high-level understanding of how DNNs work, maybe a couple of good articles on Medium will suffice. This book is a lot more than that. The author drills down on the subject. He also has a pretty original approach that is much more geometry-based than anything I had read elsewhere. He talks of mappings and different types of spaces. He represents a lot of decisions along two-dimensional graphs in ways I had not seen done by other authors. This book is very comparable and competitive with "Neural Networks: A Visual Introduction for Beginners" by Michael Taylor. And I think for those with a pretty good background in math, but below that of a college grad or master's in math, Taylor's book is much more accessible and actually teaches you a lot. However, while Taylor is a very good teacher at the introductory level, Kelleher is an excellent one at the more advanced level. Taylor and Kelleher approach the subject differently at different levels, and you will learn a lot from both. From Taylor, I got a pretty good understanding of DNNs.
And I got to develop some pretty good DNNs to explain and simulate the stock market (with only a mediocre level of success, so I still have to keep my day job). From Kelleher, I learned that the DNN structure I was using, which included sigmoid activation functions, was really outdated, and that I really have to learn how to develop DNNs that use long short-term memory (LSTM) with the rectified linear unit (ReLU) instead of sigmoid to improve my DNNs. This will be an ambitious undertaking, as I will have to graduate from a very simple R package (deepnet), which lets you code a DNN in essentially a single line of code with all the arguments you need to specify a traditional DNN, to Keras with TensorFlow in Python, a far more complex undertaking. Nevertheless, Kelleher imparted to me extensive theoretical knowledge on why I have to move away from sigmoid activation and towards ReLU with LSTM. Given that, I could not ask for more from Kelleher. He greatly raised my understanding of the subject. If you are in a similar boat, you will appreciate this book a lot. As you will see, or as you already know, learning DNNs is an ongoing process. There is no clear finish line. This is unlike many other model structures, such as ARIMA, ECM, VAR, etc., where what you see is what you get; those model structures have an end point. Once you reach it, you know and understand them. With DNNs, there is always a topic you thought you understood but uncover you actually do not, and there are a lot of subjects you don't even know of, as the field is evolving rapidly in ever more complex and diversified directions. I think DNNs will keep mathematicians busy for a pretty long time. And that is kind of exciting in itself. When you uncover a quantitative method that always seems to have room to evolve, it is pretty cool stuff.
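The reviewer's takeaway about replacing sigmoid with ReLU can be illustrated with a minimal sketch (plain Python, our illustration rather than anything from the book): the sigmoid's derivative is at most 0.25 and shrinks toward zero for large inputs, so gradients multiplied through many layers vanish, while ReLU's derivative is exactly 1 for any positive input.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of sigmoid; peaks at 0.25 when x == 0
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, 0 otherwise
    return 1.0 if x > 0 else 0.0

# Gradient magnitude surviving 10 stacked layers at pre-activation x = 2.0
deep_sigmoid = sigmoid_grad(2.0) ** 10   # tiny: the gradient vanishes
deep_relu = relu_grad(2.0) ** 10         # 1.0: the gradient is preserved
print(deep_sigmoid, deep_relu)
```

This is the vanishing-gradient effect that makes ReLU (and gated architectures like LSTM) the usual choice in deep networks today.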

### ⭐⭐⭐⭐⭐ a gentle, solid and modern introduction to deep learning
*by M***E on March 15, 2020*

The author has provided, in this book, a modern (as of 2019) introduction to deep learning. The focus of the book is on a limited number of topics, such as backpropagation, treated very deeply (but with few assumptions about technical preparation). In addition, Kelleher has given a pretty up-to-date perspective on the subject. In recent years, due to a number of factors, such as good matrix-calculation hardware, deep learning and neural networks have shot into the vanguard of interest for weak AI. Therefore, Kelleher's expert presentation, and careful "hand-holding" as he proceeds through some of the important topics, like the evolution of threshold functions, is particularly timely. I think the very minimal level of linear algebra and calculus necessary to grasp the technical aspects of his discussion makes this a very valuable book for a broad audience, such as software engineers at a beginning level in this area, and technical staff generally. Short of a good course, this summary overview is about the best one could hope for in a technical introduction at a high level. I strongly recommend this book as a very easy, short read that will be informative about some important basics. With the advent of software and hardware improvements over the next twenty or thirty years, like quantum computers, deep learning is very likely to remain a significant tool in many technical fields, including physics (which is my primary area of interest).
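For readers curious about the backpropagation topic this review highlights, here is a hedged single-neuron sketch (plain Python, our illustration, not code from the book): one sigmoid neuron trained by gradient descent, with the chain rule spelled out term by term, which is the core idea backpropagation scales up to whole networks.

```python
import math

# One sigmoid neuron: y_hat = sigmoid(w*x + b), loss = (y_hat - y)^2
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def backprop_step(w, b, x, y, lr=0.5):
    z = w * x + b
    y_hat = sigmoid(z)
    # Chain rule: dL/dw = dL/dy_hat * dy_hat/dz * dz/dw
    dL_dyhat = 2.0 * (y_hat - y)          # derivative of squared error
    dyhat_dz = y_hat * (1.0 - y_hat)      # derivative of sigmoid
    dL_dw = dL_dyhat * dyhat_dz * x       # dz/dw = x
    dL_db = dL_dyhat * dyhat_dz           # dz/db = 1
    return w - lr * dL_dw, b - lr * dL_db

w, b = 0.0, 0.0
for _ in range(200):
    w, b = backprop_step(w, b, x=1.0, y=1.0)
# After training, the neuron's output approaches the target of 1.0
```

In a real multi-layer network the same chain rule is applied layer by layer, propagating the error derivative backwards, hence the name.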

### ⭐⭐⭐⭐ Not an extensive book on the topic
*by E***A on April 12, 2025*

Read the sample before you buy this book. As it says, this is not a technical book. It's an introduction to those who are not technical on the subject.

## Frequently Bought Together

- Deep Learning (The MIT Press Essential Knowledge series)
- Machine Learning, revised and updated edition (The MIT Press Essential Knowledge series)
- Algorithms (The MIT Press Essential Knowledge series)

---

## Why Shop on Desertcart?

- 🛒 **Trusted by 1.3+ Million Shoppers** — Serving international shoppers since 2016
- 🌍 **Shop Globally** — Access 737+ million products across 21 categories
- 💰 **No Hidden Fees** — All customs, duties, and taxes included in the price
- 🔄 **15-Day Free Returns** — Hassle-free returns (30 days for PRO members)
- 🔒 **Secure Payments** — Trusted payment options with buyer protection
- ⭐ **TrustPilot Rated 4.5/5** — Based on 8,000+ happy customer reviews

**Shop now:** [https://www.desertcart.be/products/148137089-deep-learning-the-mit-press-essential-knowledge-series](https://www.desertcart.be/products/148137089-deep-learning-the-mit-press-essential-knowledge-series)

---

*Product available on Desertcart Belgium*
*Store origin: BE*
*Last updated: 2026-04-30*