Introduction to Deep Learning
Poo Kuan Hoong
19th July 2016
Data Science Institute
• The Data Science Institute is a research
center based in the Faculty of Computing
& Informatics, Multimedia University.
• Its members include experts from several faculties, such as the Faculty of Computing and Informatics, the Faculty of Engineering, the Faculty of Management and the Faculty of Information Science and Technology.
• It conducts research in leading data science areas, including stream mining, video analytics, machine learning, deep learning, next-generation data visualization and advanced data modelling.
Google DeepMind playing Atari Breakout
https://www.youtube.com/watch?v=V1eYniJ0Rnk
Google DeepMind AlphaGo
Baidu Eye
https://www.youtube.com/watch?v=Xe5RcJ1JY3c
Acknowledgement
Andrew Ng: Deep Learning,
Self-Taught Learning and
Unsupervised Feature
Learning [Youtube]
Yann LeCun: Deep
Learning Tutorial, ICML,
Atlanta, 2013 [PDF]
Geoff Hinton, Yoshua
Bengio & Yann LeCun:
Deep Learning: NIPS2015
Tutorial [PDF]
Yoshua Bengio: Theano: A Python
framework for fast computation of
mathematical expressions. [URL]
Andrej Karpathy: Visualizing and
Understanding Recurrent Networks, ICLR
2016, [PDF]
Outline
• A brief history of machine learning
• Understanding the human brain
• Neural Network: Concept, implementation and challenges
• Deep Belief Network (DBN): Concept and Application
• Convolutional Neural Network (CNN): Concept and Application
• Recurrent Neural Network (RNN): Concept and Application
• Deep Learning: Strengths, weaknesses and applications
• Deep Learning: Platforms, frameworks and libraries
• Demo
Introduction
• In the past 10 years, machine learning and artificial intelligence have shown tremendous progress.
• The recent success can be attributed to:
• Explosion of data
• Cheap computing cost – CPUs and GPUs
• Improvement of machine learning models
• Much of the current excitement concerns a subfield of machine learning called “deep learning”.
A brief history of Machine learning
• Most machine learning methods are based on supervised learning, following the pipeline below (a code sketch follows it)
Input → Feature Representation → Learning Algorithm
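To make this pipeline concrete, here is a minimal sketch; the toy data, the mean/standard-deviation features and the nearest-centroid classifier are all illustrative assumptions, not from the slides. Hand-crafted features are computed first, then handed to a learning algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "images": a bright class and a dark class (illustrative only)
bright = rng.normal(0.8, 0.1, size=(50, 64))
dark = rng.normal(0.2, 0.1, size=(50, 64))
X_raw = np.vstack([bright, dark])
y = np.array([1] * 50 + [0] * 50)

# Feature representation: hand-crafted features (mean and spread per image)
def extract_features(images):
    return np.column_stack([images.mean(axis=1), images.std(axis=1)])

X = extract_features(X_raw)

# Learning algorithm: a nearest-centroid classifier on those features
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(features):
    dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

print("training accuracy:", (predict(X) == y).mean())
```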
A brief history of Machine learning
[Figure: matrices of numeric feature values used as training input]
Trainable Feature Hierarchy
Traditional machine perception
• Hand-crafted feature extractors
Human Brain
Auditory Cortex
The auditory cortex can learn to see. (The same rewiring process also works for the touch/somatosensory cortex.)
Seeing with the tongue
Human Brain
[Figure: biological neuron vs. artificial neuron, with the labels Neuron/Unit and Weight]
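As a concrete counterpart to the figure, here is a minimal sketch of a single artificial neuron: each input is multiplied by a weight, the results are summed with a bias, and the total is passed through an activation function. The input values, weights and sigmoid activation below are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical inputs and weights for one neuron/unit
x = np.array([0.5, -1.2, 3.0])   # inputs arriving from the previous layer
w = np.array([0.4, 0.7, -0.2])   # one weight per incoming edge
b = 0.1                          # bias

output = sigmoid(np.dot(w, x) + b)   # weighted sum, then activation
print(output)
```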
Neural Network
• Deep Learning is primarily about neural networks, where a network is
an interconnected web of nodes and edges.
• Neural nets were designed to perform complex tasks, such as the task
of placing objects into categories based on a few attributes.
• Neural nets are highly structured networks with three kinds of layers – an input layer, an output layer, and so-called hidden layers, which are any layers between the input and the output layers.
• Each node (also called a neuron) in the hidden and output layers has
a classifier.
Neural Network
Neural Network: Forward Propagation
• The input neurons first receive the
data features of the object. After
processing the data, they send their
output to the first hidden layer.
• The hidden layer processes this output
and sends the results to the next
hidden layer.
• This continues until the data reaches
the final output layer, where the
output value determines the object's
classification.
• This entire process is known as Forward Propagation, or forward prop (a minimal code sketch follows).
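A minimal NumPy sketch of forward propagation through a single hidden layer; the layer sizes, random weights and sigmoid activation are assumptions made for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)

# Assumed network shape: 4 input features -> 3 hidden units -> 2 output units
W1, b1 = rng.normal(size=(3, 4)), np.zeros(3)
W2, b2 = rng.normal(size=(2, 3)), np.zeros(2)

def forward(x):
    h = sigmoid(W1 @ x + b1)     # input layer -> hidden layer
    out = sigmoid(W2 @ h + b2)   # hidden layer -> output layer
    return out

x = np.array([0.2, 0.7, -1.0, 0.5])  # data features of one object
print(forward(x))                    # output values determine the classification
```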
Neural Network: Backward Propagation
• To train a neural network over a large set of labelled data, you must
continuously compute the difference between the network’s
predicted output and the actual output.
• This difference is called the cost, and the process for training a net is
known as backpropagation, or backprop
• During backprop, weights and biases are tweaked slightly until the
lowest possible cost is achieved.
• An important aspect of this process is the gradient, which is a measure of how much the cost changes with respect to a change in a weight or bias value (see the toy example below).
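The cost/gradient/update loop can be shown with a deliberately tiny example: one weight, one labelled example, a squared-error cost and repeated small updates in the direction that lowers the cost. The data, learning rate and step count are assumptions for illustration.

```python
# Toy problem: learn w so that the prediction w * x matches the target y
x, y = 2.0, 6.0          # a single labelled example (assumed)
w = 0.0                  # initial weight
learning_rate = 0.05

for step in range(100):
    pred = w * x
    cost = (pred - y) ** 2        # difference between predicted and actual output
    grad = 2 * (pred - y) * x     # how much the cost changes w.r.t. the weight
    w -= learning_rate * grad     # tweak the weight slightly to reduce the cost

print(w)   # approaches 3.0, at which point pred = 3.0 * 2.0 = 6.0 and the cost is ~0
```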
The 1990s view of what was wrong with backpropagation
• It required a lot of labelled training data
• almost all data is unlabeled
• The learning time did not scale well
• It was very slow in networks with multiple hidden layers.
• It got stuck at local optima
• These were often surprisingly good but there was no good theory
Deep Belief Network (DBN)
• The Deep Belief Network, or DBN,
was also conceived by Geoff Hinton.
• Used by Google for their work on the
image recognition problem.
• A DBN is trained two layers at a time, with each pair of layers treated as a Restricted Boltzmann Machine (RBM).
• Throughout the net, the hidden layer of one RBM acts as the input layer of the next. So the first RBM is trained, and its outputs are then used as inputs to the next RBM. This procedure is repeated until the output layer is reached (a simplified sketch follows).
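The greedy layer-by-layer procedure can be sketched as below. This is a heavily simplified RBM (one step of contrastive divergence, no bias terms) trained on made-up binary data; it is meant only to show how each trained layer's hidden activations become the input of the next RBM, not to be a faithful DBN implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def train_rbm(data, n_hidden, epochs=50, lr=0.1):
    """Very simplified RBM training with one step of contrastive divergence (CD-1)."""
    W = rng.normal(0, 0.1, size=(data.shape[1], n_hidden))
    for _ in range(epochs):
        # Positive phase: visible -> hidden
        h_prob = sigmoid(data @ W)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # Negative phase: reconstruct the visible units, then the hidden units again
        v_recon = sigmoid(h_sample @ W.T)
        h_recon = sigmoid(v_recon @ W)
        W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
    return W

X = (rng.random((100, 16)) > 0.5).astype(float)   # hypothetical binary training data

# Greedy layer-wise training: the hidden activations of one RBM
# become the "visible" input of the next RBM in the stack.
W1 = train_rbm(X, n_hidden=8)
H1 = sigmoid(X @ W1)
W2 = train_rbm(H1, n_hidden=4)
# ...repeat until the output layer is reached, then fine-tune with labelled data.
```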
Deep Belief Network (DBN)
• DBN is capable of recognizing the inherent patterns in the data. In
other words, it’s a sophisticated, multilayer feature extractor.
• The unique aspect of this type of net is that each layer ends up
learning the full input structure.
• Layers generally learn progressively more complex patterns – for facial recognition, early layers could detect edges and later layers would combine them to form facial features.
• DBN learns the hidden patterns globally, like a camera slowly bringing
an image into focus.
• DBN still requires a set of labels to apply to the resulting patterns. As
a final step, the DBN is fine-tuned with supervised learning and a
small set of labeled examples.
Deep Neural Network (Deep Net)
Convolutional Neural Network (CNN)
• CNNs are inspired by the visual cortex.
• CNNs are deep nets used for image, object, and even speech recognition.
• Pioneered by Yann LeCun (NYU).
• Fully connected deep supervised networks are generally too difficult to train; the convolutional structure (local connections and shared weights) makes deep training practical.
• CNNs have multiple types of layers, the first of which is the convolutional layer.
Convolutional Neural Network (CNN)
• A series of filters forms layer one, called the convolutional layer. The weights and biases in this layer determine the effectiveness of the filtering process.
• In the figure, each flashlight represents a single neuron. Typically, neurons in a layer simply activate or fire; in the convolutional layer, by contrast, neurons search for patterns through convolution. Neurons from different filters search for different patterns, and thus they will process the input differently (a convolution sketch follows the figure).
[Figure: Filter 2 / Neuron 2, with weights W1 = 10, W2 = 5, W3 = 4]
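A minimal sketch of the convolution operation itself: a small filter (kernel) of weights slides over the input, and each neuron's response is the weighted sum of the patch it currently "shines its flashlight" on. The input values and the vertical-edge filter below are made up for illustration.

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid 2-D convolution (strictly speaking cross-correlation, as in most deep nets)."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i:i + kh, j:j + kw]
            out[i, j] = np.sum(patch * kernel)   # one neuron's weighted response
    return out

image = np.arange(25, dtype=float).reshape(5, 5)   # hypothetical 5x5 input
edge_filter = np.array([[1.0, 0.0, -1.0],          # a filter that responds to
                        [1.0, 0.0, -1.0],          # vertical intensity changes
                        [1.0, 0.0, -1.0]])
print(convolve2d(image, edge_filter))   # the 3x3 feature map produced by this filter
```

A different filter (different weights) applied to the same input would produce a different feature map, which is how neurons from different filters come to detect different patterns.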
Convolutional Neural Network (CNN)
CNN: Application
• Classify a scene in an image
• Image Classifier Demo (NYU): http://horatio.cs.nyu.edu/
• Describe or understanding an image
• Toronto Deep Learning Demo: http://deeplearning.cs.toronto.edu/i2t
• MIT Scene Recognition Demo: http://places.csail.mit.edu/demo.html
• Handwriting recognition
• Handwritten digits recognition:
http://cs.stanford.edu/people/karpathy/convnetjs/demo/mnist.html
• Video classification
• Large-scale Video Classification with Convolutional Neural Networks
http://cs.stanford.edu/people/karpathy/deepvideo/
Recurrent Neural Network (RNN)
• The Recurrent Neural Network (RNN) was advanced significantly by Sepp Hochreiter and Juergen Schmidhuber, most notably through the LSTM.
• RNNs have a feedback loop where the net’s output is fed back into the net along with the next input.
• RNNs receive an input and produce an output. Unlike other nets, the inputs and outputs can come in a sequence.
• A popular variant of the RNN is the Long Short-Term Memory (LSTM) network (a minimal recurrent-step sketch follows).
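A minimal sketch of the feedback loop: at each time step the previous hidden state is fed back in alongside the next input. The sizes, random weights and tanh activation are assumptions; an LSTM replaces this simple update with gated memory cells.

```python
import numpy as np

rng = np.random.default_rng(3)

n_input, n_hidden = 4, 6                          # assumed sizes
W_xh = rng.normal(0, 0.1, size=(n_hidden, n_input))
W_hh = rng.normal(0, 0.1, size=(n_hidden, n_hidden))
b_h = np.zeros(n_hidden)

def rnn_step(x_t, h_prev):
    """One recurrent step: the new state depends on the input AND the previous state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

sequence = rng.normal(size=(10, n_input))         # a sequence of 10 input vectors
h = np.zeros(n_hidden)                            # initial hidden state
for x_t in sequence:
    h = rnn_step(x_t, h)                          # state fed back with the next input
print(h)                                          # final state summarises the sequence
```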
RNN: Application
• RNN is suitable for time series data, where an output can be the next
value in a sequence, or the next several values
Examples: frame-by-frame image classification, image captioning, document classification
Deep Learning: Benefits
• Robust
• No need to design the features ahead of time – features are automatically
learned to be optimal for the task at hand
• Robustness to natural variations in the data is automatically learned
• Generalizable
• The same neural net approach can be used for many different applications
and data types
• Scalable
• Performance improves with more data, method is massively parallelizable
Deep Learning: Weaknesses
• Deep learning requires a large dataset and hence a long training period.
• In terms of cost, machine learning methods such as SVMs and tree ensembles can be deployed easily, even by relative machine learning novices, and can usually get you reasonably good results.
• Deep learning methods tend to learn everything. It is often better to encode prior knowledge about the structure of images (or audio or text).
• The learned features are often difficult to understand. Many vision features are also not really human-understandable (e.g., concatenations/combinations of different features).
• It requires a good understanding of how to model multiple modalities with traditional tools.
Deep Learning: Applications
Deep Learning: Application
https://deepmind.com/alpha-go http://places.csail.mit.edu/demo.html
Robotic grasping
Pedestrian detection using DL
Deep Learning Development Cycle
Deep Learning: Platforms, Frameworks & Libraries
Platform
• Ersatz Labs - cloud-based deep learning platform [http://www.ersatz1.com/]
• H2O – a deep learning framework that comes with R and Python interfaces [http://www.h2o.ai/verticals/algos/deep-learning/]
Framework
• Caffe - deep learning framework made with expression, speed, and modularity in mind.
Developed by the Berkeley Vision and Learning Center (BVLC)
[http://caffe.berkeleyvision.org/]
• Torch - scientific computing framework with wide support for machine learning
algorithms that puts GPUs first. Based on Lua programming language [http://torch.ch/]
Library
• TensorFlow – an open source software library for numerical computation using data flow graphs, from Google [https://www.tensorflow.org/]
• Theano – a Python library developed by Yoshua Bengio’s team [http://deeplearning.net/software/theano/]
Learned Models
• Trained Models can be shared with others
• Save the training time
• For example: AlexNet, GoogLeNet, ParseNet, etc.
• URLs:
• https://github.com/BVLC/caffe/wiki/Model-Zoo
• http://deeplearning4j.org/model-zoo
Nvidia: Digits
• The NVIDIA Deep Learning GPU Training System (DIGITS) puts the
power of deep learning in the hands of data scientists and
researchers.
• Quickly design the best deep neural network (DNN) for your data
using real-time network behavior visualization.
• https://developer.nvidia.com/digits
Car Park Images
Car park images
Cropped Car Park space
Digits – Image Classification Model
Digits – AlexNet Training
Digits – AlexNet Training
Digits – Testing and Validation
Digits – Results
Digits – Results
Digits – Results
Digits – Further Evaluation
Digits – Further Evaluation
Digits – Further Evaluation
Digits – try it out yourself
• https://github.com/NVIDIA/DIGITS/blob/master/docs/GettingStarted.
md
Thanks!
Questions?
@kuanhoong
https://www.linkedin.com/in/kuanhoong
khpoo@mmu.edu.my