9:00 AM - 5:00 PM
There are no prerequisites for this course. However, some experience with the Python programming language would be beneficial.
Anyone who wants to learn about neural networks and deep learning can attend this course.
Neural Networks with Deep Learning Training Course Overview
Neural networks are a set of algorithms, loosely modelled on the human brain, that are designed to recognise patterns. They interpret sensory data through a kind of machine perception, labelling or clustering raw input. The patterns they recognise are numerical and stored in vectors, into which all real-world data, whether text, images, or sound, must be translated. Neural networks can be thought of as a clustering and classification layer on top of the data being stored and managed.
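As a minimal illustration of the idea above (all values here are made up), a small grayscale "image" can be flattened into a numerical vector and scored by a single linear unit with a sigmoid, the simplest classification layer one might place on top of raw data:

```python
import numpy as np

# Hypothetical 4x4 grayscale "image": real-world data as raw numbers.
image = np.array([
    [0.0, 0.1, 0.9, 1.0],
    [0.0, 0.2, 0.8, 1.0],
    [0.1, 0.1, 0.9, 0.9],
    [0.0, 0.0, 1.0, 1.0],
])

x = image.flatten()              # translate the data into a numerical vector
w = np.full(x.shape, 0.5)        # illustrative weights (normally learned)
b = -2.0                         # illustrative bias

z = w @ x + b                    # weighted sum of the inputs
prob = 1.0 / (1.0 + np.exp(-z))  # sigmoid maps the score to a probability

print(x.shape)         # (16,)
print(round(prob, 3))  # 0.881
```

Real networks learn `w` and `b` from data rather than fixing them by hand, but the vectorisation step is the same.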
The Knowledge Academy’s Neural Networks with Deep Learning Training course will provide delegates with an understanding of deep learning and neural networks. Delegates will be familiarised with basic concepts of neural networks such as binary classification, logistic regression, derivatives, and vectorisation.
During this 1-day training course, delegates will be introduced to Python and Jupyter/IPython notebooks. Delegates will learn about shallow neural networks, including vectorised implementation, activation functions, and backpropagation intuition. In addition, delegates will gain knowledge of deep neural network concepts, including the deep L-layer neural network, deep representations, and forward and backward propagation.
- Delegate pack consisting of course notes and exercises
- Experienced Instructor
Neural Networks with Deep Learning Training Course Outline
Introduction to Deep Learning
- Introduction to Neural Networks
- Supervised Learning with Neural Networks
Neural Networks Fundamentals
- Binary Classification
- Logistic Regression
- Gradient Descent and Derivatives
- Computational Graph
- Introduction to Python
- Jupyter/IPython Notebooks
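The fundamentals above (binary classification, logistic regression, gradient descent) can be sketched in a few lines of Python. This is an illustrative example with a made-up toy dataset, not the course's exact material:

```python
import numpy as np

# Toy binary-classification data: inputs above 0.5 are labelled 1.
X = np.array([[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]])
y = np.array([0, 0, 0, 1, 1, 1])

w = np.zeros(1)   # weight, learned by gradient descent
b = 0.0           # bias
lr = 1.0          # learning rate
m = len(y)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    a = sigmoid(X @ w + b)       # predicted probabilities
    dz = a - y                   # derivative of cross-entropy loss w.r.t. z
    w -= lr * (X.T @ dz) / m     # gradient step for the weight
    b -= lr * dz.mean()          # gradient step for the bias

preds = (sigmoid(X @ w + b) > 0.5).astype(int)
print(preds.tolist())  # [0, 0, 0, 1, 1, 1]
```

The `dz = a - y` line is the compact form of the logistic-regression gradient that falls out of the computational graph.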
Shallow Neural Networks
- Representation of a Neural Network
- Computing the Output of a Neural Network
- Vectorised Implementation
- Activation Functions
- Derivatives of Activation Functions
- Gradient Descent for Neural Networks
- Backpropagation Intuition
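The vectorised forward pass of a shallow (one-hidden-layer) network can be sketched as follows. Shapes, sizes, and the tanh/sigmoid choice are illustrative assumptions:

```python
import numpy as np

np.random.seed(1)
n_x, n_h, m = 2, 4, 3             # input features, hidden units, examples

X = np.random.randn(n_x, m)       # examples stacked column-wise: (n_x, m)
W1 = np.random.randn(n_h, n_x) * 0.01
b1 = np.zeros((n_h, 1))
W2 = np.random.randn(1, n_h) * 0.01
b2 = np.zeros((1, 1))

Z1 = W1 @ X + b1                  # (n_h, m): all examples at once, no loop
A1 = np.tanh(Z1)                  # hidden-layer activation function
Z2 = W2 @ A1 + b2                 # (1, m)
A2 = 1.0 / (1.0 + np.exp(-Z2))    # sigmoid output: one probability per example

print(A2.shape)  # (1, 3)
```

Stacking examples as columns of `X` is what makes the implementation vectorised: one matrix multiplication replaces a Python loop over training examples.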
Deep Neural Networks
- Deep L-layer Neural Network
- Forward Propagation
- Deep Representations
- Building Blocks
- Backward Propagation
- Difference Between Parameters and Hyperparameters
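A deep L-layer network generalises the shallow case by looping the same linear-plus-activation building block over L layers. The sketch below assumes ReLU hidden layers and a sigmoid output; the layer sizes are illustrative hyperparameters, while the weights and biases are the parameters that training would learn:

```python
import numpy as np

np.random.seed(0)
layer_sizes = [3, 5, 4, 1]        # hyperparameters: L = 3 layers of weights

params = {}
for l in range(1, len(layer_sizes)):
    params[f"W{l}"] = np.random.randn(layer_sizes[l], layer_sizes[l - 1]) * 0.01
    params[f"b{l}"] = np.zeros((layer_sizes[l], 1))  # parameters (learned)

def forward(X, params, L):
    A = X
    for l in range(1, L):                          # hidden layers: linear + ReLU
        Z = params[f"W{l}"] @ A + params[f"b{l}"]
        A = np.maximum(0, Z)
    ZL = params[f"W{L}"] @ A + params[f"b{L}"]     # output layer: linear + sigmoid
    return 1.0 / (1.0 + np.exp(-ZL))

X = np.random.randn(3, 2)         # two examples with three features each
AL = forward(X, params, L=3)
print(AL.shape)  # (1, 2)
```

Backward propagation retraces these same building blocks in reverse to compute gradients for every `W` and `b`.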