Super excited to announce today’s session on Introduction to Neural Networks and how they work. This will be an introductory session where we will learn what Deep Learning is, why and where it is used, and how it works!
Without any further ado, please note down the session details below:
Schedule - 21st August at 20:30 IST / 17:00 CEST / 15:00 UTC (please find the time in your timezone here)
Does the model check the accuracy of the output at each hidden layer before moving on to the next? If there is an error in the output, does it adjust the weights at that point?
@sathish24 The download links for all the slides will be added once the content is finalized. For the time being, we request you to go through the material on the bootcamp platform.
@yuvi_008 Hi Yuvraj. Unfortunately, registrations are closed for now. We will be hosting more such bootcamps in the future. You may follow us on LinkedIn for updates. https://www.linkedin.com/company/dphi/
Hello, can someone please explain how you usually decide the size of your deep neural network?
Without much domain knowledge, I usually pick the architecture somewhat randomly, but I am wondering what the best way to do this is.
Honestly, there is no generic way to determine the number of layers for a deep neural network. Practitioners draw on previous experience with similar problems and try different layer configurations before selecting the best one.
Yoshua Bengio, head of the Montreal Institute for Learning Algorithms (MILA), says:
"Very simple. Just keep adding layers until the test error does not improve anymore."
A method recommended by Geoff Hinton is to add layers until you start to overfit your training set, then add dropout or another regularization method.
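The heuristic above can be sketched in code: train networks of increasing depth and keep the depth at which validation error is lowest. This is a minimal numpy-only illustration, not anyone's official method; the toy dataset, layer width, learning rate, and epoch count are all arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (illustrative): label points by distance from origin.
X = rng.normal(size=(400, 2))
y = (np.linalg.norm(X, axis=1) > 1.2).astype(float)
X_train, y_train = X[:300], y[:300]
X_val, y_val = X[300:], y[300:]

def train_mlp(X, y, hidden_layers, width=16, epochs=300, lr=0.1):
    """Train a small ReLU MLP with `hidden_layers` hidden layers via full-batch gradient descent."""
    sizes = [X.shape[1]] + [width] * hidden_layers + [1]
    Ws = [rng.normal(scale=np.sqrt(2 / m), size=(m, n)) for m, n in zip(sizes, sizes[1:])]
    bs = [np.zeros(n) for n in sizes[1:]]
    for _ in range(epochs):
        # Forward pass: keep each layer's input for the backward pass.
        acts = [X]
        for W, b in zip(Ws[:-1], bs[:-1]):
            acts.append(np.maximum(0, acts[-1] @ W + b))
        logits = acts[-1] @ Ws[-1] + bs[-1]
        p = 1 / (1 + np.exp(-logits[:, 0]))
        # Backward pass: gradient of binary cross-entropy w.r.t. the logits.
        grad = (p - y)[:, None] / len(y)
        for i in range(len(Ws) - 1, -1, -1):
            gW = acts[i].T @ grad
            gb = grad.sum(axis=0)
            if i > 0:  # propagate through the layer, masking by the ReLU derivative
                grad = (grad @ Ws[i].T) * (acts[i] > 0)
            Ws[i] -= lr * gW
            bs[i] -= lr * gb
    def predict(Xn):
        h = Xn
        for W, b in zip(Ws[:-1], bs[:-1]):
            h = np.maximum(0, h @ W + b)
        return (h @ Ws[-1] + bs[-1])[:, 0] > 0
    return predict

# The heuristic: keep adding layers while held-out error keeps improving.
best_err, best_depth = 1.0, 0
for depth in range(1, 5):
    predict = train_mlp(X_train, y_train, hidden_layers=depth)
    err = np.mean(predict(X_val) != y_val)
    print(f"{depth} hidden layer(s): validation error {err:.2f}")
    if err < best_err:
        best_err, best_depth = err, depth
```

In practice you would stop growing the network once validation error plateaus, and (per Hinton's advice) add dropout or another regularizer once the training set starts to overfit; this sketch only shows the depth-selection loop.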
To help you further, I have found a very interesting article on how to configure the number of layers in a deep neural network. It walks you through understanding your problem, along with five different approaches to selecting the right number of layers.