Artificial Neural Networks and Deep Learning

 
__FORCETOC__

The following are last-minute news you should be aware of ;-)
13/10/2021: Final grades for the 2020/2021 year are [[Media:AN2DL_Grades_20210830.pdf|here]]

30/09/2021: Video and slides updated + notebooks published

24/09/2021: Removed last year's practicals, this year they will be different!!! Python crash course moved to the end of the material

22/09/2021: Added Colab crash course on Python 3 in the material section

18/09/2021: We skip the exercise session on 22/09 and added one on 15/12; I updated the schedule!

16/09/2021: Updated material on the first lectures + added references to Python tutorials in the material section

15/09/2021: Lectures start today!

14/09/2021: Website under maintenance ... come back later
  
 
 
* [https://www.deib.polimi.it/ita/personale/dettagli/549640 Giacomo Boracchi]: the course co-teacher; this is his [https://politecnicomilano.webex.com/join/giacomo.boracchi webex room]
* [https://www.deib.polimi.it/ita/personale/dettagli/846174 Francesco Lattari]: the course teaching assistant; this is his [https://politecnicomilano.webex.com/join/francesco.lattari webex room]
* [https://www.deib.polimi.it/ita/personale/dettagli/920490 Eugenio Lomurno]: the course teaching assistant; this is his [https://politecnicomilano.webex.com/join/eugenio.lomurno webex room]

===Course Program and Syllabus===
  
 
Note: Lecture timetable interpretation
* On Wednesday, in T.2.1, Team 1, starts at 15:15, ends at 17:00
* On Wednesday, in T.2.1, Team 2, starts at 17:30, ends at 19:15
* On Thursday, in teacher webex room, starts at 16:30, ends at 19:15

Note: Teams division is based on your Codice Persona (and should minimize overlap)
* Team 1: odd Codice Persona
* Team 2: even Codice Persona

For a Google calendar you might [https://boracchi.faculty.polimi.it/teaching/AN2DLCalendar.htm look here!]
{| border="1" align="center" style="text-align:center;"
|-
|Date || Day || Time || Room || Teacher || Topic
|-
|15/09/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || rowspan="2" | [https://politecnicomilano.webex.com/politecnicomilano/ldr.php?RCID=243171f0d60ccb6507e5d097b3c55131 Course Introduction + Deep Learning Intro]
|-
|15/09/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|16/09/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || [https://politecnicomilano.webex.com/politecnicomilano/ldr.php?RCID=48eb781c6c832ebe5d1a65aa60aa80f8 From Perceptrons to Feed Forward Neural Networks]
|-
|22/09/2021 || Wednesday || -- || -- || rowspan="2" | -- || rowspan="2" | -- No Lecture --
|-
|22/09/2021 || Wednesday || -- || --
|-
|23/09/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || [https://politecnicomilano.webex.com/politecnicomilano/ldr.php?RCID=d6326382b3d5b22cc9c82cc2094fc7d9 Feed forward neural networks and Backpropagation]
|-
|29/09/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/francesco.lattari Francesco Lattari] || rowspan="2" | [https://politecnicomilano.webex.com/politecnicomilano/ldr.php?RCID=7e07badcc1bdb91e76d9c42fb46a44dd KERAS: Numpy, Tensorflow and FNN]
|-
|29/09/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|30/09/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || [https://politecnicomilano.webex.com/politecnicomilano/ldr.php?RCID=62460aac13f46ceb85f365ef3af13da9 Error Functions Design]
|-
|06/10/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || rowspan="2" | [https://politecnicomilano.webex.com/politecnicomilano/ldr.php?RCID=866cd6c1a612bdf9fb92e8de7630ccaa Overfitting, cross-validation, and Early Stopping]
|-
|06/10/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|07/10/2021 || Thursday || --- || --- || --- || No Lectures (Graduation)
|-
|13/10/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/francesco.lattari Francesco Lattari] || rowspan="2" | [https://politecnicomilano.webex.com/politecnicomilano/ldr.php?RCID=3c4494b4865929a682d8b7773deeb06b KERAS: FFNN and Overfitting]
|-
|13/10/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|14/10/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || [https://politecnicomilano.webex.com/politecnicomilano/ldr.php?RCID=bfb3b4478f89834a3fe8908ef38a9a20 Training tricks: activation functions, network initialization, and other stuff...]
|-
|20/10/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/giacomo.boracchi Giacomo Boracchi] || rowspan="2" | The Image Classification Problem
|-
|20/10/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|21/10/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/giacomo.boracchi Giacomo Boracchi] || Convolutional Neural Networks
|-
|27/10/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/francesco.lattari Francesco Lattari] || rowspan="2" | KERAS: Convolutional Neural Networks
|-
|27/10/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|28/10/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/giacomo.boracchi Giacomo Boracchi] || Training with data scarcity
|-
|03/11/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/francesco.lattari Francesco Lattari] || rowspan="2" | KERAS: Convolutional Neural Networks
|-
|03/11/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|04/11/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/giacomo.boracchi Giacomo Boracchi] || Famous CNN architectures
|-
|10/11/2021 || Wednesday || --- || --- || --- || -- No Lecture (Prove in Itinere) --
|-
|11/11/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/giacomo.boracchi Giacomo Boracchi] || Fully Convolutional CNN, CNN for image segmentation
|-
|17/11/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/giacomo.boracchi Giacomo Boracchi] || rowspan="2" | CNN for localization and detection
|-
|17/11/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|18/11/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/giacomo.boracchi Giacomo Boracchi] || GANs
|-
|24/11/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/francesco.lattari Francesco Lattari] || rowspan="2" | KERAS: Autoencoder, classification, segmentation
|-
|24/11/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|25/11/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || Recurrent neural networks + LSTM
|-
|01/12/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/francesco.lattari Francesco Lattari] || rowspan="2" | KERAS: learning with time series
|-
|01/12/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|02/12/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || Sequence to sequence learning and Word Embedding
|-
|08/12/2021 || Wednesday || --- || --- || --- || -- No Lecture (Holiday) --
|-
|09/12/2021 || Thursday || 16:30 - 19:15 || Virtual Room || [https://politecnicomilano.webex.com/join/matteo.matteucci Matteo Matteucci] || Attention Mechanism and Transformer
|-
|15/12/2021 || Wednesday || 15:15 - 17:00 || T.2.1 (Team 1) || rowspan="2" | [https://politecnicomilano.webex.com/join/francesco.lattari Francesco Lattari] || rowspan="2" | KERAS: learning with text
|-
|15/12/2021 || Wednesday || 17:30 - 19:15 || T.2.1 (Team 2)
|-
|16/12/2021 || Thursday || 16:30 - 19:15 || Virtual Room || --- || -- Spare Lecture --
|-
|}
  
  
 
 
==Course Evaluation==

Course evaluation is composed of two parts:

* A written examination covering the whole program, graded up to 22/30
* Two home projects in the form of a "Kaggle style" challenge practicing the topics of the course, graded up to 4/30 each

The final score sums the grade of the written exam and the grades of the home projects. Home projects are not compulsory, and they are issued only once a year.
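As a toy illustration of how the grading scheme above adds up (the function and the scores are our own hypothetical example, not official material):

```python
def final_grade(written, project1=0, project2=0):
    """Combine the exam parts: written exam up to 22/30,
    two optional home projects up to 4/30 each."""
    assert 0 <= written <= 22, "written exam is graded up to 22/30"
    assert 0 <= project1 <= 4 and 0 <= project2 <= 4, "each project up to 4/30"
    return written + project1 + project2

# A student scoring 20/22 on the written exam and 4 + 3 on the projects:
print(final_grade(20, 4, 3))  # 27
```

Skipping the optional projects simply caps the maximum at 22/30, since the projects contribute additively.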
  
 
==Teaching Material (the textbook)==

Lectures will be based on material from different sources; teachers will provide their slides to students as soon as they are available. As a general reference you can check the following text, but keep in mind that teachers will not follow it strictly:

* [http://www.deeplearningbook.org/ Deep Learning]. Ian Goodfellow, Yoshua Bengio, and Aaron Courville, MIT Press, 2016.

Regarding the Python programming language, we will provide you with the basics of NumPy and Python scripting. In case you want some introductory material, you can check here:

* [https://docs.python.org/3/tutorial/ Python tutorials]: these are the official Python tutorials; we suggest 3. An Informal Introduction to Python (Numbers, Strings, Lists), 4. More Control Flow Tools (if, for, range, functions), 5. Data Structures (More on Lists, Dictionaries), and 9. Classes.
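As a minimal sketch (our own, not course material) of the kind of NumPy basics the practicals build on, vectorized array operations replace explicit Python loops:

```python
import numpy as np

# Build a small vector and a weight vector
x = np.arange(6, dtype=float)      # [0., 1., 2., 3., 4., 5.]
w = np.ones(6) * 0.5

# Inner product without a loop: 0.5 * (0+1+2+3+4+5) = 7.5
dot = x @ w

# Reshape into a 2x3 matrix and reduce along an axis
m = x.reshape(2, 3)                # [[0., 1., 2.], [3., 4., 5.]]
col_sums = m.sum(axis=0)           # [3., 5., 7.]

print(dot, col_sums)
```

Getting comfortable with shapes, `reshape`, and axis-wise reductions pays off immediately when manipulating batches of data in Keras/TensorFlow.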
  
 
===Course Slides===

Slides from the lectures by Matteo Matteucci
*[[Media:AN2DL_00_2121_Course_Introduction.pdf|[2021/2022] Course Introduction]]: introductory slides of the course with useful information about the course syllabus, grading, and the course logistics.
*[[Media:AN2DL_01_2122_Deep_Learning_Intro.pdf|[2021/2022] Machine Learning vs Deep Learning]]: introduction to machine learning paradigms and definition of deep learning with examples
*[[Media:AN2DL_02_2122_Perceptron_2_FeedForward.pdf|[2021/2022] From Perceptrons to Feed Forward Neural Networks]]: the original Perceptron model, Hebbian learning, feed-forward architecture, backpropagation and gradient descent, error functions and maximum likelihood estimation
*[[Media:AN2DL_03_2021_NeuralNetwroksTraining_tmp2.pdf|[2020/2021] Neural Networks Training]]: dealing with overfitting (weight decay, early stopping, dropout), vanishing gradient (ReLU and friends), batch normalization
*[[Media:AN2DL_04_2021_RecurrentNeuralNetworks.pdf|[2020/2021] Recurrent Neural Networks]]: learning with sequences, Recurrent Neural Networks, vanishing gradient, Long Short-Term Memories (LSTM), seq2seq model.
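To give a concrete taste of the gradient-descent training the slides cover, here is a sketch (our own, not from the course slides) of training a single sigmoid neuron on a toy AND-gate dataset — the single-layer special case of backpropagation, assuming NumPy is available:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: the logical AND of two binary inputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)   # small random initial weights
b = 0.0
lr = 0.5                            # learning rate

for _ in range(2000):
    p = sigmoid(X @ w + b)          # forward pass on the whole batch
    grad = p - y                    # dLoss/dz for cross-entropy + sigmoid
    w -= lr * (X.T @ grad) / len(y) # batch gradient step on the weights
    b -= lr * grad.mean()           # batch gradient step on the bias

pred = (sigmoid(X @ w + b) > 0.5).astype(int)
print(pred)                         # predicted classes for the AND gate
```

Because AND is linearly separable, gradient descent on the cross-entropy loss drives the neuron to reproduce the truth table; multi-layer networks generalize exactly this update via backpropagation.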
Slides from the lectures by Giacomo Boracchi are available on [https://boracchi.faculty.polimi.it/teaching/AN2DL.htm his webpage]:
* [https://boracchi.faculty.polimi.it/teaching/AN2DL/2020_AN2DL_Lez1_ImageClassification.pdf Image Classification]: image classification and related issues, template matching, image classification via nearest neighbors methods, via linear classifiers, and via hand-crafted features.
* [https://boracchi.faculty.polimi.it/teaching/AN2DL/2020_AN2DL_Lez2_CNN.pdf Convolutional Neural Networks]: from hand-crafted features to convolutional neural networks.
* [https://boracchi.faculty.polimi.it/teaching/AN2DL/2020_AN2DL_Lez3_4_CNN_TL_Data_Scarcity.pdf Training Convolutional Neural Networks]: how to train CNNs, famous architectures, data augmentation, and the like.
* [https://boracchi.faculty.polimi.it/teaching/AN2DL/2020_AN2DL_Lez5_CNN_Architectures_CNN_for_segmentation.pdf Convolutional Neural Networks for Image Segmentation]: CNN architectures for segmentation and detection.
* [https://boracchi.faculty.polimi.it/teaching/AN2DL/2020_AN2DL_Lez6_CNN_Architectures_CNN_for_object_detection.pdf Convolutional Neural Networks for Localization and Object Detection]
* [https://boracchi.faculty.polimi.it/teaching/AN2DL/2020_AN2DL_Lez7_Generative_Models.pdf Generative Adversarial Networks]
  
Slides from the practicals by Francesco Lattari and Eugenio Lomurno
* [https://colab.research.google.com/drive/13hoe0sHZ8ZOOfWV0wN0Q8SYmLj_oz6Vh?usp=sharing#scrollTo=UO1ENESOKXmf A Crash Course on Python]: a Colab notebook with a crash course on Python 3 to prepare for the practicals!
* [https://drive.google.com/file/d/1-qNsIRvllmhetxX8Nmns3OYJwqn4KFhZ/view?usp=sharing Notebook 1]: NumPy Arrays vs TensorFlow2 Tensors
* [https://drive.google.com/file/d/12ODGsMkMPxFykDXp0w1dWj_F1FDU65sh/view?usp=sharing Notebook 2]: FeedForward Neural Network
* [https://colab.research.google.com/drive/1LudK5lDFqbD-DG8p_U5wGSx9kTidrkWe?usp=sharing Notebook 3]: Hold Out, Early Stopping and Cross Validation
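In the same spirit as the hold-out and early-stopping material, here is a framework-free sketch of both ideas (the helper names are our own, not from the notebooks):

```python
import random

def holdout_split(data, val_fraction=0.2, seed=42):
    """Shuffle a dataset and split it into training and validation parts."""
    items = list(data)
    random.Random(seed).shuffle(items)
    n_val = int(len(items) * val_fraction)
    return items[n_val:], items[:n_val]   # (train, validation)

def early_stopping(val_losses, patience=3):
    """Return the epoch at which training should stop: the first epoch at
    which the validation loss has not improved for `patience` epochs."""
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch               # stop: no improvement for `patience` epochs
    return len(val_losses) - 1         # never triggered: train to the end

train, val = holdout_split(range(10), val_fraction=0.3)
# Validation loss improves until epoch 2, then degrades -> stop at epoch 5
stop = early_stopping([1.0, 0.8, 0.7, 0.75, 0.76, 0.77, 0.9], patience=3)
```

Keras exposes the same logic through its `EarlyStopping` callback (`patience` argument); the sketch above just makes the bookkeeping explicit.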
 
 

Latest revision as of 22:55, 14 October 2021




==Course Aim & Organization==

Neural networks are mature, flexible, and powerful non-linear data-driven models that have been successfully applied to solve complex tasks in science and engineering. The advent of the deep learning paradigm, i.e., the use of (neural) networks to simultaneously learn an optimal data representation and the corresponding model, has further boosted neural networks and the data-driven paradigm.

Nowadays, deep neural networks can outperform traditional hand-crafted algorithms, achieving human performance in solving many complex tasks, such as natural language processing, text modeling, gene expression modeling, and image recognition. The course provides a broad introduction to neural networks (NN), starting from the traditional feedforward (FFNN) and recurrent (RNN) neural networks, up to the most successful deep-learning models such as convolutional neural networks (CNN) and long short-term memories (LSTM).

The course's major goal is to provide students with the theoretical background and the practical skills to understand and use NN, and at the same time become familiar with Deep Learning for solving complex engineering problems.

===Teachers===

The course is composed of a blend of lectures and exercises by the course teachers and the teaching assistants.

===Course Program and Syllabus===

This goal is pursued in the course by:

* Presenting major theoretical results underpinning NN (e.g., universal approximation, vanishing/exploding gradient, etc.)
* Describing the most important algorithms for NN training (e.g., backpropagation, adaptive gradient algorithms, etc.)
* Illustrating the best practices on how to successfully train and use these models (e.g., dropout, data augmentation, etc.)
* Providing an overview of the most successful Deep Learning architectures (e.g., CNNs, sparse and dense autoencoders, LSTMs for sequence-to-sequence learning, etc.)
* Providing an overview of the most successful applications, with particular emphasis on models for solving visual recognition tasks.

We have compiled a detailed syllabus of the course that students can use to double-check their preparation before the exam.

* [2020/2021] Course Syllabus: a detailed list of topics covered by the course and which students are expected to know when approaching the exam

===Detailed course schedule===

A detailed schedule of the course can be found here; topics are just indicative, while days and teachers are correct up to some last-minute change (I will notify you by email). Please note that we do not have lectures every day!!




