Statistické strojové učení (Statistical Machine Learning)

Labs

Exam

Cheat sheet

A cheat sheet is allowed for the exam, as described in the course note from 2020-01-13:

You are allowed to prepare & use one A4 page with handwritten notes (one-sided).

Example cheat sheet:

2019/2020 winter semester, prepared by bartefil

27.1.2017

The complete test:

exam_ws16.pdf

18.1.2019

Exams from past years that I managed to obtain:

zkousky_ssu.pdf 1)

18.1.2019

The midterm test, which we could try out this year as a practice run.

mid_term_test_ws18.pdf

22.1.2019

20.1.2020

be4m33ssu_exam_2020-01-20.pdf

Solutions by bartefil:

  1. Regular Perceptron algorithm. See slide 4 in svm1_ws2019.pdf (a minimal code sketch follows after this list).
  2. Assignment 2:
    1. a) Evaluate the empirical risk for each h. Select the h that minimizes it (empirical risk minimization).
    2. b) 5000 log(4000)
  3. Assignment 3 (an EM sketch follows after this list):
    1. a) \alpha_l(k) = \frac{p(x_l, k)}{\sum_{k'} p(x_l, k')}
    2. b) Partial solution: \max_\pi \sum_{l=1}^m \sum_{k=0}^n \alpha_l(k) \log \pi_k subject to \sum_k \pi_k = 1, which yields \pi_k = \frac{1}{m} \sum_{l=1}^m \alpha_l(k)
  4. Assignment 4 (a runnable version follows after this list):
    • Algorithm: for i in range(n): for k in K: p(s_i = k) := \sum_{k' \in K} p(s_i = k | s_{i-1} = k') p(s_{i-1} = k')
    • Complexity: O(n |K|^2)
  5. See slide 38 in ensembling-ws2019.pdf. Discussion missing.
  6. Assignment 5 (a gradient-check sketch follows after this list):
    • Auxiliary: y_j = \sum_i x_i w_{i,j}
    • Forward: z_k = \max(y_k, a_k y_k)
    • Backward: dz_k / dx_i = ([y_k > a_k y_k] + [y_k \leq a_k y_k] a_k) w_{i,k}
    • Parameter:
      • dz_k / da_p = [p = k] [y_k \leq a_k y_k] y_k
      • dz_k / dw_{l,m} = [k = m] ([y_k > a_k y_k] + [y_k \leq a_k y_k] a_k) x_l
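
Solution 1 references the Perceptron; here is a minimal sketch, assuming labels y in {-1, +1} and the standard additive update (variable names are illustrative, not taken from the slides):

  import numpy as np

  def perceptron(X, y, max_epochs=100):
      """Plain Perceptron: X has shape (m, d), labels y are in {-1, +1}."""
      w, b = np.zeros(X.shape[1]), 0.0
      for _ in range(max_epochs):
          updated = False
          for x_i, y_i in zip(X, y):
              if y_i * (x_i @ w + b) <= 0:  # misclassified (or on the boundary)
                  w += y_i * x_i            # standard additive update
                  b += y_i
                  updated = True
          if not updated:                   # a full pass with no mistakes: done
              break
      return w, b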
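
For solution 3, the two formulas are the E-step responsibilities and the M-step objective for the mixture weights. A sketch, assuming the joint p(x_l, k) is available as a table of values (a simplification for illustration):

  import numpy as np

  def e_step(joint):
      """joint[l, k] = p(x_l, k); returns alpha[l, k] = p(k | x_l)."""
      return joint / joint.sum(axis=1, keepdims=True)

  def m_step_pi(alpha):
      """Maximizes sum_l sum_k alpha_l(k) log pi_k under sum_k pi_k = 1,
      which gives pi_k = (1/m) sum_l alpha_l(k)."""
      return alpha.mean(axis=0)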
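
For solution 4, a runnable version of the recursion, assuming a fixed transition matrix P[k', k] = p(s_i = k | s_{i-1} = k') (whether the exam allowed step-dependent transitions is not recorded here):

  import numpy as np

  def forward_marginals(p0, P, n):
      """p0[k] = p(s_0 = k); P[k_prev, k] = p(s_i = k | s_{i-1} = k_prev).
      Returns p with p[i, k] = p(s_i = k) for i = 0..n."""
      p = np.zeros((n + 1, len(p0)))
      p[0] = p0
      for i in range(1, n + 1):
          # p(s_i = k) = sum_{k'} p(s_i = k | s_{i-1} = k') * p(s_{i-1} = k')
          p[i] = p[i - 1] @ P
      return p

Each step is one vector-matrix product over the |K| states, which gives the stated O(n |K|^2) total.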
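
For solution 6, a sketch of the forward pass and the corrected derivatives, with a finite-difference check of dz_k / da_k (shapes and random data are illustrative only):

  import numpy as np

  def forward(x, W, a):
      """y_j = sum_i x_i w_{i,j}; z_k = max(y_k, a_k y_k)."""
      y = x @ W
      return np.maximum(y, a * y), y

  def grads(x, W, a):
      """Analytic derivatives of each z_k, using the indicator [y_k <= a_k y_k]."""
      _, y = forward(x, W, a)
      g = (y > a * y) + (y <= a * y) * a  # g[k] = dz_k / dy_k
      dz_dx = W * g                       # dz_dx[i, k] = dz_k / dx_i
      dz_da = (y <= a * y) * y            # dz_da[k] = dz_k / da_k
      dz_dW = np.outer(x, g)              # dz_dW[l, k] = dz_k / dw_{l,k} (m = k)
      return dz_dx, dz_da, dz_dW

  # Finite-difference check of dz_k / da_k on random data.
  rng = np.random.default_rng(0)
  x, W, a = rng.normal(size=3), rng.normal(size=(3, 2)), np.array([0.1, 0.2])
  eps = 1e-6
  for k in range(2):
      a_eps = a.copy(); a_eps[k] += eps
      numeric = (forward(x, W, a_eps)[0][k] - forward(x, W, a)[0][k]) / eps
      assert abs(numeric - grads(x, W, a)[1][k]) < 1e-4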

3.2.2020

13.1.2021

19.1.2021

4.2.2021

21.1.2022

1.2.2022

1.2.2023

14.2.2023

Helpful materials

Literature

https://web.stanford.edu/~hastie/Papers/ESLII.pdf

Most of the SSU topics are explained accessibly here: http://www.cs.huji.ac.il/~shais/UnderstandingMachineLearning/understanding-machine-learning-theory-algorithms.pdf

SVM

MIT lecture on SVM: https://www.youtube.com/watch?v=_PwhiWxHK8o

MIT notes: https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-034-artificial-intelligence-fall-2010/tutorials/MIT6_034F10_tutor05.pdf

https://www.youtube.com/watch?v=IOetFPgsMUc (continued in Parts II and III)

Neural nets + convolutional

3Blue1Brown: Neural Networks (YouTube playlist). A nice basic explanation of how neural networks work; chapters 3 and 4 explain backpropagation clearly with good visualizations.

https://www.youtube.com/watch?v=vT1JzLTH4G4&list=PL3FW7Lu3i5JvHM8ljYj-zLfQRF3EO8sYv A whole course on neural nets and convolutional networks. Very comprehensive lectures that build up from basic concepts, with nice motivating examples.

MLE

First, what is likelihood?
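
A standard definition and worked example (an illustration, not taken from the course materials): for i.i.d. data x_1, \dots, x_m, the likelihood and the maximum-likelihood estimate are

  L(\theta) = \prod_{i=1}^m p(x_i \mid \theta), \qquad \hat{\theta}_{ML} = \arg\max_\theta \sum_{i=1}^m \log p(x_i \mid \theta)

For the Bernoulli model p(x \mid \theta) = \theta^x (1 - \theta)^{1 - x}, setting the derivative of the log-likelihood to zero gives \hat{\theta}_{ML} = \frac{1}{m} \sum_{i=1}^m x_i, i.e. the sample mean.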

EM + Gaussian mixture

https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-867-machine-learning-fall-2006/lecture-notes/lec15.pdf

Andrew Ng: Lecture on clustering, mixture of Gaussians, Jensen's inequality, EM algorithm (CS 229, Stanford University): video, lecture notes

Thomas P. Minka: Expectation-Maximization as lower bound maximization (recommended in lecture slides in 2019)

Bayes learning

MIT notes: https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-034-artificial-intelligence-fall-2010/tutorials/MIT6_034F10_tutor06.pdf

GBM

https://www.gormanalysis.com/blog/gradient-boosting-explained/

1) Note that pages 5 and 6 duplicate pages 1 and 2 respectively.