[7 March, 13:00] Let's Talk ML

Markéta Jůzlová - Hyperband (slides)

Hyperband is a multi-armed bandit strategy proposed for hyper-parameter optimization of learning algorithms. Despite its conceptual simplicity, the authors report results competitive with state-of-the-art hyper-parameter optimization methods such as Bayesian optimization.
I will describe the method's main principle and possible extensions.
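To give a flavour of the method: Hyperband runs several brackets of successive halving, each trading off the number of sampled configurations against the training budget given to each. Below is a minimal, illustrative sketch (the function names, the toy objective in the usage note, and the budget schedule details are simplifications, not the authors' reference implementation):

```python
import math
import random

def hyperband(get_config, evaluate, max_resource=81, eta=3):
    """Sketch of Hyperband (Li et al.).

    get_config() -> one random hyper-parameter configuration
    evaluate(config, budget) -> loss after training with that budget
    """
    s_max = int(round(math.log(max_resource, eta)))
    best = (float("inf"), None)
    # Brackets range from aggressive (many configs, tiny budgets)
    # to conservative (few configs, full budget).
    for s in range(s_max, -1, -1):
        n = int(math.ceil((s_max + 1) * eta ** s / (s + 1)))  # initial configs
        r = max_resource * eta ** (-s)                        # initial budget
        configs = [get_config() for _ in range(n)]
        for i in range(s + 1):  # successive halving inside the bracket
            budget = r * eta ** i
            losses = [evaluate(c, budget) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            if ranked[0][0] < best[0]:
                best = ranked[0]
            # keep only the top 1/eta fraction for the next, larger budget
            configs = [c for _, c in ranked[: len(configs) // eta]]
    return best  # (loss, config)
```

For example, minimizing the toy objective `(x - 3)**2` over random `x` in `[0, 10]` should return a configuration near 3.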

Ondra Bíža - Visualizing Deep Neural Networks (slides)

Techniques for visualizing deep neural networks have seen significant improvements in the last year. I will explain a novel algorithm for visualizing convolutional filters and use it to analyze a deep residual network.
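A common family of such visualization techniques is activation maximization: gradient ascent on the *input* image to find a pattern that maximally excites a chosen filter. The sketch below uses a toy linear "filter" so the input-gradient is available in closed form; a real implementation would backpropagate through the network instead (the function name and norm constraint are my own illustration, not a specific paper's method):

```python
import numpy as np

def activation_maximization(filter_w, steps=100, lr=0.1):
    """Gradient ascent on the input to maximize a filter's response.

    Toy setting: the 'filter response' is the linear unit x.w, whose
    gradient w.r.t. the input x is simply w.
    """
    filter_w = np.asarray(filter_w, dtype=float)
    x = np.zeros_like(filter_w)
    for _ in range(steps):
        x += lr * filter_w                 # ascend d(x.w)/dx = w
        x /= max(np.linalg.norm(x), 1e-8)  # norm constraint keeps x bounded
    return x
```

As expected, the optimized input aligns with the filter's preferred direction.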

[21 February, 13:00] Let's Talk ML

Radek Bartyzal - Born Again Neural Networks (slides)

Knowledge distillation is a process of training a compact model (student) to approximate the results of a previously trained, more complex model (teacher).
The authors of this paper took inspiration from this idea and tried training a student of the same complexity as its teacher, finding that the student surpasses the teacher in many cases. They also train a student whose architecture differs from the teacher's, with interesting results.

This will be one longer talk (40 min) in which I will also describe the relevant architectures used in the paper (DenseNet, Wide ResNet).
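The distillation objective behind this line of work is usually a blend of the ordinary cross-entropy with the hard label and a KL-divergence term pulling the student toward the teacher's temperature-softened outputs. A small NumPy sketch (the function names and default `T`, `alpha` values are illustrative choices, not the paper's exact hyper-parameters):

```python
import numpy as np

def softmax(z, T=1.0):
    """Numerically stable softmax at temperature T."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label,
                      T=2.0, alpha=0.5):
    """Hinton-style distillation: hard-label cross-entropy blended with
    KL(teacher_T || student_T), scaled by T^2 to balance gradients."""
    p_student = softmax(student_logits)
    hard = -np.log(p_student[true_label])
    ps_T = softmax(student_logits, T)
    pt_T = softmax(teacher_logits, T)
    soft = float(np.sum(pt_T * (np.log(pt_T) - np.log(ps_T)))) * T * T
    return alpha * hard + (1 - alpha) * soft
```

When the student's logits already match the teacher's, the soft term vanishes and only the hard-label loss remains.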

[14 December, 11:00] Let's Talk ML

Ondra Bíža - Overcoming catastrophic forgetting in neural networks (slides)

J. Kirkpatrick et al. (2017)
Artificial neural networks struggle to learn multiple different tasks due to a phenomenon known as catastrophic forgetting. In my talk, I will introduce catastrophic forgetting, describe a new learning algorithm called EWC (elastic weight consolidation) that mitigates it, and briefly mention the neurobiological principles that inspired EWC.
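At its core, EWC adds a quadratic penalty that anchors each parameter to its value after the old task, weighted by that parameter's Fisher information (how important it was for the old task). A minimal sketch of the penalty and its gradient (function names and the `lam` default are my own):

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC regularizer: 0.5 * lam * sum_i F_i * (theta_i - theta*_i)^2.

    Parameters with large Fisher values (important for the old task)
    are pulled strongly back toward theta_star; unimportant ones are free."""
    theta = np.asarray(theta, dtype=float)
    theta_star = np.asarray(theta_star, dtype=float)
    fisher = np.asarray(fisher, dtype=float)
    return 0.5 * lam * float(np.sum(fisher * (theta - theta_star) ** 2))

def ewc_grad(theta, theta_star, fisher, lam=1.0):
    """Gradient of the penalty, added to the new task's loss gradient."""
    return lam * np.asarray(fisher, float) * (
        np.asarray(theta, float) - np.asarray(theta_star, float))
```

Moving a weight with high Fisher information is penalized far more than moving an unimportant one, which is exactly how forgetting is mitigated.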

Ondra Podsztavek - Deep Q-network (slides)

Deep Q-network (DQN) is a deep reinforcement learning system that combines deep neural networks with reinforcement learning and is able to master a diverse range of Atari 2600 games with only the raw pixels and the score as input. It represents a general-purpose agent that adapts its behaviour without any human intervention.
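Two ingredients of DQN are easy to show in isolation: the epsilon-greedy action choice used for exploration, and the TD regression target the Q-network is trained toward. A minimal sketch (function names are my own; the full agent additionally uses experience replay and a separate target network):

```python
import random
import numpy as np

def epsilon_greedy(q_values, epsilon, rng=random):
    """With probability epsilon pick a random action, else the greedy one."""
    if rng.random() < epsilon:
        return rng.randrange(len(q_values))
    return int(np.argmax(q_values))

def td_target(reward, next_q_values, done, gamma=0.99):
    """DQN regression target: y = r + gamma * max_a' Q(s', a').

    At terminal states there is no bootstrap, so y = r."""
    return reward + (0.0 if done else gamma * float(np.max(next_q_values)))
```

The network's loss is then simply the squared error between `Q(s, a)` and `td_target(...)` over a minibatch of replayed transitions.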

[30 November, 11:00] Let's Talk ML

Filip Paulů - Analog Artificial Neural Networks (slides)

The performance of today's neural networks is becoming insufficient. How can we make them faster? Should we keep computing large, complex neural-network structures on digital processors and graphics cards, or is there another way? We will talk about all of this.

Václav Ostrožlík - Capsule Networks (slides)

Geoffrey Hinton, one of the "fathers of deep learning", recently published two papers introducing a new neural-network architecture called Capsule Networks. In the talk, I will show how these networks work, how they can be trained, and what new possibilities they bring.
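A small, self-contained piece of the capsule architecture is the "squash" non-linearity from Sabour et al. (2017): it preserves a capsule vector's direction while mapping its length into (0, 1), so the length can be read as a probability that the entity is present. A NumPy sketch (the `eps` guard is my own addition for numerical safety):

```python
import numpy as np

def squash(s, eps=1e-8):
    """Capsule squash: v = (|s|^2 / (1 + |s|^2)) * s / |s|.

    Long vectors are squashed to length just below 1,
    short vectors to length near 0; direction is preserved."""
    s = np.asarray(s, dtype=float)
    norm_sq = float(np.sum(s ** 2))
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)
```

For instance, a vector of length 10 squashes to length about 0.99, while a vector of length 0.1 squashes to about 0.01.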

[16 November, 11:00] Let's Talk ML

Tomáš Pajurek: Machine Learning infrastructure in Azure (slides)

The talk will focus on the early stages of the ML pipeline (data ingestion, storage, preprocessing) as well as cluster-computation services. Some topics will be accompanied by hands-on examples.

Vladimir Ananyev: Unsupervised feature selection for time series clustering (slides)

I will present how clustering can be performed on extracted features instead of the raw time series, along with the problem of selecting relevant feature subsets and some of the techniques that can be used for that purpose.
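The basic idea is simple: replace each variable-length series with a short, fixed-size feature vector and cluster those vectors instead. A sketch with a hypothetical four-feature summary and a plain k-means (both the feature set and function names are illustrative choices, not the talk's specific method):

```python
import numpy as np

def extract_features(series):
    """Summarize one series with a few simple, hypothetical features."""
    s = np.asarray(series, dtype=float)
    return np.array([s.mean(),                 # level
                     s.std(),                  # variability
                     s.max() - s.min(),        # range
                     np.abs(np.diff(s)).mean()])  # mean absolute change

def kmeans(X, k=2, iters=20, seed=0):
    """Plain k-means on the feature matrix (one row per series)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each row to its nearest center
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

On toy data, flat series and strongly oscillating series land in different clusters because their feature vectors are well separated.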


Copyright (c) Data Science Laboratory @ FIT CTU 2014–2016. All rights reserved.