Let's Talk ML

Let's Talk ML is a regular meeting of people interested in machine learning and eager to share knowledge with others.
We meet at 14:30, every odd Thursday, in Datalab, CTU FIT (room A-1347).

The format is usually two short talks followed by a discussion. A talk can be about anything related to machine learning: your own research, an interesting ML paper, or an exciting new method.

Sign up for our mailing list to get notifications about new Let's Talk ML events.

Event dates:

  • [28 February] Let's Talk ML

    Petr Nevyhoštěný: Introduction to Graph Neural Networks (slides)

    Deep learning has achieved great success in machine learning tasks, ranging from image and video classification to speech recognition and natural language understanding. However, there is a growing number of applications where the data come from non-Euclidean domains and are represented as graph structures. This talk will be a brief introduction to graph neural networks, which attempt to deal with such data.


    Radek Bartyzal: Dropout is a special case of the stochastic delta rule: faster and more accurate deep learning (slides)

    This talk will explain how replacing weights with random variables is connected to Dropout and what benefits it may bring.

  • [13 December, 11:00] Let's Talk ML

    Radek Bartyzal - Dataset Distillation
    Model distillation aims to distill the knowledge of a complex model into a simpler one. Dataset distillation keeps the model fixed and instead attempts to distill the knowledge from a large training dataset into a small one. The idea is to synthesize a small number of examples that will, when given to the learning algorithm as training data, approximate the model trained on the original data.

    Václav Ostrožlík - Self-Normalizing Neural Networks
    Neural networks have been gaining success in many domains in recent years. However, the main stage seems to belong to convolutional and recurrent networks, while feed-forward neural networks (FNNs) are left behind in the beginner tutorial sections. FNNs that perform well are typically shallow and therefore cannot exploit many levels of abstract representations. The authors of this paper propose Self-Normalizing Neural Networks, which use a couple of new techniques to allow training deeper feed-forward networks.

  • [6 April, 11:00] Let's Talk ML

    Ondřej Bíža: Recurrent Convolutional Networks (slides)
    Petr Nevyhoštěný: Restricted Boltzmann Machines (slides)

  • [23 March, 11:00] Let's Talk ML

    Place: TH:A:1347

    Markéta Jůzlová: Attacking machine learning with adversarial examples (slides)
    Veronika Maurerová: Feature extraction using CNN (slides)
  • [9 March, 11:00] Let's Talk ML

    Place: TH:A:1347

    Petr Nevyhoštěný: EU law and machine learning (slides)
    Václav Ostrožlík: Convolutional Neural Networks (slides)

  • [23 February, 11:00] Let's Talk ML

    Place: TH:A:1347

    Radek Bartyzal: Generative Adversarial Networks (slides)
    Veronika Maurerová: A Model-based Approach to Optimizing Ms. Pac-Man Game Strategies in Real Time (slides)

  • [8 December, 13:00] Let's Talk ML

    Place: T9:364

    Radek Bartyzal: t-SNE (slides)
    Václav Ostrožlík: Word2vec: A deeper look

  • [24 November, 13:00] Let's Talk ML

    Place: T9:364

    Václav Ostrožlík: Word2vec (slides)
    Petr Nevyhoštěný: Mood classification from lyrics (slides)

  • [10 November, 13:00] Let's Talk ML

    Place: T9:364

    Markéta Jůzlová: Dimensionality Reduction (slides)
    Petr Nevyhoštěný: Conditional Random Fields (slides)

  • [27 October, 13:00] Let's Talk ML

    Place: T9:364

    Veronika Maurerová: Gradient boosting machines (slides)
    Tomáš Frýda: Gaussian Processes (slides)

  • [21 September, 15:00] Let's Talk ML

    Place: TH:A:1242

    Václav Ostrožlík: WaveNet (slides)
    Veronika Maurerová: Crime Prediction (slides)

  • [31 August, 14:00] Let's Talk ML

    Place: TH:A:1242

    Radek Bartyzal: Neural Machine Translation (slides)

  • [24 August, 14:00] Let's Talk ML

    Place: TH:A:1242

    Václav Ostrožlík: Dropout (slides)
    Tomáš Frýda: Introduction to Bayesian optimization (slides, Jupyter notebook)

  • [17 August, 14:00] Let's Talk ML

    Place: TH:A:1242

    Radek Bartyzal: Why ReLU? (slides)
    Tomáš Pajurek: Event streaming and storing in Azure (slides)

  • [10 August, 13:00] Let's Talk ML

    Place: TH:A:1242

    Radek Bartyzal: Online Optimization Algorithm (slides)
    Václav Ostrožlík: Neural Style (slides)


Follow Us

Copyright (c) Data Science Laboratory @ FIT CTU 2014–2016. All rights reserved.