{"id":246,"date":"2013-11-27T12:41:54","date_gmt":"2013-11-27T12:41:54","guid":{"rendered":"http:\/\/eventmanagerblog.com\/event-wordpress-theme-tyler\/?page_id=2"},"modified":"2019-04-25T09:07:14","modified_gmt":"2019-04-25T07:07:14","slug":"program","status":"publish","type":"page","link":"https:\/\/excel.fit.vutbr.cz\/2019\/program\/","title":{"rendered":"Program"},"content":{"rendered":"\r\n<div id=\"program\">\r\n<div class=\"container widget\" style=\"margin-bottom: 1.5em;\">\r\n<div class=\"row\">\r\n<div class=\"col-md-12\">\r\n<p>Konference Excel@FIT, kter\u00e1 bude prob\u00edhat ve \u010dtvrtek 25. 4. 2019 na <a href=\"http:\/\/www.fit.vutbr.cz\/\" target=\"_blank\" rel=\"noopener noreferrer\">Fakult\u011b informa\u010dn\u00edch technologi\u00ed VUT v Brn\u011b<\/a>, p\u0159edstav\u00ed p\u0159ijat\u00e9 autorsk\u00e9 pr\u00e1ce a prezenta\u010dn\u00ed instalace sponzor\u016f z oblasti IT.<\/p>\r\n<\/div>\r\n<\/div>\r\n<\/div>\r\n<div id=\"tile_textcolumns\" class=\"container widget\">\r\n<div class=\"row\">\r\n<div class=\"col-md-4\">\r\n<h3>Dopoledne<\/h3>\r\n<p><img class=\"img-responsive center-block\" src=\"\/wp-content\/images\/2019\/dopoledne.jpg\" alt=\"\"><\/p>\r\n<p>V hlavn\u00edm s\u00e1le konference zazn\u00ed <strong>odborn\u00e9 refer\u00e1ty autor\u016f<\/strong>, kte\u0159\u00ed byli vybr\u00e1ni programov\u00fdm v\u00fdborem Excel@FIT, a prob\u011bhne <strong>panelov\u00e1 diskuze na vybran\u00e9 t\u00e9ma<\/strong>.<\/p>\r\n<\/div>\r\n<div class=\"col-md-4\">\r\n<h3>Odpoledne<\/h3>\r\n<p><img class=\"img-responsive center-block\" src=\"\/wp-content\/images\/2019\/odpoledne.jpg\" alt=\"\"><\/p>\r\n<p>V ur\u010den\u00fdch prostor\u00e1ch konference prob\u011bhne <strong>voln\u00e1 p\u0159ehl\u00eddka v\u0161ech sout\u011b\u017en\u00edch prac\u00ed<\/strong> formou plak\u00e1t\u016f a prototyp\u016f a <strong>prezenta\u010dn\u00ed instalace host\u016f<\/strong>.<\/p>\r\n<\/div>\r\n<div 
class=\"col-md-4\">\r\n<h3>Z\u00e1v\u011brem<\/h3>\r\n<p><img class=\"img-responsive center-block\" src=\"\/wp-content\/images\/2019\/zaver.jpg\" alt=\"\"><\/p>\r\n<p>V hlavn\u00edm s\u00e1le konference bude <strong>vyhl\u00e1\u0161en\u00ed nejlep\u0161\u00edch prac\u00ed<\/strong> a <strong>p\u0159ed\u00e1n\u00ed cen<\/strong>.<\/p>\r\n<\/div>\r\n<\/div>\r\n<\/div>\r\n<div class=\"row\">\r\n<div class=\"col-md-6\">\r\n<h3>Program konference<\/h3>\r\n<table class=\"program-table table\">\r\n<tbody>\r\n<tr class=\"default\">\r\n<th>8.53<\/th>\r\n<td>Zah\u00e1jen\u00ed<\/td>\r\n<\/tr>\r\n<tr class=\"default\">\r\n<th>9.00<\/th>\r\n<td>P\u0159edn\u00e1\u0161ky<\/td>\r\n<\/tr>\r\n<tr class=\"refreshment\">\r\n<th>11.15<\/th>\r\n<td>P\u0159est\u00e1vka<\/td>\r\n<\/tr>\r\n<tr class=\"default\">\r\n<th>11.30<\/th>\r\n<td>Panelov\u00e1 diskuze<\/td>\r\n<\/tr>\r\n<tr class=\"refreshment\">\r\n<th>12.30<\/th>\r\n<td>Ob\u011bd a networking<\/td>\r\n<\/tr>\r\n<tr class=\"default\">\r\n<th>13.30<\/th>\r\n<td>P\u0159ehl\u00eddka studentsk\u00fdch prac\u00ed formou plak\u00e1t\u016f a prototyp\u016f<\/td>\r\n<\/tr>\r\n<tr class=\"refreshment\">\r\n<th>15.30<\/th>\r\n<td>P\u0159est\u00e1vka<\/td>\r\n<\/tr>\r\n<tr class=\"default\">\r\n<th>16.00<\/th>\r\n<td>Slavnostn\u00ed vyhl\u00e1\u0161en\u00ed v\u00fdsledk\u016f a p\u0159ed\u00e1n\u00ed cen<\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n<\/div>\r\n<div class=\"col-md-6\">\r\n<h3>Doprovodn\u00fd program<\/h3>\r\n<table class=\"program-table table\">\r\n<tbody>\r\n<tr class=\"default\">\r\n<th>12:00-16:00<\/th>\r\n<td>Prezentace sponzor\u016f a v\u00fdzkumn\u00fdch skupin<br>(foyer D, C a prostory P\u0159ehl\u00eddky)<\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n<\/div>\r\n<\/div>\r\n<img class=\"img-responsive center-block\" src=\"\/wp-content\/images\/2019\/mapa-arealu-velka.jpg?v3\" alt=\"\">\r\n<h2>P\u0159edn\u00e1\u0161ky<\/h2>\r\n<p>V dopoledn\u00edm bloku budou v hlavn\u00edm s\u00e1le konference auto\u0159i vybran\u00fdch 
prac\u00ed prezentovat sv\u00e9 v\u00fdsledky.<\/p>\r\n<link rel=\"stylesheet\" media=\"all\" type=\"text\/css\" href=\"\/submissions\/css\/proceedings.css?v20151119\">  \r\n<div class=\"row\" id=\"submissions\">\r\n\t\t<div class=\"thumbnail topic14\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">1<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal1\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/001\/1_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>Insertion of 2D Graphics into a Scene Captured by a Stationary Camera<\/h4>\r\n\t\t\t\t<h5>Son Hai Nguyen<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/001\/1.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/001\/1_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><a href=\"https:\/\/youtu.be\/zNo-B2FJkUY\"><img src=\"\/submissions\/images\/video-icon.png\" width=\"32\" height=\"32\" alt=\"Video\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal1\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">1<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">Insertion of 2D Graphics into a Scene Captured by a Stationary Camera<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Son Hai 
Nguyen<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>Augmented Reality, Computer Vision, Image Processing<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Zpracov\u00e1n\u00ed dat (obraz, zvuk, text apod.)<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>Augmented reality visualizes additional information in a real-world environment. The main goal is to achieve a natural look of the inserted 2D graphics in a scene captured by a stationary camera, with the possibility of real-time processing. Although several methods have tackled the foreground segmentation problem, many of them are not robust enough on diverse datasets. The modified background subtraction algorithm ViBe yields the best visual results, but because of the binary nature of its mask, the edges of the segmented objects are coarse. To smooth the edges, Global Sampling Matting is performed; this refinement greatly increases the perceptual quality of the segmentation. Since shadows are not classified by ViBe, artifacts occurred after insertion of the segmented objects on top of the graphics. This was solved by the proposed shadow segmentation, which compares the differences in brightness and gradients between the background model and the current frame. To remove the plastic look of the inserted graphics, texture propagation that considers the local and mean brightness of the background is proposed. The segmentation and image matting algorithms are tested on various datasets. 
The resulting pipeline is demonstrated on a dataset of videos.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/001\/1_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/001\/1.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/001\/1_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><a href=\"https:\/\/youtu.be\/zNo-B2FJkUY\"><img src=\"\/submissions\/images\/video-icon.png\" width=\"48\" height=\"48\" alt=\"Video\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic11\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">13<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal13\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/013\/13_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>Efficient Algorithms for Tree Automata<\/h4>\r\n\t\t\t\t<h5>Ond\u0159ej Vale\u0161<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/013\/13.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/013\/13_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal 
fade bs-example-modal-lg\" id=\"modal13\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">13<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">Efficient Algorithms for Tree Automata<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Ond\u0159ej Vale\u0161<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>finite automata, tree automata, language equivalence, language inclusion, bisimulation, antichains, bisimulation up-to congruence<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Testov\u00e1n\u00ed, anal\u00fdza a verifikace<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>Tree automata and their languages find use in formal verification and theorem proving, but for many practical applications the performance of existing algorithms for tree automata manipulation is unsatisfactory. In this work, a novel algorithm for testing language equivalence and inclusion on tree automata is proposed and implemented as a module of the VATA library, with the goal of creating an algorithm that is faster than existing methods on at least a portion of real-world examples. First, existing approaches to equivalence and inclusion testing on both word and tree automata are examined. These approaches are then modified to create the bisimulation up-to congruence algorithm for tree automata. 
The efficiency of this new approach is then compared with existing tree automata language equivalence and inclusion testing methods.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/013\/13_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/013\/13.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/013\/13_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic12 topic14\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">15<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal15\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/015\/15_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>Mobiln\u00ed aplikace pro rozpozn\u00e1n\u00ed leukokorie ze sn\u00edmku lidsk\u00e9ho obli\u010deje<\/h4>\r\n\t\t\t\t<h5>Pavel H\u0159eb\u00ed\u010dek<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/015\/15.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/015\/15_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><a href=\"https:\/\/youtu.be\/xHnnN1x65S0\"><img 
src=\"\/submissions\/images\/video-icon.png\" width=\"32\" height=\"32\" alt=\"Video\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal15\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">15<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">Mobiln\u00ed aplikace pro rozpozn\u00e1n\u00ed leukokorie ze sn\u00edmku lidsk\u00e9ho obli\u010deje<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Pavel H\u0159eb\u00ed\u010dek<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>Mobiln\u00ed aplikace, Eye Check, Leukokorie, Zdrav\u00e9 o\u010di, iOS, Android, React Native, OpenCV, Dlib, REST<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">U\u017eivatelsk\u00e1 rozhran\u00ed<\/span> <span class=\"label label-default\">Zpracov\u00e1n\u00ed dat (obraz, zvuk, text apod.)<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>C\u00edlem t\u00e9to pr\u00e1ce je n\u00e1vrh a implementace multiplatformn\u00ed multijazy\u010dn\u00e9 mobiln\u00ed aplikace pro rozpozn\u00e1n\u00ed leukokorie ze sn\u00edmku lidsk\u00e9ho obli\u010deje pro platformy iOS a Android. Leukokorie je b\u011blav\u00fd svit zornice, kter\u00fd se p\u0159i pou\u017eit\u00ed blesku m\u016f\u017ee na fotografii objevit. V\u010dasnou detekc\u00ed tohoto symptomu lze zachr\u00e1nit zrak \u010dlov\u011bka. Samotn\u00e1 aplikace umo\u017e\u0148uje analyzovat fotografii u\u017eivatele a detekovat p\u0159\u00edtomnost leukokorie. 
C\u00edlem aplikace je tedy anal\u00fdza o\u010d\u00ed \u010dlov\u011bka, od \u010deho\u017e je tak\u00e9 odvozen n\u00e1zev mobiln\u00ed aplikace - Eye Check. K vytvo\u0159en\u00ed multiplatformn\u00ed aplikace byl pou\u017eit framework React Native. Pro detekci obli\u010deje a pr\u00e1ci s fotografi\u00ed byly pou\u017eity knihovny OpenCV a Dlib. Komunikace mezi klientem a serverem je \u0159e\u0161ena pomoc\u00ed architektury REST. V\u00fdsledkem je mobiln\u00ed aplikace, kter\u00e1 p\u0159i detekci leukokorie u\u017eivatele upozorn\u00ed, \u017ee by m\u011bl nav\u0161t\u00edvit sv\u00e9ho l\u00e9ka\u0159e.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/015\/15_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/015\/15.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/015\/15_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><a href=\"https:\/\/youtu.be\/xHnnN1x65S0\"><img src=\"\/submissions\/images\/video-icon.png\" width=\"48\" height=\"48\" alt=\"Video\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic14\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">21<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal21\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/021\/21_nahled.png\" 
alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>4D-DCT Based Light Field Image Compression<\/h4>\r\n\t\t\t\t<h5>Drahom\u00edr Dlabaja<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/021\/21.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/021\/21_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal21\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">21<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">4D-DCT Based Light Field Image Compression<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Drahom\u00edr Dlabaja<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>Light field, Lossy compression, JPEG, Transform coding, Plenoptic representation, Quality assessment<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Zpracov\u00e1n\u00ed dat (obraz, zvuk, text apod.)<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>This paper proposes a light field image encoding solution based on four-dimensional discrete cosine transform and quantization. The solution is an extension to JPEG baseline compression. A light field image is interpreted and encoded as a four-dimensional volume to exploit both intra and inter view correlation. Solutions to 4D quantization and block traversal are introduced in this paper. 
The experiments compare the performance of the proposed solution against the compression of individual image views with JPEG and HEVC intra in terms of PSNR. The obtained results show that the proposed solution outperforms the reference encoders for light field images with a low average disparity between views; it is therefore suitable for images taken by a lenslet-based light field camera as well as for synthetically generated images.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/021\/21_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/021\/21.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/021\/21_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic02\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">35<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal35\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/035\/35_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>Bioinformatic Tool for Classification of Bacteria into Taxonomic Categories Based on the Sequence of 16S rRNA Gene<\/h4>\r\n\t\t\t\t<h5>Nikola Vale\u0161ov\u00e1<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/035\/35.pdf\"><img 
src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/035\/35_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal35\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">35<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">Bioinformatic Tool for Classification of Bacteria into Taxonomic Categories Based on the Sequence of 16S rRNA Gene<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Nikola Vale\u0161ov\u00e1<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>Machine learning, Metagenomics, Bacteria classification, Phylogenetic tree, 16S rRNA, DNA sequencing, scikit-learn<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Bioinformatika<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>This work deals with the problem of automated classification and recognition of bacteria after obtaining their DNA by the sequencing process. In the scope of this paper, a new classification method based on the 16S rRNA gene segment is designed and described. The presented principle is based on the tree structure of taxonomic categories and uses well-known machine learning algorithms to classify bacteria into one of the connected classes at a given taxonomic level. A part of this work is also dedicated to the implementation of the described algorithm and evaluation of its prediction accuracy. 
The performance of various classifier types and their settings is examined, and the setting with the best accuracy is determined. The accuracy of the implemented algorithm is also compared to an existing method based on the BLAST local alignment algorithm available in the QIIME microbiome analysis toolkit.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/035\/35_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/035\/35.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/035\/35_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic14\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">45<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal45\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/045\/45_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>Robust Speaker Verification Using Deep Neural Networks<\/h4>\r\n\t\t\t\t<h5>J\u00e1n Profant<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/045\/45.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/045\/45_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" 
width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal45\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">45<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">Robust Speaker Verification Using Deep Neural Networks<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>J\u00e1n Profant<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>speaker verification, neural networks, deep learning<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Zpracov\u00e1n\u00ed dat (obraz, zvuk, text apod.)<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>The objective of this work is to study state-of-the-art deep neural network based speaker verification systems, called x-vectors, under wideband conditions such as YouTube. The system takes a variable-length audio recording and maps it to a fixed-length embedding, which is afterwards used to represent the speaker. We compared our systems to BUT's submission to the Speakers in the Wild Speaker Recognition Challenge (SITW). We observed that, when comparing single best systems, the recently published x-vectors allowed us to obtain a more than 4.38 times lower Equal Error Rate on the SITW core-core condition than the SITW submission from BUT. 
Moreover, we found that diarization substantially reduces the error rate when multiple speakers are present in the SITW core-multi condition, but we did not observe the same trend on the NIST Speaker Recognition Evaluation 2018 Video Annotations for YouTube data.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/045\/45_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/045\/45.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/045\/45_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic10 topic14\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">48<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal48\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/048\/48_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>U\u010den\u00ed a adaptace neuronov\u00fdch s\u00edt\u00ed pro rozpozn\u00e1v\u00e1n\u00ed textu<\/h4>\r\n\t\t\t\t<h5>Jan Koh\u00fat<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/048\/48.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/048\/48_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" 
width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal48\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">48<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">U\u010den\u00ed a adaptace neuronov\u00fdch s\u00edt\u00ed pro rozpozn\u00e1v\u00e1n\u00ed textu<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Jan Koh\u00fat<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>rozpozn\u00e1v\u00e1n\u00ed textu, rekurentn\u00ed neuronov\u00e9 s\u00edt\u011b, konvolu\u010dn\u00ed neuronov\u00e9 s\u00edt\u011b, adaptace, aktivn\u00ed u\u010den\u00ed, dataset IMPACT<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Robotika a um\u011bl\u00e1 inteligence<\/span> <span class=\"label label-default\">Zpracov\u00e1n\u00ed dat (obraz, zvuk, text apod.)<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>C\u00edlem t\u00e9to pr\u00e1ce je srovn\u00e1n\u00ed architektur neuronov\u00fdch s\u00edt\u00ed pro rozpozn\u00e1v\u00e1n\u00ed textu. D\u00e1le pak adaptace neuronov\u00fdch s\u00edt\u00ed na jin\u00e9 texty, ne\u017e na kter\u00fdch byly u\u010deny. Pro tyto experimenty vyu\u017e\u00edv\u00e1m rozs\u00e1hl\u00fd a rozmanit\u00fd dataset IMPACT o v\u00edce ne\u017e jednom milionu \u0159\u00e1dk\u016f. Pomoc\u00ed neuronov\u00fdch s\u00edt\u00ed prov\u00e1d\u00edm kontrolu vhodnosti \u0159\u00e1dk\u016f, tzn. \u010ditelnost a spr\u00e1vnost v\u00fd\u0159ez\u016f \u0159\u00e1dk\u016f. 
Celkem srovn\u00e1v\u00e1m 6 \u010dist\u011b konvolu\u010dn\u00edch s\u00edt\u00ed a 9 rekurentn\u00edch s\u00edt\u00ed. Adaptace prov\u00e1d\u00edm na polsk\u00fdch historick\u00fdch textech s t\u00edm, \u017ee tr\u00e9novac\u00ed data adaptovan\u00fdch s\u00edt\u00ed neobsahovala texty ve slovansk\u00fdch jazyc\u00edch. Adaptace vyu\u017e\u00edvaj\u00ed p\u0159\u00edstupy aktivn\u00edho u\u010den\u00ed pro v\u00fdb\u011br nov\u00fdch adapta\u010dn\u00edch dat. \u010cist\u011b konvolu\u010dn\u00ed s\u00edt\u011b dosahuj\u00ed \u00fasp\u011b\u0161nosti 98.6 %, rekurentn\u00ed s\u00edt\u011b pak 99.5 %. \u00dasp\u011b\u0161nost s\u00edt\u00ed p\u0159ed adaptac\u00ed se pohybuje kolem 79 %, po postupn\u00e9 adaptaci na 2500 \u0159\u00e1dc\u00edch stoupne \u00fasp\u011b\u0161nost na 97 %. P\u0159\u00edstupy aktivn\u00edho u\u010den\u00ed dosahuj\u00ed lep\u0161\u00ed \u00fasp\u011b\u0161nosti ne\u017e n\u00e1hodn\u00fd v\u00fdb\u011br. Pro zpracov\u00e1n\u00ed dataset\u016f je vhodn\u00e9 pou\u017e\u00edvat ji\u017e natr\u00e9novan\u00e9 neuronov\u00e9 s\u00edt\u011b tak, aby se odstranilo co mo\u017en\u00e1 nejv\u00edce chybn\u00fdch dat. Rekurentn\u00ed vrstvy znateln\u011b zvy\u0161uj\u00ed \u00fasp\u011b\u0161nost s\u00edt\u00ed. 
P\u0159i adaptaci je v\u00fdhodn\u00e9 vyu\u017e\u00edvat p\u0159\u00edstup\u016f aktivn\u00edho u\u010den\u00ed.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/048\/48_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/048\/48.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/048\/48_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic14\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">54<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal54\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/054\/54_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>Exploring contextual information in neural machine translation<\/h4>\r\n\t\t\t\t<h5>Josef Jon<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/054\/54.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal54\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div 
class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">54<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">Exploring contextual information in neural machine translation<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Josef Jon<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>neural machine translation, context, transformer, document level translation<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Zpracov\u00e1n\u00ed dat (obraz, zvuk, text apod.)<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>This work explores means of utilizing extra-sentential context in neural machine translation (NMT). Traditionally, NMT systems translate one source sentence to one target sentence without any notion of the surrounding text. This is clearly insufficient and different from how humans translate text. For many high-resource language pairs, the output of NMT systems is nowadays indistinguishable from human translations under certain (strict) conditions. One of the conditions is that evaluators see the sentences separately. When evaluating whole documents, even the best NMT systems still fall short of human translations. This motivates research into employing document context in NMT, since there might not be much more room left to improve translations at the sentence level, at least for high-resource languages and domains. This work summarizes recent state-of-the-art approaches, implements them, evaluates them both in terms of general translation quality and on specific context-related phenomena, and analyzes their shortcomings. 
Additionally, a test set of context-related phenomena for English-to-Czech translation was created to enable further comparison and analysis.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/054\/54_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/054\/54.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic01 topic08\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">55<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal55\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/055\/55_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>Security Analysis of Immersive Virtual Reality and Its Implications<\/h4>\r\n\t\t\t\t<h5>Martin Vondr\u00e1\u010dek<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"https:\/\/youtu.be\/xoEeyHyPCfY\"><img src=\"\/submissions\/images\/video-icon.png\" width=\"32\" height=\"32\" alt=\"Video\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal55\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">55<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" 
aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">Security Analysis of Immersive Virtual Reality and Its Implications<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Martin Vondr\u00e1\u010dek<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>Network Traffic Analysis, Reverse Engineering, Application Crippling, Penetration Testing, Hacking, Social Applications, Unity, Bigscreen, HTC Vive, Oculus Rift<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Bezpe\u010dnost<\/span> <span class=\"label label-default\">Po\u010d\u00edta\u010dov\u00e9 s\u00edt\u011b<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>Immersive virtual reality is a technology that finds more and more areas of application. It is used not only for entertainment but also for work and social interaction, where users' privacy and the confidentiality of their information have high priority. Unfortunately, the security measures taken by software vendors are often insufficient. This paper presents the results of an extensive security analysis of Bigscreen, a popular VR application with more than 500,000 users. We have utilised techniques of network traffic analysis, penetration testing, reverse engineering, and even application crippling. We have found critical vulnerabilities directly exposing the privacy of the users and allowing the attacker to take full control of a victim's computer. The discovered security flaws allowed the distribution of malware and the creation of a botnet using a computer worm. Our team has discovered a novel VR cyber attack, Man-in-the-Room. We have also found a security vulnerability in the Unity engine. 
Our responsible disclosure has helped to mitigate the risks for more than half a million Bigscreen users and all affected Unity applications worldwide.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/055\/55_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"https:\/\/youtu.be\/xoEeyHyPCfY\"><img src=\"\/submissions\/images\/video-icon.png\" width=\"48\" height=\"48\" alt=\"Video\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic08\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">57<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal57\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/057\/57_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>TCP Reset Cookies \u2013 a heuristic method for TCP SYN Flood mitigation<\/h4>\r\n\t\t\t\t<h5>Patrik Goldschmidt<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/057\/57.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/057\/57_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><a href=\"https:\/\/youtu.be\/9HhdWlWkn2g\"><img src=\"\/submissions\/images\/video-icon.png\" width=\"32\" height=\"32\" alt=\"Video\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal57\" 
tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">57<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">TCP Reset Cookies \u2013 a heuristic method for TCP SYN Flood mitigation<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Patrik Goldschmidt<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>TCP Reset Cookies, TCP SYN Flood, DDoS mitigation<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Po\u010d\u00edta\u010dov\u00e9 s\u00edt\u011b<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>TCP SYN Flood is one of the most widespread DoS attack types used on computer networks nowadays. As a possible countermeasure, this paper proposes a long-forgotten network-based mitigation method, TCP Reset Cookies. The method utilizes the TCP three-way handshake mechanism to establish a security association with a client before forwarding its SYN data. Since the nature of the algorithm requires client validation, all SYN segments from spoofed IP addresses are effectively discarded. From the perspective of a legitimate client, the first connection incurs a delay of up to 1 second, but all consecutive SYN traffic is delayed only by circa 30 microseconds. The document provides a detailed description and analysis of this approach, as well as implementation details with enhanced security tweaks. The project was conducted as a part of security research by CESNET. 
The discussed implementation is already integrated into a DDoS protection solution deployed in CESNET's backbone network and the Czech internet exchange point NIX.CZ.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/057\/57_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/057\/57.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/057\/57_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><a href=\"https:\/\/youtu.be\/9HhdWlWkn2g\"><img src=\"\/submissions\/images\/video-icon.png\" width=\"48\" height=\"48\" alt=\"Video\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic01\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">58<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal58\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/058\/58_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>LiSa -- Multiplatform Linux Sandbox for Analyzing IoT Malware<\/h4>\r\n\t\t\t\t<h5>Daniel Uh\u0159\u00ed\u010dek<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/058\/58.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/058\/58_poster.pdf\"><img 
src=\"\/submissions\/images\/poster-icon.png\" width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade bs-example-modal-lg\" id=\"modal58\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">58<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">LiSa -- Multiplatform Linux Sandbox for Analyzing IoT Malware<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Daniel Uh\u0159\u00ed\u010dek<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>IoT, Malware, Linux, Security, Dynamic Analysis, Network Analysis, SystemTap<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Bezpe\u010dnost<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>The weak security standards of IoT devices have been leveraged by Linux malware in the past few years. Exposed Telnet and SSH services with default passwords, outdated firmware, or system vulnerabilities -- all of these let attackers build botnets of thousands of compromised embedded devices. This paper emphasizes the importance of the open source community in the field of malware analysis and presents the design and implementation of a multiplatform sandbox for automated malware analysis on the Linux platform. Project LiSa (Linux Sandbox) is a modular system that outputs JSON data, which can be further analyzed either manually or with pattern matching (e.g. with YARA), and serves as a tool to detect and classify Linux malware. 
LiSa was tested on recent IoT malware samples provided by Avast Software and it solved various problems of existing implementations.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/058\/58_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/058\/58.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/058\/58_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"thumbnail topic11\">\r\n\t\t\t<div class=\"caption caption-image\">\r\n\t\t\t\t<span class=\"badge\">59<\/span><a href=\"#\" data-toggle=\"modal\" data-target=\"#modal59\" rel=\"noopener noreferrer\"><img class=\"thumbnail-icon\" src=\"\/submissions\/2019\/059\/59_nahled.png\" alt=\"\"><\/a>\r\n\t\t\t<\/div>\r\n\t\t\t<div class=\"caption\">\r\n\t\t\t\t<h4>Scalable Static Analysis Using Facebook Infer<\/h4>\r\n\t\t\t\t<h5>Dominik Harmim, Vladim\u00edr Marcin, Ond\u0159ej Pavela<\/h5>\r\n\t\t\t\t<p class=\"download-menu\"><a href=\"\/submissions\/2019\/059\/59.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"32\" height=\"32\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/059\/59_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"32\" height=\"32\" alt=\"Plak\u00e1t\"><\/a><\/p>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t<div class=\"modal fade 
bs-example-modal-lg\" id=\"modal59\" tabindex=\"-1\" role=\"dialog\" aria-hidden=\"true\">\r\n\t\t\t<div class=\"modal-dialog modal-lg\">\r\n\t\t\t\t<div class=\"modal-content\">\r\n\t\t\t\t\t<div class=\"modal-header\">\r\n\t\t\t\t\t\t<span class=\"badge\">59<\/span>\r\n\t\t\t\t\t\t<p><button type=\"button\" class=\"close\" data-dismiss=\"modal\" aria-label=\"Close\"><span aria-hidden=\"true\">\u00d7<\/span><\/button><\/p>\r\n\t\t\t\t\t\t<h3 class=\"modal-title\">Scalable Static Analysis Using Facebook Infer<\/h3>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-body\">\r\n\t\t\t\t\t\t<div class=\"row\">\r\n\t\t\t\t\t\t\t<div class=\"col-md-8\">\r\n\t\t\t\t\t\t\t\t<p><strong>Dominik Harmim, Vladim\u00edr Marcin, Ond\u0159ej Pavela<\/strong><\/p>\r\n\t\t\t\t\t\t\t\t<p><em>Facebook Infer, Static Analysis, Abstract Interpretation, Atomicity Violation, Concurrent Programs, Performance, Worst-Case Cost, Deadlock<\/em><\/p>\r\n\t\t\t\t\t\t\t\t<p><span class=\"label label-default\">Testov\u00e1n\u00ed, anal\u00fdza a verifikace<\/span> <\/p>\r\n\t\t\t\t\t\t\t\t<p>Static analysis has nowadays become one of the most popular ways of catching bugs early in modern software. However, reasonably precise static analyses still often have problems with scaling to larger codebases. Moreover, efficient static analysers, such as Coverity or CodeSonar, are often proprietary and difficult to openly evaluate or extend. Facebook Infer offers a static analysis framework that is open source, extensible, and promotes efficient modular and incremental analysis. In this work, we propose three inter-procedural analysers extending the capabilities of Facebook Infer: Looper (a resource bounds analyser), L2D2 (a low-level deadlock detector), and Atomer (an atomicity violation analyser). We evaluated our analysers both on smaller hand-crafted examples and on publicly available benchmarks derived from real-life low-level programs, and obtained encouraging results. 
In particular, L2D2 attained a 100 % detection rate and an 11 % false positive rate on an extensive benchmark of hundreds of functions and millions of lines of code.<\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t\t<div class=\"col-md-4\">\r\n\t\t\t\t\t\t\t\t<p><img class=\"img-responsive\" src=\"\/submissions\/2019\/059\/59_nahled.png\" alt=\"\"><\/p>\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<div class=\"modal-footer\">\r\n\t\t\t\t\t\t<div class=\"pull-left\"><div class=\"btn-group\" role=\"group\"><a href=\"\/submissions\/2019\/059\/59.pdf\"><img src=\"\/submissions\/images\/paper-icon.png\" width=\"48\" height=\"48\" alt=\"\u010cl\u00e1nek\"><\/a><a href=\"\/submissions\/2019\/059\/59_poster.pdf\"><img src=\"\/submissions\/images\/poster-icon.png\" width=\"48\" height=\"48\" alt=\"Plak\u00e1t\"><\/a><\/div><\/div>\r\n\t\t\t\t\t\t<div class=\"pull-right\"><button type=\"button\" class=\"btn btn-primary\" data-dismiss=\"modal\">Zav\u0159\u00edt<\/button><\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div><div class=\"clearfix\"><\/div>\r\n<\/div> \r\n<\/div>\r\n","protected":false},"excerpt":{"rendered":"Program konference Excel@FIT, kter\u00e1 se kon\u00e1 na Fakult\u011b informa\u010dn\u00edch technologi\u00ed VUT v Brn\u011b.  
","protected":false},"author":2,"featured_media":2708,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"ngg_post_thumbnail":0,"footnotes":""},"class_list":["post-246","page","type-page","status-publish","has-post-thumbnail","hentry"],"_links":{"self":[{"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/pages\/246","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/comments?post=246"}],"version-history":[{"count":140,"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/pages\/246\/revisions"}],"predecessor-version":[{"id":3061,"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/pages\/246\/revisions\/3061"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/media\/2708"}],"wp:attachment":[{"href":"https:\/\/excel.fit.vutbr.cz\/2019\/wp-json\/wp\/v2\/media?parent=246"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}