Machine Learning and Pattern Recognition
University of Tsukuba
This course aims to give students an understanding of the fundamental principles and practical use of machine learning. It examines supervised and unsupervised learning methods based on neural networks and Bayesian models, as essential techniques for data science tasks such as image recognition, document classification, and clustering. The course consists of a series of lectures focusing on fundamental principles (linear algebra and probability theory). Hands-on exercises are also provided to illustrate the practical use of different machine learning methods (e.g., pattern recognition on image and text data). The course also introduces a broad range of practical techniques, e.g., data mining applications and parallel computing methods for large-scale data.
Students will develop an understanding of the core principles of machine learning and the ability to follow mathematical derivations in the theory of statistical machine learning and Bayesian models. Students will also acquire the skill to design basic pattern recognition software using machine learning tools.
Management competence, Quantitative research competence
Grading is based on a project report (60%) and assignments including hands-on exercises (40%). The project report is evaluated on four criteria: sophistication of implementation, reliability, comprehensibility, and originality. Students need a total score above 60% (out of 100%) to pass the course. Grades from A+ to C are determined by the total score.
(1) Tasks in machine learning (supervised learning, unsupervised learning, training data, test data) (Yu)
(2) Classification by linear classifiers (separating hyperplane, weight matrix) (Yu)
(3) Training neural networks (loss function, gradient descent) (Yu)
(4) Image recognition by convolutional neural networks (filter, backpropagation) (Yu)
(5) Probability distributions and Bayesian models (maximum likelihood estimation, prior distribution, Bayesian inference) (Wakabayashi)
(6) Document classification by naive Bayes models (latent variables, MAP estimation) (Wakabayashi)
(7) Clustering by Gaussian mixture models (EM algorithm) (Wakabayashi)
(8) Sequence labeling by hidden Markov models (transition probability, graphical models) (Wakabayashi)
(9) Data mining application (Yu)
(10) Parallel processing and MapReduce (Wakabayashi)
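As a flavor of what topics (2) and (3) cover, the sketch below trains a linear classifier by gradient descent on a cross-entropy loss. The toy data, learning rate, and epoch count are illustrative assumptions, not course material.

```python
import math

# Toy training data: 2-D points with binary labels (illustrative only).
data = [((0.0, 0.0), 0), ((0.2, 0.3), 0), ((1.0, 1.0), 1), ((0.9, 0.8), 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Weights and bias define the separating hyperplane w.x + b = 0.
w = [0.0, 0.0]
b = 0.0
lr = 0.5  # learning rate (an assumed value)

for epoch in range(200):
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - y            # gradient of cross-entropy loss w.r.t. the logit
        w[0] -= lr * err * x1  # gradient descent update
        w[1] -= lr * err * x2
        b -= lr * err

# After training, check predictions on the training points.
preds = [1 if sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5 else 0
         for (x1, x2), _ in data]
print(preds)
```

The same update rule, applied layer by layer via backpropagation, is what topic (4) generalizes to convolutional networks.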
Online Course Requirement
Wakabayashi Kei, Yu Haitao
Class format: Online/Face-to-face
- The first half of each class (8:40-9:55) is provided online (on demand); the second half (10:10-11:25) is provided face-to-face.
- MS Stream is used for delivering the online (on-demand) lectures. The links to the lecture videos are provided on manaba.
Site for Inquiry
Please inquire about the courses at the address below.
Contact person: Wakabayashi Kei, Yu Haitao
Email address: email@example.com, firstname.lastname@example.org