MINDLAB READING

Revision as of 09:20, 4 May 2007

This page is created to be the (temporary?) main information page of the MIND lab reading group. Topics in our reading group can be any old or new research works that are interesting or important to many members.

Plan

For the first 2-3 weeks, I, Jung, will take the lead. I will cover three topics that are frequently mentioned in the literature and that I am most familiar with: Mixture Models and the EM algorithm, Sampling Methods, and Principal Component Analysis.

After that, if it goes well, I hope we can continue the process and rotate the leader week by week. The leader should have at least two weeks to read up on the topic.

Date and Time

At the moment, I plan to hold the reading group every Friday afternoon (1pm - 2pm), but let us discuss more convenient dates and times.

May 11, 2007: Mixture Models and EM algorithm

[Image: Bishop's PRML book cover]

This topic is a nice introduction to the Bayesian paradigm in machine learning. After this talk, we should be able to answer the following questions:

  • What is the Bayesian machine learning paradigm?
      • The Bayes equation
  • Why is being Bayesian a good idea? What are the advantages of the Bayesian paradigm over the classical paradigm?
      • It is intuitive and easy to understand (though not necessarily easy to do)
      • It can solve the model selection problem
  • How can we train the learner in the Bayesian paradigm?
      • This talk illustrates one Bayesian toolkit: the EM algorithm.
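As a preview of the talk's central example, here is a minimal sketch of EM fitting a two-component 1-D Gaussian mixture. This is an illustrative toy implementation, not taken from the talk or the book; the initialisation scheme and synthetic data are my own assumptions.

```python
import math
import random

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(data, n_iter=50):
    """Fit a two-component 1-D Gaussian mixture by EM.
    Returns (weights, means, variances)."""
    # Crude initialisation: split the sorted data in half.
    data = sorted(data)
    half = len(data) // 2
    mu = [sum(data[:half]) / half, sum(data[half:]) / (len(data) - half)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, variances from responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return w, mu, var

random.seed(0)
# Synthetic data: two well-separated Gaussians, 200 points each.
data = [random.gauss(0.0, 1.0) for _ in range(200)] + \
       [random.gauss(5.0, 1.0) for _ in range(200)]
w, mu, var = em_gmm(data)
print(sorted(mu))  # the estimated means should land near 0 and 5
```

The E-step and M-step here correspond to the two alternating updates discussed in chapter 9 of the book.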

Main paper

  • Chris Bishop's PRML book, chapter 9. (I have one copy left; anyone who does not have this book, please feel free to borrow it from me)

Supplementary

Prerequisite

In order to understand how EM works, we have to understand Jensen's inequality and the Kullback-Leibler divergence.
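A quick numerical illustration of these two prerequisites (a sketch under my own choice of example distributions): the KL divergence between discrete distributions is always nonnegative, which is exactly what Jensen's inequality applied to the convex function -log gives.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
print(kl(p, q) >= 0)          # nonnegative, by Jensen's inequality
print(abs(kl(p, p)) < 1e-12)  # and zero when the distributions coincide
```

The same inequality is what makes the EM lower bound on the log-likelihood tight at the current parameter estimate.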

May 18, 2007: The Monte Carlo Methods #1

May 25, 2007: The Monte Carlo Methods #2