Machine Learning at U of C

From Theory Wiki
This page contains a list of topics, definitions, and results from the Machine Learning course at the University of Chicago taught by [http://www.cs.uchicago.edu/~niyogi this guy]. The proofs are very sketchy and given without much intuition, mainly because I'm lazy. I would appreciate any kind of help to make these notes complete.
  
 
== Week 1: Introduction and OLS ==
The materials this week are just about the framework. We show that the problem of minimizing the squared error with respect to a distribution is the same as the problem of learning the objective function <math>f_p</math>. If the examples are drawn from a discrete domain, this is a classification problem; in the case of a continuous domain, it is a regression problem.

After giving all the necessary definitions, an example of linear regression is presented as a motivation for how regression is done.

=== Learning problem ===

Given a distribution <math>P</math> on <math>X \times Y</math>, we want to learn the objective function <math>f_p(x) = E[y \mid x]</math> (with respect to the distribution <math>P</math>).

=== Learning algorithms ===

Let <math>Z</math> be the set of possible samples. A learning algorithm is a function that maps a finite number of samples to a measurable function (the class of all measurable functions is denoted here by <math>\mathcal{F}</math>). Sometimes we consider a class of computable functions instead.

=== Learning errors ===

Suppose the learning algorithm outputs <math>h</math>. The learning error can be measured by

:<math>\int (y-h(x))^2 \, dP</math>

One can prove that minimizing this quantity can be reduced to the problem of minimizing

:<math>\int (f_p(x)- h(x))^2 \, dP</math>

and that is the reason why we try to learn <math>f_p</math>. In other words, we claim that minimizing the first quantity over <math>h</math> is equivalent to minimizing the second.
 
The proof is easy.
  
:<math>\int (y-h(x))^2 dP  = \int ((y-f_p(x)) + (f_p(x)- h(x)))^2 dP </math>
  
 
We get
  
:<math>  \int (y-h(x))^2 dP = \int (y-f_p(x))^2 dP + \int (f_p(x)- h(x))^2 dP + 2 \int (y-f_p(x)) (f_p(x)-h(x)) dP  </math>
  
  
Then observe that:
* The first term <math>\int (y-f_p(x))^2 \, dP</math> depends only on the distribution, not on <math>h</math>.
* The third term is zero: since <math>f_p(x) = E[y \mid x]</math>, integrating <math>(y-f_p(x))(f_p(x)-h(x))</math> over <math>y</math> first gives zero for every <math>x</math>.
* The second term <math>\int (f_p(x)- h(x))^2 \, dP</math> is therefore the only term that depends on <math>h</math>, which proves the claim.
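To see the decomposition in action, here is a small Monte Carlo sketch in numpy (not from the course; the distribution, the hypothesis <code>h</code>, and the sample size are arbitrary choices for illustration):

<pre>
# Monte Carlo check of E[(y-h(x))^2] = E[(y-f_p(x))^2] + E[(f_p(x)-h(x))^2].
# The distribution, f_p, and h below are made-up examples.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.uniform(-1.0, 1.0, size=n)
f_p = np.sin(np.pi * x)                  # f_p(x) = E[y | x]
y = f_p + rng.normal(0.0, 0.3, size=n)   # y = f_p(x) + noise
h = 0.5 * x                              # an arbitrary hypothesis h

lhs = np.mean((y - h) ** 2)
rhs = np.mean((y - f_p) ** 2) + np.mean((f_p - h) ** 2)
print(lhs, rhs)   # the two numbers agree up to Monte Carlo error
</pre>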
 
=== Example 1 ===

When the labels take only two values (a classification problem), the objective reduces to a constant multiple of the probability of misclassification. One can show that the classifier obtained by thresholding <math>f_p</math> minimizes the loss (proof omitted).

=== Example 2 ===

When the labels are continuous, the problem is regression, where we try to regress <math>f_p</math>.

=== Learning lower bound ===

Let <math>\mathcal{P}</math> be a class of probability distributions of interest. No algorithm achieves error smaller than the minimax error over <math>\mathcal{P}</math>.

=== Ordinary Least Square ===

If the relation is linear,
  
:<math>y= X \mathbf{\beta} + \mathbf{\epsilon}</math>  
  
 
OLS provably gives the minimum squared error.
Consider the error
:<math> ||\mathbf{y} - \mathbf{X}\beta||^2 </math>

Differentiating the error and setting it to zero, we get
:<math> \mathbf{X}^T \mathbf{X} \beta = \mathbf{X}^T \mathbf{y} </math>

If <math>\mathbf{X}^T \mathbf{X}</math> has no nullspace (i.e. it is full rank), then it has an inverse, and we get
:<math> \beta = (\mathbf{X}^T \mathbf{X})^{-1} \mathbf{X}^T \mathbf{y} </math>

But if the matrix is not full rank, we can still show that
:<math> \beta = (\mathbf{X}^T \mathbf{X})^{+} \mathbf{X}^T \mathbf{y} </math>
minimizes the error, where <math>(\mathbf{X}^T \mathbf{X})^{+}</math> is the Moore-Penrose pseudoinverse of the matrix.

First, let <math>\lambda_1, \ldots, \lambda_d</math> be the eigenvalues of <math>\mathbf{X}^T \mathbf{X}</math> and let <math>v_1, \ldots, v_d</math> be the corresponding eigenvectors.

Note that:
* Since <math>\mathbf{X}^T \mathbf{X}</math> is positive semidefinite, all eigenvalues are non-negative.
 
* We can decompose <math>\mathbf{X}^T \mathbf{X}</math> to
 
:<math> \mathbf{X}^T \mathbf{X} = \sum_{i} \lambda_i v_i v^T_i</math> (because <math>\mathbf{X}^T \mathbf{X}</math> is symmetric, its eigenvectors can be chosen to be orthonormal).
Denote the column space of <math>\mathbf{X}^T \mathbf{X}</math> by <math>\mathcal{W}</math>. Assume WLOG that <math>\mathcal{W} = span\{v_1, \ldots, v_m\}</math> and <math>\mathcal{W}^{\bot} = ker \mathbf{X}^T \mathbf{X} = span\{v_{m+1}, \ldots, v_d\}</math>.
  
 
We will first show that <math>\mathbf{X}^T \mathbf{y}</math> in fact lives in the subspace <math>\mathcal{W}</math>. Observe that since the eigenvectors span <math>\mathbb{R}^d</math>, it suffices to show that <math>\mathbf{X}^T \mathbf{y}</math> is orthogonal to every vector in <math>\mathcal{W}^{\bot}</math>.
Take any <math>v_j \in \mathcal{W}^{\bot}</math>, that is, <math>\mathbf{X}^T \mathbf{X} v_j = 0</math>. We get
:<math> v_j^T \mathbf{X}^T \mathbf{X} v_j = ||\mathbf{X} v_j||^2 = 0 </math>

This implies that <math>\mathbf{X} v_j = 0</math> and thus <math>v_j^T \mathbf{X}^T \mathbf{y} = (\mathbf{X} v_j)^T \mathbf{y} = 0</math>.

From the fact that <math>\mathbf{X}^T \mathbf{y}</math> lives in <math>\mathcal{W}</math>, we can verify that the following is what we need.
 
:<math> \beta = (\sum_{i=1}^{m} \frac{1}{\lambda_i} v_i v^T_i) \mathbf{X}^T y </math>
  
And it's straightforward to see that the solution for <math>\mathbf{\beta}</math> is not unique since any <math>\mathbf{\beta} + e</math> for <math>e \in ker (\mathbf{X}^T \mathbf{X})</math> would also satisfy the equation.
  
 
Note that the term <math>\sum_{i=1}^{m} \frac{1}{\lambda_i} v_i v^T_i</math> is the Moore-Penrose pseudoinverse of the singular matrix <math>\mathbf{X}^T \mathbf{X}</math>.
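Here is a minimal numpy sketch of this recipe (the rank-deficient data are made up for illustration; only the formula for <math>\beta</math> comes from the notes above):

<pre>
# OLS with singular X^T X: beta = (sum over nonzero eigenvalues of (1/lambda_i) v_i v_i^T) X^T y.
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 4
X = rng.normal(size=(n, d))
X[:, 3] = X[:, 0] + X[:, 1]              # make X^T X rank-deficient
y = rng.normal(size=n)

A = X.T @ X
lam, V = np.linalg.eigh(A)               # A is symmetric PSD, so eigh applies
keep = lam > 1e-10                       # keep only the nonzero eigenvalues
A_pinv = (V[:, keep] / lam[keep]) @ V[:, keep].T   # sum_i (1/lambda_i) v_i v_i^T

beta = A_pinv @ X.T @ y
print(np.allclose(beta, np.linalg.pinv(A) @ X.T @ y))   # True: matches the pseudoinverse solution
</pre>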
=== Tikhonov Regularization ===

Instead of solving
:<math> \mathbf{X}^T \mathbf{X} \beta = \mathbf{X}^T \mathbf{y} </math>
 
we look at
  
:<math>\beta^* = argmin_{\beta \in \mathbb{R}^d} ||y-\mathbf{X} \beta||^2 + \alpha ||\beta||^2</math>
  
 
If <math>\mathbf{\alpha}</math> tends to zero, the problem becomes just like OLS. If <math>\mathbf{\alpha}</math> is a fixed positive number, differentiating and setting the derivative to zero gives
  
:<math> (\mathbf{X}^T \mathbf{X} + \alpha I) \beta = \mathbf{X}^T y </math>  
  
<math>\{\mathbf{\lambda}_i + \alpha\}</math> are eigenvalues of <math>\mathbf{X}^T \mathbf{X} + \alpha I</math>, where <math>\{\mathbf{\lambda}_i \}</math> are eigenvalues of <math>\mathbf{X}^T \mathbf{X}</math>.
  
 
Observe that all eigenvalues <math>\lambda_i + \alpha</math> are strictly positive, so the nullspace is trivial. Thus the matrix on the left-hand side is invertible, and the solution is '''unique'''.
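A short numpy sketch of the regularized solution (the data and the value of <math>\alpha</math> are arbitrary choices for illustration):

<pre>
# Tikhonov / ridge regression: solve (X^T X + alpha I) beta = X^T y.
import numpy as np

rng = np.random.default_rng(1)
n, d, alpha = 100, 5, 0.1
X = rng.normal(size=(n, d))
y = rng.normal(size=n)

A = X.T @ X
beta = np.linalg.solve(A + alpha * np.eye(d), X.T @ y)   # the unique solution

# The eigenvalues of X^T X + alpha I are exactly lambda_i + alpha.
print(np.allclose(np.sort(np.linalg.eigvalsh(A + alpha * np.eye(d))),
                  np.sort(np.linalg.eigvalsh(A) + alpha)))   # True
</pre>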
  
 
== Week 2: PCA, LDA, and Introduction to SVM ==
This week, we cover another theme of techniques: Principal Component Analysis (PCA) and Fisher's Linear Discriminant Analysis (LDA). Whereas last week's topics involved solving linear systems, this week's topics involve finding eigenvalues and eigenvectors of a matrix.

Since the theory of positive semidefinite matrices will be crucial throughout the course, we give a quick introduction.

=== A quick introduction to the theory of positive semidefinite matrices ===

A matrix <math>\mathbf{A}</math> is said to be positive semidefinite if
: <math>w^T \mathbf{A} w \ge 0</math> for all <math>w</math>.

'''Observation''': Any matrix of the form <math>\mathbf{X}^T \mathbf{X}</math> is positive semidefinite.

''Proof.'' For any <math>w</math>, <math>w^T X^T X w = ||Xw||^2 \ge 0</math>.

'''Property 1''': A positive semidefinite matrix has eigenvalues <math>\ge 0</math>.

''Proof.'' <math>\mathbf{A} \mathbf{v} = \lambda \mathbf{v} \Rightarrow v^T A v = \lambda v^T v,</math> so
:<math> \lambda = \frac{v^T A v}{v^T v } \ge 0 </math>

(This is because <math>v^T A v</math> can be thought of as a generalized dot product, so it always has a non-negative value.)

=== PCA ===

Given n points of data in <math>\mathbb{R}^d</math>, we want to find a direction <math>w \in \mathbb{R}^d</math> onto which projecting these n points best spreads them out. The notion of 'best' is measured by the variance of the projected data. Note that we can assume WLOG that <math>\sum_{i=1}^{n} \mathbf{x}_i = 0</math>. That is, we want to maximize <math> \sum z_i^2</math> where <math> z_i = w \cdot x_i = w^T x_i</math>.

: <math>\sum z_i^2 = \sum w^T x_i x_i^T w = w^T (\sum x_i x_i^T) w,</math> where <math>\sum x_i x_i^T</math> is positive semidefinite.
The problem is given as follows.
: <math>\max_{w \in \mathbb{R}^d} w^T (\sum \mathbf{x}_i \mathbf{x}_i^T) w</math>
:: s.t. <math>||\mathbf{w}||^2 = 1</math>

Using a Lagrange multiplier and writing <math>\mathbf{A} = \sum \mathbf{x}_i \mathbf{x}_i^T</math>,
: <math>\max w^T \mathbf{A} w - \lambda w^T w </math>
Differentiating the above and setting it to zero,
: <math>2\mathbf{A} w - 2 \lambda w =0</math>
: <math>\mathbf{A} w  = \lambda w</math>

This tells us that the solution must be among the eigenvectors <math>v_i</math> of <math>A</math>. For an eigenvector <math>v_i</math>,

: <math>v_i^T A v_i  = v_i^T \lambda_i v_i = \lambda_i v_i^T v_i = \lambda_i</math>

The solution is now clear: choose <math>\mathbf{w}</math> to be the eigenvector corresponding to the ''largest'' eigenvalue of <math>\mathbf{A}</math>.

=== Fisher LDA ===

Another method along the same lines was proposed by Fisher. The idea is to find the direction which best separates the means of the different classes, as contrasted with PCA, which finds the direction in which the projected data have the highest variance. Fisher's criterion is as follows (note that we consider the task of classification with two labels, 1 and -1):

: <math>\max_{w \in \mathbb{R}^d} \frac{||w^T m_1 - w^T m_{-1}||^2}{\sum_{i \in I_1} ||w^T x_i - w^T m_1||^2 +  \sum_{i \in I_{-1}} ||w^T x_i - w^T m_{-1}||^2  }</math>

where <math>m_c = \frac{1}{|I_c|} \sum_{i \in I_c} x_i</math>

The numerator can be written as <math>w^T (m_1 - m_{-1})(m_1-m_{-1})^T w</math>; we denote the middle factor by <math>\mathbb{S}_B</math>. Also, the denominator reduces to

: <math>w^T (\sum_{i \in I_1} (x_i - m_1) (x_i- m_1)^T + \sum_{i \in I_{-1}} ( x_i - m_{-1})(x_i - m_{-1})^T) w</math>

whose middle factor we denote by <math>\Sigma</math>. The problem becomes

: <math>\max_{w \in \mathbb{R}^d} \frac{w^T \mathbb{S}_B w}{w^T \Sigma w }</math>

This is equivalent to

: <math>\max_{w \in \mathbb{R}^d} w^T \mathbb{S}_B w </math>
:  s.t. <math>\mathbf{w}^T \Sigma \mathbf{w} =1</math>

Again, forming the Lagrangian and differentiating, we get

: <math> 2 \mathbb{S}_B w = 2 \lambda \Sigma w</math>
: <math>\Sigma^{-1} \mathbb{S}_B w = \lambda w</math>

We would be done if <math> \Sigma^{-1} \mathbb{S}_B</math> were symmetric and positive semidefinite, since then we could use the eigenvector solutions as before. We can work around this as follows:
* Observe that <math>\mathbb{S}_B</math> is a rank-one symmetric, positive semidefinite matrix (rank-one is not important, though). So the notation <math>\mathbb{S}^{1/2}_B</math> makes sense, because we can change to the eigenbasis, <math>\mathbb{S}_B = U D U^T</math>, and set <math>\mathbb{S}^{1/2}_B = U D^{1/2} U^T</math>.

Replacing <math>\mathbb{S}^{1/2}_B w</math> with <math>v</math>, we get

: <math>\mathbb{S}^{1/2}_B \Sigma^{-1} \mathbb{S}^{1/2}_B v = \lambda v</math>

One can check that the linear map on the left-hand side is symmetric and positive semidefinite. Thus, we can find an eigenvector solution <math>v_k</math> corresponding to <math>\lambda_k</math> and compute <math>w_k = \mathbb{S}^{-1/2}_B v_k </math>.

One can also verify that, plugging this solution back into the original objective, we get

: <math>\frac{w_k^T \mathbb{S}_B w_k}{w_k^T \Sigma w_k } = \frac{\lambda_k w^T_k \Sigma w_k}{w_k^T \Sigma w_k } = \lambda_k </math>

So we can maximize the objective by choosing the vector <math>v_k</math> corresponding to the largest eigenvalue.

== Week 3: Finishing up SVM ==
This week, we will talk about support vector machines, which involve solving quadratic programs. At the same time, we start the theory of VC dimension.

See [[Week 3 (Machine_Learning) | this]].

Please see chapter 10 of Vapnik's book ''Statistical Learning Theory'' for reference.

== Week 4: A theory of reproducing kernels ==
This week, we start looking at kernel methods, which will be a main focus for this quarter. The first class gives several definitions and concepts relating to reproducing kernel Hilbert spaces. We talk about two constructions of the r.k. Hilbert space (i.e. via a completion of linear combinations and via a measure-theoretic view). These two methods (and some others in the literature) are equivalent. Examples of the construction are given in the finite-domain case.

See [[Week4_Machine Learning | here]].

Please see the reference for more detail.

== Week 5: More on kernels ==
The plan for this week is to talk about [http://people.cs.uchicago.edu/~niyogi/papersps/MinNiyYao06.pdf this paper].

== Useful Links ==

=== General Theory ===
* [http://en.wikipedia.org/wiki/Eigenvalue,_eigenvector_and_eigenspace Wikipedia article on eigenvalue, eigenvector and eigenspace]
* [http://en.wikipedia.org/wiki/Ordinary_least_squares Wikipedia article on ordinary least squares]
* [http://en.wikipedia.org/wiki/Singular_value_decomposition Wikipedia article on singular value decomposition]
* [http://en.wikipedia.org/wiki/Principal_component_analysis Wikipedia article on principal component analysis]
* [http://en.wikipedia.org/wiki/Tikhonov_regularization Wikipedia article on Tikhonov regularization]
* [http://www.jstor.org/view/00029947/di963001/96p0042z/0 Reproducing kernel Hilbert spaces]

=== Famous Applications of PCA in machine learning ===
* Eigenface for '''face recognition'''
** [http://en.wikipedia.org/wiki/Eigenface Wikipedia article on eigenface]
** [http://vismod.media.mit.edu/vismod/demos/facerec/basic.html MIT introduction to eigenface]
