<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://replica.wiki.extremist.software/index.php?action=history&amp;feed=atom&amp;title=Machine_Learning_Meetup_Notes%3A_2010-08-18</id>
	<title>Machine Learning Meetup Notes: 2010-08-18 - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://replica.wiki.extremist.software/index.php?action=history&amp;feed=atom&amp;title=Machine_Learning_Meetup_Notes%3A_2010-08-18"/>
	<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-18&amp;action=history"/>
	<updated>2026-04-04T12:03:01Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.39.13</generator>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-18&amp;diff=12373&amp;oldid=prev</id>
		<title>SpammerHellDontDelete: Created page with &#039;====Mike - HMMs==== HMM used for time series data  Markov Chains:  matrix of transition probabilities:  a_i_j is prob to go from state i to j the prob function is only a function…&#039;</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-18&amp;diff=12373&amp;oldid=prev"/>
		<updated>2010-08-19T04:22:31Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;#039;====Mike - HMMs==== HMM used for time series data  Markov Chains:  matrix of transition probabilities:  a_i_j is prob to go from state i to j the prob function is only a function…&amp;#039;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;====Mike - HMMs====&lt;br /&gt;
HMM used for time series data&lt;br /&gt;
&lt;br /&gt;
Markov Chains:  matrix of transition probabilities:  a_ij is the probability of going from state i to state j&lt;br /&gt;
the transition probability is only a function of the previous state (the Markov property)&lt;br /&gt;
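The transition-matrix idea can be sketched in a few lines of Python (the states and numbers here are invented for illustration; rows of the matrix sum to 1):

```python
# Toy Markov chain: A[i][j] is the probability of moving from state i to state j.
A = {
    "rainy": {"rainy": 0.7, "sunny": 0.3},
    "sunny": {"rainy": 0.4, "sunny": 0.6},
}

def two_step(a, i, j):
    # P(state j two steps ahead, given state i now): sum over the middle state.
    return sum(a[i][k] * a[k][j] for k in a)

print(two_step(A, "rainy", "sunny"))  # 0.7*0.3 plus 0.3*0.6 = 0.39
```

Chaining transitions like this is just matrix multiplication, which is why the matrix form is convenient.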
&lt;br /&gt;
HMM: an example is trying to predict the weather based on a diary of someone&amp;#039;s ice cream eating habits&lt;br /&gt;
hidden state = what you are trying to predict (weather)&lt;br /&gt;
obs state: ice cream&lt;br /&gt;
&lt;br /&gt;
3 problems HMMs can solve: Likelihood, Decoding, Training&lt;br /&gt;
Likelihood: compute the probability of an observation sequence by summing over all hidden-state paths (Forward Algorithm)&lt;br /&gt;
Decoding: what&amp;#039;s the best sequence of hidden states to produce your obs seq (Viterbi Algorithm), which is similar to Maximum Likelihood&lt;br /&gt;
Training: given an obs seq, learn the state transition probs and the emission probs of an HMM (Expectation Maximization via the Baum-Welch / Forward-Backward Algorithm)&lt;br /&gt;
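The likelihood problem can be sketched for the ice-cream/weather example (all probabilities below are made up; only the algorithm shape matters):

```python
# Forward algorithm sketch: hidden states are the weather, observations are
# ice creams eaten per day (1, 2 or 3). Numbers are invented for illustration.
states = ["hot", "cold"]
start = {"hot": 0.8, "cold": 0.2}                 # initial state probabilities
trans = {"hot":  {"hot": 0.7, "cold": 0.3},       # a_ij, transition probs
         "cold": {"hot": 0.4, "cold": 0.6}}
emit = {"hot":  {1: 0.2, 2: 0.4, 3: 0.4},         # b_i(o), emission probs
        "cold": {1: 0.5, 2: 0.4, 3: 0.1}}

def likelihood(obs):
    # alpha[s] = P(o_1..o_t, state at time t is s); sums over all paths
    # without enumerating them explicitly.
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[r] * trans[r][s] for r in states) * emit[s][o]
                 for s in states}
    return sum(alpha.values())

print(likelihood([3, 1, 3]))
```

Replacing the inner sum with a max (and keeping backpointers) turns this into the Viterbi decoder for the decoding problem.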
&lt;br /&gt;
====Thomas - HMM in R====&lt;br /&gt;
three packages: &lt;br /&gt;
HMM - doesn&amp;#039;t allow for multiple chains&lt;br /&gt;
hmm.discnp&lt;br /&gt;
msm - allows for continuous-time HMMs vs discrete time steps; you can fit the time in between states&lt;br /&gt;
&lt;br /&gt;
====Glen - protein prediction====&lt;br /&gt;
&lt;br /&gt;
[http://www.uniprot.org uniprot.org], fasta&lt;br /&gt;
&lt;br /&gt;
&amp;gt;sp|P69906|HBA_PANPA Hemoglobin subunit alpha OS=Pan paniscus GN=HBA1 PE=1 SV=2&lt;br /&gt;
MVLSPADKTNVKAAWGKVGAHAGEYGAEALERMFLSFPTTKTYFPHFDLSHGSAQVKGHG&lt;br /&gt;
KKVADALTNAVAHVDDMPNALSALSDLHAHKLRVDPVNFKLLSHCLLVTLAAHLPAEFTP&lt;br /&gt;
AVHASLDKFLASVSTVLTSKYR&lt;br /&gt;
&lt;br /&gt;
&amp;gt;sp|P69907|HBA_PANTR Hemoglobin subunit alpha OS=Pan troglodytes GN=HBA1 PE=1 SV=2&lt;br /&gt;
MVLSPADKTNVKAAWGKVGAHAGEYGAEALERMFLSFPTTKTYFPHFDLSHGSAQVKGHG&lt;br /&gt;
KKVADALTNAVAHVDDMPNALSALSDLHAHKLRVDPVNFKLLSHCLLVTLAAHLPAEFTP&lt;br /&gt;
AVHASLDKFLASVSTVLTSKYR&lt;br /&gt;
&lt;br /&gt;
&amp;gt;sp|P01942|HBA_MOUSE Hemoglobin subunit alpha OS=Mus musculus GN=Hba PE=1 SV=2&lt;br /&gt;
MVLSGEDKSNIKAAWGKIGGHGAEYGAEALERMFASFPTTKTYFPHFDVSHGSAQVKGHG&lt;br /&gt;
KKVADALASAAGHLDDLPGALSALSDLHAHKLRVDPVNFKLLSHCLLVTLASHHPADFTP&lt;br /&gt;
AVHASLDKFLASVSTVLTSKYR&lt;br /&gt;
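A quick way to compare the entries above is percent identity over the aligned positions (the sequences are copied from the FASTA records; this naive version assumes equal lengths and no gaps):

```python
# Alpha-globin sequences from the FASTA entries above (bonobo and mouse).
panpa = ("MVLSPADKTNVKAAWGKVGAHAGEYGAEALERMFLSFPTTKTYFPHFDLSHGSAQVKGHG"
         "KKVADALTNAVAHVDDMPNALSALSDLHAHKLRVDPVNFKLLSHCLLVTLAAHLPAEFTP"
         "AVHASLDKFLASVSTVLTSKYR")
mouse = ("MVLSGEDKSNIKAAWGKIGGHGAEYGAEALERMFASFPTTKTYFPHFDVSHGSAQVKGHG"
         "KKVADALASAAGHLDDLPGALSALSDLHAHKLRVDPVNFKLLSHCLLVTLASHHPADFTP"
         "AVHASLDKFLASVSTVLTSKYR")

def identity(a, b):
    # fraction of positions where the two sequences carry the same residue
    return sum(x == y for x, y in zip(a, b)) / len(a)

print(round(identity(panpa, mouse), 3))
```

The two Pan sequences in the notes are literally identical, while the mouse sequence differs at a number of positions, which is what makes it a useful outgroup for the alignment discussion.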
&lt;br /&gt;
*BLOSUM62 is used for determining the probabilities of protein mutations (i.e. V -&amp;gt; I is more likely than V -&amp;gt; W)&lt;br /&gt;
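The BLOSUM62 point can be illustrated with a few hard-coded scores (the three values below are copied from the standard BLOSUM62 matrix; the lookup helper is just for illustration):

```python
# BLOSUM62 log-odds scores: higher means the substitution is more often
# accepted in evolution. Only three entries of the full 20x20 matrix shown.
blosum62 = {
    ("V", "V"): 4,    # exact match
    ("V", "I"): 3,    # conservative swap: both small hydrophobic residues
    ("V", "W"): -3,   # drastic change, heavily penalized
}

def score(a, b):
    # the matrix is symmetric, so fall back to the flipped pair
    return blosum62.get((a, b), blosum62.get((b, a)))

print(score("V", "I"), score("V", "W"))  # 3 -3
```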
&lt;br /&gt;
====Mike - Speech recognition====&lt;br /&gt;
&lt;br /&gt;
*Fourier transform takes a wave and turns it into its component frequencies&lt;br /&gt;
*spectrogram - a time-frequency representation&lt;br /&gt;
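The Fourier-transform bullet can be made concrete with a minimal stdlib-only DFT (a sketch, not an efficient FFT; the 64-sample tone is invented):

```python
import cmath
import math

# Minimal discrete Fourier transform: project the signal onto each frequency.
def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

# 64 samples of a pure tone completing 5 cycles in the window
wave = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
mags = [abs(c) for c in dft(wave)]
peak = mags.index(max(mags))
print(peak)  # 5: the energy sits in bin 5 (and its mirror image, bin 59)
```

Sliding a short window along the signal and taking the DFT of each window is exactly what produces the spectrogram mentioned above.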
*Speech and Language Processing: Daniel Jurafsky and James Martin&lt;/div&gt;</summary>
		<author><name>SpammerHellDontDelete</name></author>
	</entry>
</feed>