<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://replica.wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=ThomasLotze</id>
	<title>Noisebridge - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://replica.wiki.extremist.software/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=ThomasLotze"/>
	<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/wiki/Special:Contributions/ThomasLotze"/>
	<updated>2026-04-06T22:11:31Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.13</generator>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13812</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13812"/>
		<updated>2010-11-11T05:36:03Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Projected Topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting ===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 11/10/2010 @ 8:15-9:15pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Ridge regression and algorithm parameter selection&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: Erin&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* [[CS229]] second problem set&lt;br /&gt;
* Independent Component Analysis (Mike, unscheduled -- possibly 11/24)&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled -- possibly 12/15)&lt;br /&gt;
* RPy?&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[DataSF.org]]&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVMs, trees, etc.)&lt;br /&gt;
** [http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
** [http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-11-03]] -- GLMs in R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-10-27]] -- Linear Classification with scikits.learn&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13731</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13731"/>
		<updated>2010-11-06T19:46:08Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Meeting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting ===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 11/10/2010 @ 8:15-9:15pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Ridge regression and algorithm parameter selection&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: Erin&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* [[CS229]] second problem set&lt;br /&gt;
* Independent Component Analysis (Mike, unscheduled -- possibly 11/24)&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled -- possibly 12/15)&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[DataSF.org]]&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVMs, trees, etc.)&lt;br /&gt;
** [http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
** [http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-11-03]] -- GLMs in R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-10-27]] -- Linear Classification with scikits.learn&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13730</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13730"/>
		<updated>2010-11-06T19:45:11Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Projected Topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting ===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 11/10/2010 @ 8:15-9:15pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Ridge regression and model parameter selection&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: Erin&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* [[CS229]] second problem set&lt;br /&gt;
* Independent Component Analysis (Mike, unscheduled -- possibly 11/24)&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled -- possibly 12/15)&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[DataSF.org]]&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVMs, trees, etc.)&lt;br /&gt;
** [http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
** [http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-11-03]] -- GLMs in R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-10-27]] -- Linear Classification with scikits.learn&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13729</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13729"/>
		<updated>2010-11-06T19:44:20Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting ===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 11/10/2010 @ 8:15-9:15pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Ridge regression and model parameter selection&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: Erin&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* [[CS229]] second problem set&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled -- possibly 12/15)&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[DataSF.org]]&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVMs, trees, etc.)&lt;br /&gt;
** [http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
** [http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-11-03]] -- GLMs in R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-10-27]] -- Linear Classification with scikits.learn&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-11-03&amp;diff=13675</id>
		<title>Machine Learning Meetup Notes: 2010-11-03</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-11-03&amp;diff=13675"/>
		<updated>2010-11-03T05:08:56Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;==Slides== * https://docs.google.com/present/edit?id=0AbsiP6ppTHqbZGdkajc4aGtfMjAyc2NuNGN4ZmY&amp;amp;hl=en&amp;amp;authkey=CJrup_cL  ==Setup instructions==  ===Install R=== * Download a precomp…&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Slides==&lt;br /&gt;
* https://docs.google.com/present/edit?id=0AbsiP6ppTHqbZGdkajc4aGtfMjAyc2NuNGN4ZmY&amp;amp;hl=en&amp;amp;authkey=CJrup_cL&lt;br /&gt;
&lt;br /&gt;
==Setup instructions==&lt;br /&gt;
&lt;br /&gt;
===Install R===&lt;br /&gt;
* Download a precompiled binary from http://cran.r-project.org/&lt;br /&gt;
&lt;br /&gt;
===Download the example code===&lt;br /&gt;
* mkdir ~/glm; cd ~/glm; wget http://www.thomaslotze.com/glm.zip; unzip glm.zip&lt;br /&gt;
&lt;br /&gt;
==Code examples==&lt;br /&gt;
&lt;br /&gt;
===Run the logistic GLM example===&lt;br /&gt;
* R CMD BATCH ~/glm/binomial_example.r&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13644</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13644"/>
		<updated>2010-11-02T05:03:13Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Projected Topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 11/3/2010 @ 7:30-9:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: GLMs in R&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: Thomas&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* [[CS229]] second problem set&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled -- possibly 12/15)&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[DataSF.org]]&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning Course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free-Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
**[http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc.)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-11-03]] -- GLMs in R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-10-27]] -- Linear Classification with scikits.learn&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=DataSF.org&amp;diff=13241</id>
		<title>DataSF.org</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=DataSF.org&amp;diff=13241"/>
		<updated>2010-10-07T03:31:57Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;http://www.datasf.org/  The idea is to try to come up with some application for the techniques from the CS229 machine learning course.&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;http://www.datasf.org/&lt;br /&gt;
&lt;br /&gt;
The idea is to try to come up with some application for the techniques from the [[CS229]] machine learning course.&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13240</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13240"/>
		<updated>2010-10-07T03:31:11Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 10/6/2010 @ 7:30-9:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Self Organizing Maps&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: Christian&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* Cross-validation (Mike, 10/13)&lt;br /&gt;
* GLMs in R (Thomas, 10/27)&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled)&lt;br /&gt;
* [[CS229]] second problem set&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[DataSF.org]]&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning Course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free-Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
**[http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc.)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13239</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=13239"/>
		<updated>2010-10-07T03:28:45Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Projected Topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 10/6/2010 @ 7:30-9:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Self Organizing Maps&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: Christian&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* Cross-validation (Mike, 10/13)&lt;br /&gt;
* GLMs in R (Thomas, 10/27)&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled)&lt;br /&gt;
* [[CS229]] second problem set&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning Course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free-Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
**[http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc.)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=12887</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=12887"/>
		<updated>2010-09-25T19:53:52Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Meeting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 9/29/2010 @ 7:30-9:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Stanford Machine Learning [[CS229]]: finish and bring [[File:CS229 ps1.pdf | Problem Set 1]] to compare answers/discuss, also discussion of Lecture 5 (and earlier)&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: (group)&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* [[CS229]] first problem set (9/29)&lt;br /&gt;
* Self-organizing Maps (Christian, 10/6)&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled)&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning Course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free-Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
**[http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc.)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=12715</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=12715"/>
		<updated>2010-09-16T04:05:21Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Projected Topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 9/22/2010 @ 7:30-9:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Stanford Machine Learning [[CS229]]: Lecture 4 (and earlier)&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenter: (group)&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* [[CS229]] first problem set (9/29)&lt;br /&gt;
* Self-organizing Maps (Christian, 10/6)&lt;br /&gt;
* Boosting and Bagging (Thomas, unscheduled)&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
**[http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc.)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-cluster-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-09-15]] -- Information Retrieval talk&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12461</id>
		<title>CS229</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12461"/>
		<updated>2010-08-26T03:24:10Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Overview ==&lt;br /&gt;
CS229 is the undergraduate machine learning course at Stanford. You can watch the lectures on iTunes U and YouTube. We will be working through the course at one lecture a week, starting 1 September 2010 and finishing 22 December 2010. There are four problem sets, which we&#039;ll be doing every 4 weeks.&lt;br /&gt;
&lt;br /&gt;
[http://www.stanford.edu/class/cs229/ http://www.stanford.edu/class/cs229/] &lt;br /&gt;
&lt;br /&gt;
=== Course Description ===&lt;br /&gt;
&lt;br /&gt;
This course provides a broad introduction to machine learning and&lt;br /&gt;
statistical pattern recognition. Topics include: supervised learning&lt;br /&gt;
(generative/discriminative learning, parametric/non-parametric&lt;br /&gt;
learning, neural networks, support vector machines); unsupervised&lt;br /&gt;
learning (clustering, dimensionality reduction, kernel methods);&lt;br /&gt;
learning theory (bias/variance tradeoffs; VC theory; large margins);&lt;br /&gt;
reinforcement learning and adaptive control. The course will also&lt;br /&gt;
discuss recent applications of machine learning, such as to robotic&lt;br /&gt;
control, data mining, autonomous navigation, bioinformatics, speech&lt;br /&gt;
recognition, and text and web data processing.&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
* one lecture a week&lt;br /&gt;
* one problem set every five weeks&lt;br /&gt;
&lt;br /&gt;
[http://www.google.com/calendar/embed?src=cWE3bGFpNnZxazdpamNjbmc4bXJsY2hyNGdAZ3JvdXAuY2FsZW5kYXIuZ29vZ2xlLmNvbQ  Google Calendar of schedule]&lt;br /&gt;
&lt;br /&gt;
==Progress: Watching Lectures ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Lecture 1&lt;br /&gt;
| Lecture 2&lt;br /&gt;
| Lecture 3&lt;br /&gt;
| Lecture 4&lt;br /&gt;
| Lecture 5&amp;lt;br /&amp;gt; 9/29&lt;br /&gt;
| Lecture 6&lt;br /&gt;
| Lecture 7&lt;br /&gt;
| Lecture 8&lt;br /&gt;
| Lecture 9&lt;br /&gt;
| Lecture 10&amp;lt;br /&amp;gt; 11/3&lt;br /&gt;
| Lecture 11&lt;br /&gt;
| Lecture 12&lt;br /&gt;
| Lecture 13&lt;br /&gt;
| Lecture 14&lt;br /&gt;
| Lecture 15&amp;lt;br /&amp;gt; 12/8&lt;br /&gt;
| Lecture 16&lt;br /&gt;
| Lecture 17&lt;br /&gt;
| Lecture 18&lt;br /&gt;
| Lecture 19&lt;br /&gt;
| Lecture 20&amp;lt;br /&amp;gt; 1/12&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Joe&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Progress: Assignments ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Problem set 1&amp;lt;br /&amp;gt; due 9/29&lt;br /&gt;
| Problem set 2&amp;lt;br /&amp;gt; due 11/3&lt;br /&gt;
| Problem set 3&amp;lt;br /&amp;gt; due 12/8&lt;br /&gt;
| Problem set 4&amp;lt;br /&amp;gt; due 1/20&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Joe&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12460</id>
		<title>CS229</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12460"/>
		<updated>2010-08-26T03:23:17Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Progress: Watching Lectures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Overview ==&lt;br /&gt;
CS229 is the undergraduate machine learning course at Stanford. You can watch the lectures on iTunes U and YouTube. We will be working through the course at one lecture a week, starting 1 September 2010 and finishing 22 December 2010. There are four problem sets, which we&#039;ll be doing every 4 weeks.&lt;br /&gt;
&lt;br /&gt;
[http://www.stanford.edu/class/cs229/ http://www.stanford.edu/class/cs229/] &lt;br /&gt;
&lt;br /&gt;
=== Course Description ===&lt;br /&gt;
&lt;br /&gt;
This course provides a broad introduction to machine learning and&lt;br /&gt;
statistical pattern recognition. Topics include: supervised learning&lt;br /&gt;
(generative/discriminative learning, parametric/non-parametric&lt;br /&gt;
learning, neural networks, support vector machines); unsupervised&lt;br /&gt;
learning (clustering, dimensionality reduction, kernel methods);&lt;br /&gt;
learning theory (bias/variance tradeoffs; VC theory; large margins);&lt;br /&gt;
reinforcement learning and adaptive control. The course will also&lt;br /&gt;
discuss recent applications of machine learning, such as to robotic&lt;br /&gt;
control, data mining, autonomous navigation, bioinformatics, speech&lt;br /&gt;
recognition, and text and web data processing.&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
* one lecture a week&lt;br /&gt;
* one problem set every five weeks&lt;br /&gt;
&lt;br /&gt;
[http://www.google.com/calendar/embed?src=cWE3bGFpNnZxazdpamNjbmc4bXJsY2hyNGdAZ3JvdXAuY2FsZW5kYXIuZ29vZ2xlLmNvbQ  Google Calendar of schedule]&lt;br /&gt;
&lt;br /&gt;
==Progress: Watching Lectures ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Lecture 1&lt;br /&gt;
| Lecture 2&lt;br /&gt;
| Lecture 3&lt;br /&gt;
| Lecture 4&lt;br /&gt;
| Lecture 5&amp;lt;br /&amp;gt; 9/29&lt;br /&gt;
| Lecture 6&lt;br /&gt;
| Lecture 7&lt;br /&gt;
| Lecture 8&lt;br /&gt;
| Lecture 9&lt;br /&gt;
| Lecture 10&amp;lt;br /&amp;gt; 11/3&lt;br /&gt;
| Lecture 11&lt;br /&gt;
| Lecture 12&lt;br /&gt;
| Lecture 13&lt;br /&gt;
| Lecture 14&lt;br /&gt;
| Lecture 15&amp;lt;br /&amp;gt; 12/8&lt;br /&gt;
| Lecture 16&lt;br /&gt;
| Lecture 17&lt;br /&gt;
| Lecture 18&lt;br /&gt;
| Lecture 19&lt;br /&gt;
| Lecture 20&amp;lt;br /&amp;gt; 1/12&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Joe&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Progress: Assignments ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Problem set 1&lt;br /&gt;
| Problem set 2&lt;br /&gt;
| Problem set 3&lt;br /&gt;
| Problem set 4&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Joe&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12458</id>
		<title>CS229</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12458"/>
		<updated>2010-08-26T03:15:04Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Overview */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Overview ==&lt;br /&gt;
CS229 is the undergraduate machine learning course at Stanford. You can watch the lectures on iTunes U and YouTube. We will be working through the course at one lecture a week, starting 1 September 2010 and finishing 22 December 2010. There are four problem sets, which we&#039;ll be doing every 4 weeks.&lt;br /&gt;
&lt;br /&gt;
[http://www.stanford.edu/class/cs229/ http://www.stanford.edu/class/cs229/] &lt;br /&gt;
&lt;br /&gt;
=== Course Description ===&lt;br /&gt;
&lt;br /&gt;
This course provides a broad introduction to machine learning and&lt;br /&gt;
statistical pattern recognition. Topics include: supervised learning&lt;br /&gt;
(generative/discriminative learning, parametric/non-parametric&lt;br /&gt;
learning, neural networks, support vector machines); unsupervised&lt;br /&gt;
learning (clustering, dimensionality reduction, kernel methods);&lt;br /&gt;
learning theory (bias/variance tradeoffs; VC theory; large margins);&lt;br /&gt;
reinforcement learning and adaptive control. The course will also&lt;br /&gt;
discuss recent applications of machine learning, such as to robotic&lt;br /&gt;
control, data mining, autonomous navigation, bioinformatics, speech&lt;br /&gt;
recognition, and text and web data processing.&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
[http://www.google.com/calendar/embed?src=cWE3bGFpNnZxazdpamNjbmc4bXJsY2hyNGdAZ3JvdXAuY2FsZW5kYXIuZ29vZ2xlLmNvbQ  Google Calendar of schedule]&lt;br /&gt;
&lt;br /&gt;
==Progress: Watching Lectures ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Lecture 1&lt;br /&gt;
| Lecture 2&lt;br /&gt;
| Lecture 3&lt;br /&gt;
| Lecture 4&lt;br /&gt;
| Lecture 5&lt;br /&gt;
| Lecture 6&lt;br /&gt;
| Lecture 7&lt;br /&gt;
| Lecture 8&lt;br /&gt;
| Lecture 9&lt;br /&gt;
| Lecture 10&lt;br /&gt;
| Lecture 11&lt;br /&gt;
| Lecture 12&lt;br /&gt;
| Lecture 13&lt;br /&gt;
| Lecture 14&lt;br /&gt;
| Lecture 15&lt;br /&gt;
| Lecture 16&lt;br /&gt;
| Lecture 17&lt;br /&gt;
| Lecture 18&lt;br /&gt;
| Lecture 19&lt;br /&gt;
| Lecture 20&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Joe&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Progress: Assignments ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Problem set 1&lt;br /&gt;
| Problem set 2&lt;br /&gt;
| Problem set 3&lt;br /&gt;
| Problem set 4&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Joe&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12457</id>
		<title>CS229</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12457"/>
		<updated>2010-08-26T03:14:39Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Overview ==&lt;br /&gt;
CS229 is the undergraduate machine learning course at Stanford. You can watch the lectures on iTunes U and YouTube. We will be working through the course at one lecture a week, starting 1 September 2010 and finishing 22 December 2010. There are four problem sets, which we&#039;ll be doing every 4 weeks.&lt;br /&gt;
&lt;br /&gt;
[http://www.stanford.edu/class/cs229/] &lt;br /&gt;
&lt;br /&gt;
=== Course Description ===&lt;br /&gt;
&lt;br /&gt;
This course provides a broad introduction to machine learning and&lt;br /&gt;
statistical pattern recognition. Topics include: supervised learning&lt;br /&gt;
(generative/discriminative learning, parametric/non-parametric&lt;br /&gt;
learning, neural networks, support vector machines); unsupervised&lt;br /&gt;
learning (clustering, dimensionality reduction, kernel methods);&lt;br /&gt;
learning theory (bias/variance tradeoffs; VC theory; large margins);&lt;br /&gt;
reinforcement learning and adaptive control. The course will also&lt;br /&gt;
discuss recent applications of machine learning, such as to robotic&lt;br /&gt;
control, data mining, autonomous navigation, bioinformatics, speech&lt;br /&gt;
recognition, and text and web data processing.&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
[http://www.google.com/calendar/embed?src=cWE3bGFpNnZxazdpamNjbmc4bXJsY2hyNGdAZ3JvdXAuY2FsZW5kYXIuZ29vZ2xlLmNvbQ  Google Calendar of schedule]&lt;br /&gt;
&lt;br /&gt;
==Progress: Watching Lectures ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Lecture 1&lt;br /&gt;
| Lecture 2&lt;br /&gt;
| Lecture 3&lt;br /&gt;
| Lecture 4&lt;br /&gt;
| Lecture 5&lt;br /&gt;
| Lecture 6&lt;br /&gt;
| Lecture 7&lt;br /&gt;
| Lecture 8&lt;br /&gt;
| Lecture 9&lt;br /&gt;
| Lecture 10&lt;br /&gt;
| Lecture 11&lt;br /&gt;
| Lecture 12&lt;br /&gt;
| Lecture 13&lt;br /&gt;
| Lecture 14&lt;br /&gt;
| Lecture 15&lt;br /&gt;
| Lecture 16&lt;br /&gt;
| Lecture 17&lt;br /&gt;
| Lecture 18&lt;br /&gt;
| Lecture 19&lt;br /&gt;
| Lecture 20&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Joe&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Progress: Assignments ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Problem set 1&lt;br /&gt;
| Problem set 2&lt;br /&gt;
| Problem set 3&lt;br /&gt;
| Problem set 4&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Joe&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=12456</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=12456"/>
		<updated>2010-08-26T03:11:52Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Meeting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 9/8/2010 @ 7:30-9:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Undetermined; certainly to include talking about the first two CS229 lectures&lt;br /&gt;
*Details: &lt;br /&gt;
*Presenters:&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* Boosting and Bagging (Thomas)&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
[[CS229]] - The Stanford Machine Learning course @ Noisebridge&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
**[http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc.)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-cluster-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12449</id>
		<title>CS229</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=CS229&amp;diff=12449"/>
		<updated>2010-08-26T03:04:04Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Progress */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Schedule ==&lt;br /&gt;
&lt;br /&gt;
[http://www.google.com/calendar/embed?src=cWE3bGFpNnZxazdpamNjbmc4bXJsY2hyNGdAZ3JvdXAuY2FsZW5kYXIuZ29vZ2xlLmNvbQ  Google Calendar of schedule]&lt;br /&gt;
&lt;br /&gt;
==Progress ==&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; cellpadding=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
| Name&lt;br /&gt;
| Problem set 1&lt;br /&gt;
| Problem set 2&lt;br /&gt;
| Problem set 3&lt;br /&gt;
| Problem set 4&lt;br /&gt;
|-&lt;br /&gt;
| You!&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
| Thomas&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12448</id>
		<title>Machine Learning Meetup Notes: 2010-08-25</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12448"/>
		<updated>2010-08-26T03:03:33Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Full details will be at [[CS229]]&lt;br /&gt;
&lt;br /&gt;
We will be going through the Stanford course [http://www.stanford.edu/class/cs229/ Machine Learning CS 229], one lecture per week.&lt;br /&gt;
&lt;br /&gt;
We will discuss the lectures over the normal machine learning mailing list (creating a new mailing list if people complain about too much discussion)&lt;br /&gt;
&lt;br /&gt;
We will also be able to talk about the lecture topics after (or as the main focus of) the weekly Machine Learning meetings.&lt;br /&gt;
&lt;br /&gt;
Assignments will be due each month.&lt;br /&gt;
&lt;br /&gt;
Micah created a [http://www.google.com/calendar/embed?src=cWE3bGFpNnZxazdpamNjbmc4bXJsY2hyNGdAZ3JvdXAuY2FsZW5kYXIuZ29vZ2xlLmNvbQ Google calendar] with the schedule.&lt;br /&gt;
&lt;br /&gt;
We will keep track of our progress in watching lectures and completing problem sets on the wiki (to keep each other honest and motivated)&lt;br /&gt;
&lt;br /&gt;
Interested Participants:&lt;br /&gt;
* Glen&lt;br /&gt;
* Erin&lt;br /&gt;
* Paul&lt;br /&gt;
* Micah&lt;br /&gt;
* Thomas&lt;br /&gt;
* Shahin&lt;br /&gt;
* Tim&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12446</id>
		<title>Machine Learning Meetup Notes: 2010-08-25</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12446"/>
		<updated>2010-08-26T02:54:49Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Full details will be at [[CS229]]&lt;br /&gt;
&lt;br /&gt;
We will be going through the Stanford course [http://www.stanford.edu/class/cs229/ Machine Learning CS 229], one lecture per week.&lt;br /&gt;
&lt;br /&gt;
We will discuss the lectures over the normal machine learning mailing list (creating a new mailing list if people complain about too much discussion)&lt;br /&gt;
&lt;br /&gt;
We will also be able to talk about the lecture topics after (or as the main focus of) the weekly Machine Learning meetings&lt;br /&gt;
&lt;br /&gt;
Micah will create a Google calendar with the schedule&lt;br /&gt;
&lt;br /&gt;
We will keep track of our progress in watching lectures and completing problem sets on the wiki (to keep each other honest and motivated)&lt;br /&gt;
&lt;br /&gt;
Interested Participants:&lt;br /&gt;
* Glen&lt;br /&gt;
* Erin&lt;br /&gt;
* Paul&lt;br /&gt;
* Micah&lt;br /&gt;
* Thomas&lt;br /&gt;
* Shahin&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12444</id>
		<title>Machine Learning Meetup Notes: 2010-08-25</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12444"/>
		<updated>2010-08-26T02:53:21Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We will be going through the Stanford course [http://www.stanford.edu/class/cs229/ Machine Learning CS 229], one lecture per week.&lt;br /&gt;
&lt;br /&gt;
We will discuss the lectures over the normal machine learning mailing list (creating a new mailing list if people complain about too much discussion)&lt;br /&gt;
&lt;br /&gt;
We will also be able to talk about the lecture topics after (or as the main focus of) the weekly Machine Learning meetings&lt;br /&gt;
&lt;br /&gt;
Micah will create a Google calendar with the schedule&lt;br /&gt;
&lt;br /&gt;
We will keep track of our progress in watching lectures and completing problem sets on the wiki (to keep each other honest and motivated)&lt;br /&gt;
&lt;br /&gt;
Interested Participants:&lt;br /&gt;
* Glen&lt;br /&gt;
* Erin&lt;br /&gt;
* Paul&lt;br /&gt;
* Micah&lt;br /&gt;
* Thomas&lt;br /&gt;
* Shahin&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12443</id>
		<title>Machine Learning Meetup Notes: 2010-08-25</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12443"/>
		<updated>2010-08-26T02:52:35Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We will be going through the Stanford course [http://www.stanford.edu/class/cs229/ Machine Learning CS 229], one lecture per week.&lt;br /&gt;
&lt;br /&gt;
We will discuss the lectures over the normal machine learning mailing list (creating a new mailing list if people complain about too much discussion)&lt;br /&gt;
&lt;br /&gt;
We will also be able to talk about the lecture topics after (or as the main focus of) the weekly Machine Learning meetings&lt;br /&gt;
&lt;br /&gt;
Micah will create a Google calendar with the schedule&lt;br /&gt;
&lt;br /&gt;
We will keep track of our progress in watching lectures and completing problem sets on the wiki (to keep each other honest and motivated)&lt;br /&gt;
&lt;br /&gt;
Interested Participants:&lt;br /&gt;
* Glen&lt;br /&gt;
* Erin&lt;br /&gt;
* Paul&lt;br /&gt;
* Micah&lt;br /&gt;
* Thomas&lt;br /&gt;
* Shahin&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=12442</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=12442"/>
		<updated>2010-08-26T02:49:57Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Notes from Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 8/25/2010 @ 7:30-9:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Random Group Discussion&lt;br /&gt;
*Details: Let&#039;s talk about Joe&#039;s idea for a study group using the CS229 Course&lt;br /&gt;
*Presenters: &lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Tools ===&lt;br /&gt;
* [[Machine Learning/weka]]&lt;br /&gt;
* [[Machine Learning/moa]]&lt;br /&gt;
* [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
=== Next Projected Topics ===&lt;br /&gt;
* Starting out in R (Erin)&lt;br /&gt;
* Boosting and Bagging (Thomas)&lt;br /&gt;
* Hidden Markov Models (Thomas)&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**[[Machine_Learning/HMM|Hidden Markov Models]]&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: Gaussian distributions, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDK&#039;s&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-25]] -- Organizing to go through CS 229&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-08-18]] -- Hidden Markov Models (HMMs)&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-21]] -- Intro to R&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-14]] -- Neural Networks &lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-07-07]] -- Kaggle HIV, Edit Distance&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-30]] -- DNA Overview, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-22]] -- PIG Tutorial&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-16]] -- MOA, Kaggle HIV&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-09]] -- KDD Recap, JUNG/Graph Clustering&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12441</id>
		<title>Machine Learning Meetup Notes: 2010-08-25</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-08-25&amp;diff=12441"/>
		<updated>2010-08-26T02:49:32Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;We will be going through the Stanford course [http://www.stanford.edu/class/cs229/ Machine Learning CS 229], one lecture per week.  We will discuss the lectures over the normal m…&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We will be going through the Stanford course [http://www.stanford.edu/class/cs229/ Machine Learning CS 229], one lecture per week.&lt;br /&gt;
&lt;br /&gt;
We will discuss the lectures over the normal machine learning mailing list (creating a new mailing list if people complain about too much discussion)&lt;br /&gt;
&lt;br /&gt;
We will also be able to talk about the lecture topics after (or as the main focus of) the weekly Machine Learning meetings&lt;br /&gt;
&lt;br /&gt;
Micah will create a Google calendar with the schedule&lt;br /&gt;
&lt;br /&gt;
Interested Participants:&lt;br /&gt;
* Glen&lt;br /&gt;
* Erin&lt;br /&gt;
* Paul&lt;br /&gt;
* Micah&lt;br /&gt;
* Thomas&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Category:Events&amp;diff=12440</id>
		<title>Category:Events</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Category:Events&amp;diff=12440"/>
		<updated>2010-08-26T02:45:23Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Recurring Events edit */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;!-- Note that this page uses transclusion. Content between the &amp;quot;onlyinclude&amp;quot; tags below will be pushed to the main page --&amp;gt;&lt;br /&gt;
Official, Semi-Official, one-off and other events at the Noisebridge space.&lt;br /&gt;
&lt;br /&gt;
=Event Calendar=&lt;br /&gt;
Not all events make it onto this calendar. Many events are only announced on the Discussion or Announcements [[Mailinglist | mailing lists]], on [[IRC]], or in person at [[:Category:Meeting_Notes | Tuesday meetings]]. Best of all, Noisebridge is about people getting together at the space in San Francisco to do stuff... like in person. Some events just happen. Pay attention!&lt;br /&gt;
&lt;br /&gt;
Event posters are encouraged to crosspost to the Google Calendar. View the  [http://www.google.com/calendar/embed?src=vo3i3c0qtjnkjr2ojasd0ftt8s%40group.calendar.google.com&amp;amp;ctz=America/Los_Angeles Google Calendar], view the [http://www.google.com/calendar/feeds/vo3i3c0qtjnkjr2ojasd0ftt8s%40group.calendar.google.com/public/basic Google Calendar in XML], or the [http://www.google.com/calendar/ical/vo3i3c0qtjnkjr2ojasd0ftt8s%40group.calendar.google.com/public/basic.ics Google Calendar in ical] format.&lt;br /&gt;
&lt;br /&gt;
To post Google Calendar entries for your event, contact a Noisebridge member for access.&lt;br /&gt;
&lt;br /&gt;
(Wouldn&#039;t it be great if there were a gCal mediawiki plugin so crossposting wasn&#039;t needed? Do you know of a good one? Help us!) &amp;lt;- working on this, need to upgrade Mediawiki in order to use some plugins.&lt;br /&gt;
&amp;lt;!-- Items inside this &amp;quot;onlyinclude&amp;quot; tag will be pushed to the main page --&amp;gt;&amp;lt;onlyinclude&amp;gt;&lt;br /&gt;
=== Upcoming Events &amp;lt;small&amp;gt;[https://www.noisebridge.net/index.php?title=Category:Events&amp;amp;action=edit&amp;amp;section=2 edit]&amp;lt;/small&amp;gt; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Sunday, August 22, 19:00 CLUB-MATE DROPOFF AND TASTING PARTY&#039;&#039;&#039; Nick Farr will be in town to drop off Club-Mate ordered by San Franciscans!&lt;br /&gt;
&lt;br /&gt;
=== Recurring Events &amp;lt;small&amp;gt;[https://www.noisebridge.net/index.php?title=Category:Events&amp;amp;action=edit&amp;amp;section=3 edit]&amp;lt;/small&amp;gt; ===&lt;br /&gt;
&amp;lt;!-- Large turnout events should be written in &#039;&#039;&#039;bold&#039;&#039;&#039;. --&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;Monday&#039;&#039;&#039;&lt;br /&gt;
** Anytime [https://www.noisebridge.net/wiki/House_Keeping#Trash_and_Recycling Trash Night] - Don&#039;t forget to take out the trash for Tuesday morning!&lt;br /&gt;
** &#039;&#039;&#039;18:00 [[iPhone OS developer weekly meetup]]&#039;&#039;&#039; - NOTE: July 5th meeting cancelled. We make teh applukashuns, joyn us 2 make dem 2! http://meetup.com/iphonedevsf&lt;br /&gt;
** 18:30 [[PyClass]] - Learn how to program using the Python programming language.&lt;br /&gt;
** &#039;&#039;&#039;19:00 [[Circuit Hacking Mondays]]&#039;&#039;&#039; - Learn to solder! Mitch will bring kits to make cool, hackable things that you can bring home after you make them.  Bring your own projects to hack!&lt;br /&gt;
** 19:00 1st and 3rd Mondays: the BACE Timebank group meets to help organize community mutual aid by trading in equal time credits, wherever there is space. For more info, contact mira (at) sfbace.org; to join, go to timebank.sfbace.org&lt;br /&gt;
* &#039;&#039;&#039;Tuesday&#039;&#039;&#039;&lt;br /&gt;
** 15:00 [[Linux System Administration Study Group]] - Study Linux admining in the Turing classroom.&lt;br /&gt;
** 18:30 Bay Area Community Exchange Project Roundtable Meeting (third Tues. of every month) - discussion of alternative currencies in the back classroom.&lt;br /&gt;
** 19:00 [[Origami|Learn You A Origami!]] - Learn how to make folded-paper models. Beginners welcome!&lt;br /&gt;
** &#039;&#039;&#039;20:00 [[#Meetings|Noisebridge Weekly Meeting]]&#039;&#039;&#039; - Introducing new people and events to the space, general discussion, and decision making.&lt;br /&gt;
** 20:30 [[Spacebridge]] - Noisebridge&#039;s space program (project update meeting) &lt;br /&gt;
* &#039;&#039;&#039;Wednesday&#039;&#039;&#039;&lt;br /&gt;
** 18:00 [[LinuxDiscussion|Linux Discussion]] - Play with Linux in the Turing classroom.&lt;br /&gt;
** &#039;&#039;&#039;17:00 [[BarCamp Staff Meeting]]&#039;&#039;&#039; - Meeting for BarCamp Staff to discuss plans for San Francisco BarCamp.&lt;br /&gt;
** &#039;&#039;&#039;18:00 [[Gamebridge|Gamebridge Unityversity]]&#039;&#039;&#039; - Collab and learn to make video games with geeks; if it&#039;s your first night, you will actually get to make a game!&lt;br /&gt;
** &#039;&#039;&#039;19:00 [[SCoW]]&#039;&#039;&#039; - Sewing, Crafting, Or Whatever! Come make cool stuff with geeks.&lt;br /&gt;
** &#039;&#039;&#039;19:30 [[Machine Learning]]&#039;&#039;&#039; - Teach computers to learn stuff using artificial intelligence and other techniques.&lt;br /&gt;
* &#039;&#039;&#039;Thursday&#039;&#039;&#039;&lt;br /&gt;
** Anytime [https://www.noisebridge.net/wiki/House_Keeping#Trash_and_Recycling Trash Night] - Don&#039;t forget to take out the trash for Friday morning!&lt;br /&gt;
** 19:30 [[Games]] - Play games with geeks.&lt;br /&gt;
** 20:00 [http://baha.bitrot.info/ Bay Area Hacker&#039;s Association - security meeting] (2nd Thursdays)&lt;br /&gt;
** &#039;&#039;&#039;20:00 [[Five_Minutes_of_Fame | Five Minutes of Fame]]&#039;&#039;&#039; (3rd Thursdays)&lt;br /&gt;
* &#039;&#039;&#039;Friday&#039;&#039;&#039; &lt;br /&gt;
** 15:00 [[Linux System Administration Study Group]] - Study Linux admining in the Turing classroom. &lt;br /&gt;
** 19:00 [[Science, Engineering &amp;amp; Design Huddle]] - Weekly group to discuss design approach, share techniques, and solve any problem you may be having with your project(s).&lt;br /&gt;
** 20:00 [[Moving/2169 Mission/Buildout|2169 Buildout planning]] - Discussion &amp;amp; execution of how to renovate our new space.&lt;br /&gt;
** 18:00 [[RantMeet]] (1st Fridays) [http://www.rantmedia.ca Rant Media] is a global hacker/survival/indy media phyle that meets up around the world.&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Saturday&#039;&#039;&#039;&lt;br /&gt;
** 15:00 [[Zergbridge R&amp;amp;D]] 3pm-10pm and beyond - Come fend off zombies, aliens, pyros and scientific experiments gone mad at our weekly LAN party.  If you&#039;re interested in game development come earlier and you can collaborate on our weekly gaming project.&lt;br /&gt;
** 20:00 [[NSFW]] - Now Showing From the Web, last Saturday of the month. Share interesting videos you&#039;ve found on the web for the past month or bring content you made.&lt;br /&gt;
* &#039;&#039;&#039;Sunday&#039;&#039;&#039;&lt;br /&gt;
** &#039;&#039;&#039;15:00 [[Go]]&#039;&#039;&#039; - Playing of the Go boardgame. On nice days we often take the boards to Dolores Park and play there.&lt;br /&gt;
** 15:00 [[Locks!]] - Lock sport, sundays when there is demand. ( See [[locks!]] for more information. )&lt;br /&gt;
** 17:00 [[Rsync Users Group]] - A twelve step program for those who have poor *nix habits.&lt;br /&gt;
** 18:00 [[Spacebridge]] - Noisebridge&#039;s space program&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/onlyinclude&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Proposed Future Events and Classes ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[German]] - Learn German, all levels. 7pm beginners, 8pm advanced. RSVP 24 hours in advance for the benefit of the instructor. Events ran May-November 2009 on Mondays. Currently on hiatus. Get on the mailing list.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Mandarin Corner|Mandarin]] - Learn or practice Mandarin, all levels. Also currently on hiatus. Get on the mailing list.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Movie Night!]] - [[User:ThOMG|Thom]] wants to build community through nerdy sci-fi! (+Bill+Ted+Excellence++)&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Introduction to the AVR Microcontroller]] - [[User:Mightyohm|Jeff]] and [[User:Maltman23|Mitch]] are planning an introductory class for people wanting to make cool projects with AVRs.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Basic Chemistry Lab Techniques]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Cuddle Puddle for the Economy]] - Stress-hacking with informal massage exchange.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Milk and Cookies]] - Come read your favorite selections out loud. With Milk and Cookies (and yeah, probably beer too).&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Processing Workshop 2]] - [[User:Scmurray|Scott]] is interested in teaching this, and is busy thinking about what, where, when, why, and how.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;:  [[Hack your Hardware]] -- We call BS on &amp;quot;no user-serviceable parts inside&amp;quot;&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Homebrew Instruction Class]] - The Wort (pt 1/3)&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Trip to Shooting Range]] - Field trip to a shooting range, to shoot guns.  Express interest at [[Trip to Shooting Range]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Surface Mount Soldering Workshop]] - Learn how to solder circuits with small surface mount parts.  [[User:maltman23|Mitch Altman]] and Martin Bogomolni and others will show their tricks.  [[User:maltman23|Mitch]] will bring hackable kits that use surface mounts for you to solder.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039; - [[Locksport and Lockpicking]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039; - [[Version control tutorial]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039; - [[Foreign language learning for rocket scientists]] - I&#039;m near-native (fool people when I try) in (French and) Japanese, and a pro trans/terpreter, and will share my shortcuts (skill-order, vocab, speed/articulation, translation≅grammar). No expertise on tonal languages yet... so if you know how to remember tones or how tone-sandhi interacts with speed and/or how nuances of speaker attitude are expressed in them (what we do with rhythm/inflection/sentence-intonation and stress in Eng., and with particles and ??? in e.g. Cantonese) please chime in or call me (415-608-0564) so I can convey your wisdom. [also looking for a from-scratch Arabic partner]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Getting started with Arduino]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Distributed Databases]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Scrum Club]] - I thought I&#039;d test the waters and see if anyone was interested in a Noisebridge scrum club; details are here: http://scrumclub.org/scrum-clubs/ If interested, hit me up on twitter: @theabcasian, facebook: http://www.facebook.com/theabcasian&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[CNC Mill Workshop]] - Who wants to make stuff on the [[MaxNCMill]]?&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Math &amp;amp; Science Help]] - If you would like some math, science or engineering help, I&#039;m down to lend a hand.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Cyborg Group|Cyborg Group / Sensebridge]] - Work on projects like artificial senses.  Someone needs to lead this!&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[OpenEEG]] - Brain tech. Has historically met on Sundays, at the behest of interested parties.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Programming_for_Poets | Programming for Poets]] -  Gentle intro to programming using Processing&lt;br /&gt;
&lt;br /&gt;
= Past Events =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;June 5th, 12:00-19:00 - [[NoiseBridgeRehab]]&#039;&#039;&#039; - Help make the space more usable and accessible! Noisebridge needs your help!&lt;br /&gt;
* &#039;&#039;&#039;June 5th, 16:00-20:00 - [[Science For Juggalos]]&#039;&#039;&#039; - Science Fair in front of the Warfield Theater teaching magnetism to Juggalos&lt;br /&gt;
* &#039;&#039;&#039;June 6th, 15:00 - [[AVC Meetup]]&#039;&#039;&#039; - Entrepreneurial bonding &amp;amp; matchmaking&lt;br /&gt;
* &#039;&#039;&#039;June 9th, 21:00 - Your liver supports Noisebridge&#039;&#039;&#039; - Come to Elixir @ 16th &amp;amp; Guerrero anytime after 21:00 and drink, drink, drink! 50% of tips go to Noisebridge&lt;br /&gt;
* &#039;&#039;&#039;February 27th, 20:00 - [[Hacker EPROM]]&#039;&#039;&#039; - Noisebridge&#039;s first prom! Nice tie and a (robot) date required. We will have a DJ and punch.&lt;br /&gt;
* &#039;&#039;&#039;February 24th, 19:00, Wednesday - Joris Peels, of [http://www.shapeways.com Shapeways]&#039;&#039;&#039;, an expert on 3D printing, will give a [[ShaperwaysPresentation | talk and demonstration]] at Noisebridge!&lt;br /&gt;
* &#039;&#039;&#039;February 23rd, 18:00 - Cleaning day&#039;&#039;&#039; - Come and help clean Noisebridge, because everyone loves a clean hack space.&lt;br /&gt;
* &#039;&#039;&#039;February 12th, 21:00 - visit from Steve Jackson&#039;&#039;&#039;. Game designer [http://en.wikipedia.org/wiki/Steve_Jackson_%28US_game_designer%29 Steve Jackson], founder of Steve Jackson Games, will visit Noisebridge.&lt;br /&gt;
* &#039;&#039;&#039;January 27th, 18:00-20:00 - [[beatrixjar event|Circuit Bending Workshop]]&#039;&#039;&#039; - [http://www.beatrixjar.com/ Beatrix*JAR] (contact [[User:Gpvillamil|Gian Pablo]] for more info)&lt;br /&gt;
* &#039;&#039;&#039;January 27th, 20:00-22:00 - [[beatrixjar event|Circuit Bending Performance]]&#039;&#039;&#039; - [http://www.beatrixjar.com/ Beatrix*JAR] - &amp;quot;Celebrate a night of new sound that will change your idea of music forever!&amp;quot;&lt;br /&gt;
* &#039;&#039;&#039;January 25th, 19:30 - [[Bag Porn]]&#039;&#039;&#039; - What&#039;s in your bag?&lt;br /&gt;
* &#039;&#039;&#039;January 20th, 19:00-21:00 - [http://groups.google.com/group/bacat/about Bay Categories &amp;amp; Types]&#039;&#039;&#039; - Categories, monoids, monads, functors and more! Held in the Alonzo Church classroom.&lt;br /&gt;
* &#039;&#039;&#039;January 20th, 19:00 - [[User Experience Book Club SF]]&#039;&#039;&#039; - Our book this month is &amp;quot;A Theory of Fun for Game Design&amp;quot; by Raph Koster - http://is.gd/6sEqw (meets in Turing)&lt;br /&gt;
* &#039;&#039;&#039;January 21st, 20:00 - [[Five Minutes of Fame]]&#039;&#039;&#039; - Monthly set of lightning talks on diverse topics&lt;br /&gt;
* &#039;&#039;&#039;January 22nd, 17:00 - [[CleaningParty| Cleaning Party]]&#039;&#039;&#039; - Come help clean up Noisebridge! Awsum fun!&lt;br /&gt;
* &#039;&#039;&#039;January 14th, 16th, and 17th, 1:00 - ???&#039;&#039;&#039; - Build-out days for the kitchen/bathroom/laundry. Bring yourself and a good attitude, and learn a few things as well.&lt;br /&gt;
* &#039;&#039;&#039;January 15th, 18:00 - [[CNC_Mill_Workshop]]&#039;&#039;&#039; - Learn to use the CNC mill for 2D engraving and circuit board routing&lt;br /&gt;
* Thursdays 17:00 [[ASL Group|American Sign Language]] - Learn how to talk without using your voice (or just come chat in ASL). &amp;lt;small&amp;gt;[http://whenisgood.net/noisebridge/asl/generic click to reschedule]&amp;lt;/small&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;November 18th, 19:30&#039;&#039;&#039; - [[Dorkbot_2009_11_18|Dorkbot]]&lt;br /&gt;
* &#039;&#039;&#039;November 19th, 18:00&#039;&#039;&#039; - [[Mesh meetup]]&lt;br /&gt;
* &#039;&#039;&#039;November 19th, 20:00&#039;&#039;&#039; - [[Five Minutes of Fame]]&lt;br /&gt;
* &#039;&#039;&#039;November 20th, 18:00&#039;&#039;&#039; - Loud Objects [http://www.flickr.com/photos/createdigitalmedia/3428249036/ Noise Toy workshop].&lt;br /&gt;
* &#039;&#039;&#039;November 20th, 20:00&#039;&#039;&#039; - Performance by [http://www.loudobjects.com/ Loud Objects], (featuring Tristan Perich and Lesley Flanigan) and [http://www.myspace.com/jibkidder Jib Kidder].&lt;br /&gt;
:&#039;&#039;&#039;2009-11-05&#039;&#039;&#039; - [http://www.server-sky.com/ Server Sky presentation: Internet and Computation in Orbit] by Keith Lofstrom&lt;br /&gt;
:&#039;&#039;&#039;2009-11-05&#039;&#039;&#039; - [[Mesh meetup]]&lt;br /&gt;
:&#039;&#039;&#039;2009-11-02&#039;&#039;&#039; - [[French]] book club meeting to discuss  [http://www.amazon.com/exec/obidos/tg/detail/-/2842612892/ref=ord_cart_shr?_encoding=UTF8&amp;amp;m=ATVPDKIKX0DER&amp;amp;v=glance Une Si Longue Lettre]&lt;br /&gt;
: &#039;&#039;&#039; October 1st, 18:00&#039;&#039;&#039; - [[Wireless_Mesh_Network_Meetup | Mesh wireless meetup]]&lt;br /&gt;
: &#039;&#039;&#039; October 1st, 19:00&#039;&#039;&#039; - [http://groups.google.com/group/bacat Bay Area Categories and Types]&lt;br /&gt;
: &#039;&#039;&#039;2009-10-03&#039;&#039;&#039; [[Year 1 Open Hacker House]]&lt;br /&gt;
:&#039;&#039;&#039;Friday&#039;&#039;&#039;: [[CrazyCryptoNight]] - Discussion of cryptography for beginners through experts. 6-???&lt;br /&gt;
:&#039;&#039;&#039;Sunday&#039;&#039;&#039; : [[OpenEEG | OpenEEG Hacking]] Sundays, at 3-5pm.&lt;br /&gt;
:&#039;&#039;&#039;Tuesday&#039;&#039;&#039;: [[Haskell/Haschool]] - Learn Haskell with Jason Dusek.  6PM - 7:30PM, from May until we&#039;re all experts.&lt;br /&gt;
:&#039;&#039;&#039;Wednesday&#039;&#039;&#039;: [[Adobe_Lightroom|Adobe Lightroom]] - Become a more organized photographer. Weekly class (mostly held off site).&lt;br /&gt;
:&#039;&#039;&#039;Thursday&#039;&#039;&#039;: [[Professional VFX Compositing With Adobe After Effects]] - Taught by [[User:SFSlim|Aaron Muszalski]]. 7:30PM - 10PM, most Thursdays in May &amp;amp; June &amp;amp; ? (click through dammit)&lt;br /&gt;
:&#039;&#039;&#039;2009-09-17&#039;&#039;&#039;: [[Five Minutes of Fame]] 3D Edition&lt;br /&gt;
:&#039;&#039;&#039;2009-09-17&#039;&#039;&#039;: [[Wireless Mesh Network Meetup | Mesh wireless meetup]]&lt;br /&gt;
:&#039;&#039;&#039;2009-08-20&#039;&#039;&#039;: [[Five Minutes of Fame]] One Dee Edition&lt;br /&gt;
:&#039;&#039;&#039;2009-07-16&#039;&#039;&#039;: [[Five Minutes of Fame]] Zero Dee&lt;br /&gt;
:&#039;&#039;&#039;2009-07-02 - 2009-07-05&#039;&#039;&#039;: [http://toorcamp.org Toorcamp]&lt;br /&gt;
:&#039;&#039;&#039;2009-07-01&#039;&#039;&#039;: Noisedroid meeting to discuss location logging on Android platform (and other stuff too, I&#039;m sure)&lt;br /&gt;
:&#039;&#039;&#039;2009-06-30&#039;&#039;&#039;: [[Powerbocking Class|Powerbocking class]]&lt;br /&gt;
:&#039;&#039;&#039;2009-06-30&#039;&#039;&#039;: &amp;quot;Suing Telemarketers for Fun and Profit&amp;quot; (Toorcamp talk preview)&lt;br /&gt;
:&#039;&#039;&#039;2009-06-28&#039;&#039;&#039;: &amp;quot;Meditation for Hackers&amp;quot; (Toorcamp workshop preview)&lt;br /&gt;
:&#039;&#039;&#039;2009-06-18&#039;&#039;&#039;: [[Five Minutes of Fame]]&lt;br /&gt;
:&#039;&#039;&#039;2009-06-15&#039;&#039;&#039;: [[Eagle Workshop]]  Session two of the Eagle CAD workshop.&lt;br /&gt;
:&#039;&#039;&#039;2009-06-13&#039;&#039;&#039;: [[RoboGames 2009]] Noisebridge had a booth staffed by volunteers -- great fun!&lt;br /&gt;
:&#039;&#039;&#039;2009-05-21&#039;&#039;&#039;: [[Five Minutes of Fame]]&lt;br /&gt;
:&#039;&#039;&#039;2009-04-27&#039;&#039;&#039;: [[EagleCAD workshop]] -- learn to use this CAD tool for printed circuit board design&lt;br /&gt;
:&#039;&#039;&#039;2009-04-16&#039;&#039;&#039;: [[Five Minutes of Fame]] April showers &amp;amp; flowers edition&lt;br /&gt;
:&#039;&#039;&#039;2009-04-11&#039;&#039;&#039;: [[RFID Hacking]] weekend workshop  (this event moved from the original March date)&lt;br /&gt;
:&#039;&#039;&#039;2009-04-05&#039;&#039;&#039;: [[First aid and CPR class]] Learning how to not only not die, but also reduce scarring!&lt;br /&gt;
:&#039;&#039;&#039;2009-04-03&#039;&#039;&#039;: [[Sudo pop]] 2PM and on. Making the first batch of a Noisebridge label yerba mate-niated rootbrew, gratis and DIY&lt;br /&gt;
:&#039;&#039;&#039;2009-03-26&#039;&#039;&#039;: [[OpenEEG | OpenEEG Hacking]] first meet up for this new group: 8 pm&lt;br /&gt;
:&#039;&#039;&#039;2009-03-19&#039;&#039;&#039;: [[Five Minutes of Fame]]&lt;br /&gt;
:&#039;&#039;&#039;2009-03-12&#039;&#039;&#039;: [[OpenBTS and GSM]] talk by David Burgess&lt;br /&gt;
:&#039;&#039;&#039;2009-02-14&#039;&#039;&#039;: [[Open Heart Workshop]] Valentine&#039;s Day blinkyheart soldering party! &lt;br /&gt;
:&#039;&#039;&#039;2009-02-13&#039;&#039;&#039;: [[Time-t_Party|&amp;lt;tt&amp;gt;time_t&amp;lt;/tt&amp;gt; Party]] to celebrate 1,234,567,890 seconds since the Unix epoch.&lt;br /&gt;
:&#039;&#039;&#039;2009-02-09&#039;&#039;&#039;: [[Spanish learning at 8:30]]&lt;br /&gt;
:&#039;&#039;&#039;2009-02-05&#039;&#039;&#039;: [[PGP Key Workshop]]&lt;br /&gt;
:&#039;&#039;&#039;2009-01-31&#039;&#039;&#039;: [[Locksport and Lockpicking]]&lt;br /&gt;
:&#039;&#039;&#039;2008-12-27&#039;&#039;&#039;: [[25C3]] Chaos Computer Congress in Berlin&lt;br /&gt;
:&#039;&#039;&#039;2008-12-20 &amp;amp; 21&#039;&#039;&#039;: [[Creme Brulee]] Workshop on creating a French dessert, with bonus propane torch.&lt;br /&gt;
:&#039;&#039;&#039;2008-12-17 20:00&#039;&#039;&#039;: [[Machine Learning]] Birds-of-a-feather&lt;br /&gt;
:&#039;&#039;&#039;2008-11-24&#039;&#039;&#039;: [[Circuit Hacking Monday]] circuit design workshop&lt;br /&gt;
:&#039;&#039;&#039;2008-11-21, 7pm&#039;&#039;&#039;: [[Milk and Cookies]] -- [[User:Dmolnar|David Molnar]] hosts Milk and Cookies at 83C. Bring a short 5-7 minute thing to read to others. Bring a potluck cookie/snack/drink if you like. David will bring milk and cookies.&lt;br /&gt;
:&#039;&#039;&#039;2008-11-17, 7:30pm&#039;&#039;&#039;: [[Basic Bicycle Maintain]] - [[User:rubin110|Rubin]] and [[User:rigel|rigel]] hate it when we see a bike that isn&#039;t maintained. Screechy chains and clacking derailleur can go to hell. Basic bike tune up, sharing the smarts on simple things you can do at home to make your ride suck a whole lot less.&lt;br /&gt;
:&#039;&#039;&#039;2008-11-16, 5:00pm&#039;&#039;&#039;: [[RepRap Soldering Party]] - help assemble RepRap!  RSVPs required on wiki! [[User:Adi|adi]]&lt;br /&gt;
:&#039;&#039;&#039;2008-11-16, 3:00pm&#039;&#039;&#039;: [[Oscilloscopes]] - Learn how to use this versatile tool to test electronic circuits.  Maximum 6 slots, please sign up ahead of time! [[User:dstaff|dstaff]]&lt;br /&gt;
:&#039;&#039;&#039;2008-10-31&#039;&#039;&#039;: [[Halloween Open House]] - NoiseBridge&#039;s own [[PPPC]] threw an awesome open house/halloween gala. Post pictures if you got &#039;em!&lt;br /&gt;
:&#039;&#039;&#039;2008-10-25&#039;&#039;&#039;: [[Soldering Workshop]] and Pumpkin Hackin&#039; - Learn to solder for total newbies (or learn to solder better!), including surface mount. Additionally, carve your halloween pumpkins and enjoy some experimental pumpkin pie and/or soup.&lt;br /&gt;
:&#039;&#039;&#039;2008-10-07&#039;&#039;&#039;: (tuesday before meeting) - Etch a circuit board. I&#039;ll be trying a photo resist etching and a basic printed mask etching. This is step 1/3 for a project called &amp;quot;annoying USB thingie&amp;quot; which will execute pre-defined keystrokes by sneaking a tiny USB dongle onto a victim^h^h^h^h^h buddy&#039;s computer.&lt;br /&gt;
:&#039;&#039;&#039;2008-09-13&#039;&#039;&#039;: [[Processing Workshop]] — Learn this very easy-to-use programming language! - [[Processing Workshop Report]]&lt;br /&gt;
:&#039;&#039;&#039;2008-02-16&#039;&#039;&#039;: [[Brain Machine Workshop|Brain Machine Making Workshop]]: Our first hardware sprint!&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/HMM_R_Example&amp;diff=12250</id>
		<title>Machine Learning/HMM R Example</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/HMM_R_Example&amp;diff=12250"/>
		<updated>2010-08-05T04:22:40Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;Examples of using HMM R packages, based on the model in &amp;quot;[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.7020&amp;amp;rep=rep1&amp;amp;type=pdf A Bayes Net Toolkit for Student Model…&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Examples of using HMM R packages, based on the model in &amp;quot;[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.7020&amp;amp;rep=rep1&amp;amp;type=pdf A Bayes Net Toolkit for Student Modeling in Intelligent Tutoring Systems]&amp;quot; by Chang et al.  We&#039;re trying to estimate how well each student knows a certain area of knowledge (which we&#039;re calling a skill).  We observe each student&#039;s performance on answering some number of questions that use this skill, and mark whether they got them correct or incorrect.&lt;br /&gt;
&lt;br /&gt;
We assume that at each time point, a student is in one of two states: either they &amp;quot;know&amp;quot; the skill, or they &amp;quot;do not know&amp;quot; the skill.  If they know the skill, they are more likely to produce a correct answer; if not, they are less likely to; but in each case the outcome is stochastic (a student has some probability of guessing the correct answer even if they don&#039;t know the skill, and of slipping/getting it wrong even if they do know the skill).  Between each pair of time points, there is a transition probability from know -&amp;gt; don&#039;t know (forgetting, which Chang et al. constrain to 0) and from don&#039;t know -&amp;gt; know (learning).  Finally, there is a probability that the student enters already knowing the skill.  So we have five parameters: two transition probabilities (learn and forget), two outcome probabilities conditioned on state (guess and slip), and an initial-state probability (already know).&lt;br /&gt;
&lt;br /&gt;
The data (student_outcomes.csv) is for a single skill, measuring various students&#039; performance on that skill: a series of correct/incorrect responses, at various times.  We&#039;re ignoring the time data for the moment (other than for ordering purposes), and trying to fit the HMM model.  Once we have it, we can then figure out, for each student, an estimated likelihood of being in the &amp;quot;know&amp;quot; state at their last observed output.&lt;br /&gt;
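The five-parameter model above can also be written out directly as a forward recursion, independent of any particular package. The sketch below is in Python (rather than the R used elsewhere on this page) purely for illustration, and every parameter value in it is a made-up assumption, not a fitted estimate:

```python
# Forward update for the two-state knowledge-tracing HMM described above.
# All five parameter values are illustrative assumptions only.

P_INIT = 0.3    # P(student already knows the skill)
P_LEARN = 0.2   # P(don't know -> know) between opportunities
P_FORGET = 0.0  # P(know -> don't know); Chang et al. constrain this to 0
P_GUESS = 0.25  # P(correct | don't know)
P_SLIP = 0.1    # P(incorrect | know)

def update_p_know(p_know, correct):
    """One forward step: condition on the observation, then apply learning."""
    if correct:
        likelihood_know = p_know * (1 - P_SLIP)
        likelihood_dont = (1 - p_know) * P_GUESS
    else:
        likelihood_know = p_know * P_SLIP
        likelihood_dont = (1 - p_know) * (1 - P_GUESS)
    posterior = likelihood_know / (likelihood_know + likelihood_dont)
    # transition: some "don't know" students learn; (almost) no one forgets
    return posterior * (1 - P_FORGET) + (1 - posterior) * P_LEARN

def p_know_after(outcomes):
    """Estimated P(know) after a student's ordered sequence of 0/1 outcomes."""
    p = P_INIT
    for correct in outcomes:
        p = update_p_know(p, correct)
    return p

print(round(p_know_after([0, 1, 1, 1]), 3))
```

Running `p_know_after` over each student's ordered correct/incorrect responses yields the same kind of final "chance know" estimate that the package-based examples below print.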
&lt;br /&gt;
===hmm.discnp===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
require(&amp;quot;hmm.discnp&amp;quot;)&lt;br /&gt;
student_outcomes = read.csv(&amp;quot;student_outcomes.csv&amp;quot;, header=TRUE)&lt;br /&gt;
&lt;br /&gt;
# convert created_at from a string&lt;br /&gt;
student_outcomes$created_at = as.POSIXct(as.character(student_outcomes$created_at))&lt;br /&gt;
&lt;br /&gt;
# remove users with few observations on this skill&lt;br /&gt;
by_user = split(student_outcomes, student_outcomes$student_id)&lt;br /&gt;
obs_by_user = sapply(by_user, nrow)&lt;br /&gt;
valid_users = names(obs_by_user[obs_by_user &amp;gt; 10])&lt;br /&gt;
student_outcomes = student_outcomes[student_outcomes$student_id %in% valid_users,]&lt;br /&gt;
&lt;br /&gt;
by_good_user = split(student_outcomes, student_outcomes$student_id)&lt;br /&gt;
correct_by_user = lapply(by_good_user, function(df) df$correct)&lt;br /&gt;
skill = &amp;quot;example_skill&amp;quot;  # placeholder label used in the output below&lt;br /&gt;
&lt;br /&gt;
# attempt to estimate model parameters&lt;br /&gt;
my_hmm = hmm(correct_by_user, yval=c(0,1),&lt;br /&gt;
    par0=list(tpm=rbind(c(0.8,0.2),c(0.01,0.99)),&lt;br /&gt;
              Rho=rbind(c(0.75,0.25),c(0.25,0.75))),&lt;br /&gt;
    stationary=FALSE)&lt;br /&gt;
if (!my_hmm$converged) {&lt;br /&gt;
  print(sprintf(&amp;quot;Error!  HMM did not converge for skill %s!&amp;quot;, skill))&lt;br /&gt;
} else {&lt;br /&gt;
  for (user_id in valid_users) {&lt;br /&gt;
    student_est = sp(correct_by_user[[user_id]], object = my_hmm, means=TRUE)&lt;br /&gt;
    print(sprintf(&amp;quot;%s/%s: %f chance know, %f chance correct&amp;quot;, skill, user_id, student_est$probs[2,ncol(student_est$probs)], student_est$means[length(student_est$means)]))&lt;br /&gt;
    # print(correct_by_user[[user_id]])&lt;br /&gt;
  }&lt;br /&gt;
}&lt;br /&gt;
# transition probability matrix&lt;br /&gt;
my_hmm$tpm&lt;br /&gt;
# output probabilities&lt;br /&gt;
my_hmm$Rho&lt;br /&gt;
# initial probabilities (don&#039;t know/know)&lt;br /&gt;
my_hmm$ispd&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===msm===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
student_outcomes = read.csv(&amp;quot;student_outcomes.csv&amp;quot;, header=TRUE)&lt;br /&gt;
&lt;br /&gt;
# convert created_at from a string&lt;br /&gt;
student_outcomes$created_at = as.POSIXct(as.character(student_outcomes$created_at))&lt;br /&gt;
&lt;br /&gt;
# remove users with few observations on this skill&lt;br /&gt;
min_observations = 10&lt;br /&gt;
by_user = split(student_outcomes, student_outcomes$student_id)&lt;br /&gt;
obs_by_user = sapply(by_user, nrow)&lt;br /&gt;
valid_users = names(obs_by_user[obs_by_user &amp;gt;= min_observations])&lt;br /&gt;
student_outcomes = student_outcomes[student_outcomes$student_id %in% valid_users,]&lt;br /&gt;
&lt;br /&gt;
require(&amp;quot;msm&amp;quot;)&lt;br /&gt;
# convert time to a simple per-user sequence index&lt;br /&gt;
# (re-split after filtering so the index lines up with the filtered rows)&lt;br /&gt;
by_user = split(student_outcomes, student_outcomes$student_id)&lt;br /&gt;
student_outcomes$created_index = c(sapply(by_user, function(df) {1:nrow(df)}), recursive=TRUE)&lt;br /&gt;
my_msm = msm(correct ~ created_index, subject = student_id, data = student_outcomes,&lt;br /&gt;
                qmatrix = rbind(c(NA,0.25),c(0.25,NA)),&lt;br /&gt;
                hmodel = list(hmmBinom(1,0.3), hmmBinom(1,0.7)),&lt;br /&gt;
                obstype = 2,&lt;br /&gt;
                initprobs = c(0.5,0.5),&lt;br /&gt;
                est.initprobs = TRUE,&lt;br /&gt;
                method=&amp;quot;BFGS&amp;quot;&lt;br /&gt;
               )&lt;br /&gt;
# display final probability for each user&lt;br /&gt;
# (estimate_knowledge(), correct_by_user, and skill are user-supplied&lt;br /&gt;
# helpers/variables here, not functions from the msm package)&lt;br /&gt;
for (user_id in valid_users) {&lt;br /&gt;
    student_est = estimate_knowledge(correct_by_user[[user_id]], my_msm)&lt;br /&gt;
    print(sprintf(&amp;quot;%s/%s: %f chance know, %f chance correct&amp;quot;, skill, user_id, student_est[[&amp;quot;p_know&amp;quot;]], student_est[[&amp;quot;p_correct&amp;quot;]]))&lt;br /&gt;
    print(correct_by_user[[user_id]])&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/HMM&amp;diff=12249</id>
		<title>Machine Learning/HMM</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/HMM&amp;diff=12249"/>
		<updated>2010-08-05T04:15:19Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* R */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Hidden Markov Models ==&lt;br /&gt;
&lt;br /&gt;
=== Papers/Tutorials ===&lt;br /&gt;
*[http://www.ece.ucsb.edu/Faculty/Rabiner/ece259/Reprints/tutorial%20on%20hmm%20and%20applications.pdf HMMs and Speech]&lt;br /&gt;
&lt;br /&gt;
=== Implementations ===&lt;br /&gt;
*[http://ghmm.sourceforge.net/ GHMM] (C)&lt;br /&gt;
*[http://hmmer.janelia.org HMMER] (compiled C apps for protein (possibly speech) analysis)&lt;br /&gt;
*[http://www.logilab.org/912/ logilab-hmm] (Python)&lt;br /&gt;
*[http://www.cs.ubc.ca/~murphyk/Software/HMM/hmm.html HMM Toolbox] (MATLAB)&lt;br /&gt;
*[http://code.google.com/p/jahmm/ jahmm] (Java)&lt;br /&gt;
** I found [http://www.mblondel.org/journal/2009/05/19/java-jruby-or-jython-for-scientific-computing-a-test-case-with-hidden-markov-models/ Mathieu Blondel&#039;s writeup] really helpful -- jahmm is a good package&lt;br /&gt;
&lt;br /&gt;
==== R ====&lt;br /&gt;
* [http://cran.r-project.org/web/packages/HMM/index.html HMM]: very simple hidden Markov models&lt;br /&gt;
* [http://cran.r-project.org/web/packages/hmm.discnp/index.html hmm.discnp]: allows observations of multiple runs&lt;br /&gt;
* [http://cran.r-project.org/web/packages/msm/index.html msm]: continuous time, with covariates, multiple runs&lt;br /&gt;
* Example&lt;br /&gt;
** [[Machine Learning/HMM R Example | example]]&lt;br /&gt;
** [[Machine_Learning/student_data.csv | student_data.csv]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/HMM&amp;diff=12248</id>
		<title>Machine Learning/HMM</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/HMM&amp;diff=12248"/>
		<updated>2010-08-05T04:15:08Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* R */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Hidden Markov Models ==&lt;br /&gt;
&lt;br /&gt;
=== Papers/Tutorials ===&lt;br /&gt;
*[http://www.ece.ucsb.edu/Faculty/Rabiner/ece259/Reprints/tutorial%20on%20hmm%20and%20applications.pdf HMMs and Speech]&lt;br /&gt;
&lt;br /&gt;
=== Implementations ===&lt;br /&gt;
*[http://ghmm.sourceforge.net/ GHMM] (C)&lt;br /&gt;
*[http://hmmer.janelia.org HMMER] (compiled C apps for protein (possibly speech) analysis)&lt;br /&gt;
*[http://www.logilab.org/912/ logilab-hmm] (Python)&lt;br /&gt;
*[http://www.cs.ubc.ca/~murphyk/Software/HMM/hmm.html HMM Toolbox] (MATLAB)&lt;br /&gt;
*[http://code.google.com/p/jahmm/ jahmm] (Java)&lt;br /&gt;
** I found [http://www.mblondel.org/journal/2009/05/19/java-jruby-or-jython-for-scientific-computing-a-test-case-with-hidden-markov-models/ Mathieu Blondel&#039;s writeup] really helpful -- jahmm is a good package&lt;br /&gt;
&lt;br /&gt;
==== R ====&lt;br /&gt;
* [http://cran.r-project.org/web/packages/HMM/index.html HMM]: very simple hidden Markov models&lt;br /&gt;
* [http://cran.r-project.org/web/packages/hmm.discnp/index.html hmm.discnp]: allows observations of multiple runs&lt;br /&gt;
* [http://cran.r-project.org/web/packages/msm/index.html msm]: continuous time, with covariates, multiple runs&lt;br /&gt;
* Example&lt;br /&gt;
** [[Machine Learning/HMM Example | example]]&lt;br /&gt;
** [[Machine_Learning/student_data.csv | student_data.csv]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/student_data.csv&amp;diff=12247</id>
		<title>Machine Learning/student data.csv</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/student_data.csv&amp;diff=12247"/>
		<updated>2010-08-05T04:13:53Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;&amp;lt;pre&amp;gt; &amp;quot;student_id&amp;quot;,&amp;quot;created_at&amp;quot;,&amp;quot;correct&amp;quot; 1,&amp;quot;2009-02-18 09:33:00&amp;quot;,1 2,&amp;quot;2009-01-14 14:59:35&amp;quot;,0 2,&amp;quot;2009-01-14 15:02:58&amp;quot;,1 2,&amp;quot;2009-01-14 15:11:00&amp;quot;,1 2,&amp;quot;2009-01-19 13:06:48&amp;quot;,1 2,&amp;quot;200…&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;quot;student_id&amp;quot;,&amp;quot;created_at&amp;quot;,&amp;quot;correct&amp;quot;&lt;br /&gt;
1,&amp;quot;2009-02-18 09:33:00&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-14 14:59:35&amp;quot;,0&lt;br /&gt;
2,&amp;quot;2009-01-14 15:02:58&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-14 15:11:00&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-19 13:06:48&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-19 13:16:06&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-20 08:59:43&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-20 09:04:25&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-20 09:29:37&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 07:37:51&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 07:54:05&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 08:01:53&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:38:54&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:39:41&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:41:23&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:44:41&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:46:17&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:50:12&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:54:48&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:55:13&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 09:55:29&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-01-26 10:01:36&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-13 16:30:15&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-13 16:39:24&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-13 16:44:32&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-13 16:53:59&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-13 16:55:56&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-13 16:57:00&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-13 16:58:41&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-13 17:02:08&amp;quot;,0&lt;br /&gt;
2,&amp;quot;2009-02-17 17:00:56&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-17 17:11:33&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-17 17:13:04&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-17 17:35:11&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-17 18:15:29&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-17 18:46:15&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-19 18:18:01&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-19 18:21:00&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-19 18:34:07&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-02-19 19:01:52&amp;quot;,0&lt;br /&gt;
2,&amp;quot;2009-02-19 19:08:41&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-04-21 15:32:46&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-04-21 16:34:54&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-04-21 16:37:36&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-04-21 16:58:31&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-04-21 17:10:40&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-05-14 09:09:48&amp;quot;,1&lt;br /&gt;
2,&amp;quot;2009-05-14 09:10:24&amp;quot;,1&lt;br /&gt;
3,&amp;quot;2010-01-13 12:23:33&amp;quot;,1&lt;br /&gt;
3,&amp;quot;2010-01-13 12:51:17&amp;quot;,1&lt;br /&gt;
3,&amp;quot;2010-01-13 13:32:44&amp;quot;,0&lt;br /&gt;
4,&amp;quot;2010-02-10 14:17:33&amp;quot;,1&lt;br /&gt;
5,&amp;quot;2010-06-01 02:22:01&amp;quot;,0&lt;br /&gt;
6,&amp;quot;2009-06-19 15:02:33&amp;quot;,1&lt;br /&gt;
7,&amp;quot;2009-02-18 15:47:14&amp;quot;,0&lt;br /&gt;
8,&amp;quot;2010-01-04 21:43:41&amp;quot;,1&lt;br /&gt;
8,&amp;quot;2010-01-04 21:44:31&amp;quot;,1&lt;br /&gt;
8,&amp;quot;2010-01-04 21:44:56&amp;quot;,1&lt;br /&gt;
8,&amp;quot;2010-01-04 21:46:46&amp;quot;,1&lt;br /&gt;
8,&amp;quot;2010-01-04 22:00:17&amp;quot;,1&lt;br /&gt;
8,&amp;quot;2010-01-04 22:05:43&amp;quot;,1&lt;br /&gt;
8,&amp;quot;2010-01-04 22:08:21&amp;quot;,1&lt;br /&gt;
8,&amp;quot;2010-01-17 10:43:26&amp;quot;,1&lt;br /&gt;
8,&amp;quot;2010-05-22 07:44:49&amp;quot;,1&lt;br /&gt;
9,&amp;quot;2009-11-01 07:02:58&amp;quot;,1&lt;br /&gt;
9,&amp;quot;2009-11-01 07:05:48&amp;quot;,1&lt;br /&gt;
10,&amp;quot;2010-03-01 00:23:30&amp;quot;,1&lt;br /&gt;
11,&amp;quot;2010-02-24 02:25:15&amp;quot;,0&lt;br /&gt;
11,&amp;quot;2010-02-24 02:26:35&amp;quot;,0&lt;br /&gt;
11,&amp;quot;2010-02-24 02:40:38&amp;quot;,0&lt;br /&gt;
11,&amp;quot;2010-02-24 02:44:12&amp;quot;,0&lt;br /&gt;
11,&amp;quot;2010-02-24 02:49:57&amp;quot;,0&lt;br /&gt;
12,&amp;quot;2009-03-25 04:04:10&amp;quot;,0&lt;br /&gt;
12,&amp;quot;2009-03-25 04:14:06&amp;quot;,1&lt;br /&gt;
12,&amp;quot;2009-03-25 15:03:34&amp;quot;,0&lt;br /&gt;
13,&amp;quot;2009-05-25 06:22:37&amp;quot;,1&lt;br /&gt;
13,&amp;quot;2009-05-25 06:25:58&amp;quot;,1&lt;br /&gt;
13,&amp;quot;2009-05-25 06:39:16&amp;quot;,1&lt;br /&gt;
13,&amp;quot;2009-05-25 06:46:43&amp;quot;,1&lt;br /&gt;
14,&amp;quot;2009-11-04 13:05:13&amp;quot;,0&lt;br /&gt;
15,&amp;quot;2009-11-11 01:40:30&amp;quot;,0&lt;br /&gt;
15,&amp;quot;2009-11-11 01:56:14&amp;quot;,0&lt;br /&gt;
15,&amp;quot;2009-11-11 02:20:38&amp;quot;,1&lt;br /&gt;
16,&amp;quot;2009-06-12 11:07:37&amp;quot;,1&lt;br /&gt;
17,&amp;quot;2010-06-03 19:04:15&amp;quot;,1&lt;br /&gt;
17,&amp;quot;2010-06-03 19:12:41&amp;quot;,1&lt;br /&gt;
18,&amp;quot;2009-12-26 05:11:58&amp;quot;,1&lt;br /&gt;
19,&amp;quot;2009-03-05 07:21:33&amp;quot;,1&lt;br /&gt;
20,&amp;quot;2009-10-02 06:53:12&amp;quot;,1&lt;br /&gt;
20,&amp;quot;2009-10-15 18:19:06&amp;quot;,0&lt;br /&gt;
20,&amp;quot;2009-10-15 18:34:00&amp;quot;,1&lt;br /&gt;
20,&amp;quot;2009-10-15 18:34:55&amp;quot;,0&lt;br /&gt;
21,&amp;quot;2010-06-15 22:12:13&amp;quot;,1&lt;br /&gt;
21,&amp;quot;2010-06-15 22:14:46&amp;quot;,1&lt;br /&gt;
21,&amp;quot;2010-06-15 22:18:32&amp;quot;,1&lt;br /&gt;
21,&amp;quot;2010-06-15 22:27:23&amp;quot;,1&lt;br /&gt;
22,&amp;quot;2009-11-22 07:17:40&amp;quot;,0&lt;br /&gt;
23,&amp;quot;2009-11-18 05:28:41&amp;quot;,1&lt;br /&gt;
23,&amp;quot;2009-11-18 05:29:09&amp;quot;,1&lt;br /&gt;
23,&amp;quot;2009-11-18 07:29:18&amp;quot;,1&lt;br /&gt;
23,&amp;quot;2009-11-18 07:30:26&amp;quot;,1&lt;br /&gt;
23,&amp;quot;2009-11-18 07:47:06&amp;quot;,1&lt;br /&gt;
24,&amp;quot;2009-07-01 17:14:47&amp;quot;,1&lt;br /&gt;
24,&amp;quot;2010-05-06 06:06:05&amp;quot;,1&lt;br /&gt;
25,&amp;quot;2010-05-19 06:47:49&amp;quot;,0&lt;br /&gt;
26,&amp;quot;2010-03-29 17:47:25&amp;quot;,1&lt;br /&gt;
26,&amp;quot;2010-03-29 18:07:14&amp;quot;,1&lt;br /&gt;
26,&amp;quot;2010-03-29 18:11:51&amp;quot;,0&lt;br /&gt;
27,&amp;quot;2009-11-04 19:46:58&amp;quot;,1&lt;br /&gt;
28,&amp;quot;2009-08-31 20:32:39&amp;quot;,0&lt;br /&gt;
29,&amp;quot;2009-10-26 15:40:34&amp;quot;,1&lt;br /&gt;
29,&amp;quot;2009-10-26 15:42:36&amp;quot;,0&lt;br /&gt;
30,&amp;quot;2009-06-11 13:25:17&amp;quot;,1&lt;br /&gt;
30,&amp;quot;2009-06-11 13:27:03&amp;quot;,0&lt;br /&gt;
31,&amp;quot;2009-11-13 07:06:26&amp;quot;,1&lt;br /&gt;
31,&amp;quot;2009-11-13 07:38:12&amp;quot;,0&lt;br /&gt;
32,&amp;quot;2010-04-26 00:28:56&amp;quot;,0&lt;br /&gt;
33,&amp;quot;2008-12-23 02:20:39&amp;quot;,1&lt;br /&gt;
33,&amp;quot;2008-12-23 02:22:15&amp;quot;,1&lt;br /&gt;
34,&amp;quot;2010-05-05 16:59:38&amp;quot;,1&lt;br /&gt;
34,&amp;quot;2010-05-05 17:27:47&amp;quot;,0&lt;br /&gt;
34,&amp;quot;2010-05-05 18:42:17&amp;quot;,1&lt;br /&gt;
35,&amp;quot;2010-05-23 05:31:10&amp;quot;,0&lt;br /&gt;
36,&amp;quot;2009-05-28 21:07:11&amp;quot;,0&lt;br /&gt;
36,&amp;quot;2009-05-28 21:08:01&amp;quot;,1&lt;br /&gt;
37,&amp;quot;2010-06-01 03:26:48&amp;quot;,1&lt;br /&gt;
38,&amp;quot;2009-04-10 17:21:34&amp;quot;,0&lt;br /&gt;
39,&amp;quot;2009-07-27 20:18:19&amp;quot;,0&lt;br /&gt;
40,&amp;quot;2009-04-29 21:21:51&amp;quot;,0&lt;br /&gt;
40,&amp;quot;2009-04-29 21:40:01&amp;quot;,1&lt;br /&gt;
41,&amp;quot;2010-03-09 14:17:26&amp;quot;,1&lt;br /&gt;
41,&amp;quot;2010-03-09 14:32:27&amp;quot;,1&lt;br /&gt;
41,&amp;quot;2010-03-18 11:36:08&amp;quot;,1&lt;br /&gt;
42,&amp;quot;2009-02-17 14:30:28&amp;quot;,1&lt;br /&gt;
43,&amp;quot;2009-03-05 12:17:56&amp;quot;,1&lt;br /&gt;
44,&amp;quot;2010-02-10 09:52:19&amp;quot;,0&lt;br /&gt;
44,&amp;quot;2010-02-10 09:59:05&amp;quot;,0&lt;br /&gt;
44,&amp;quot;2010-02-10 10:02:13&amp;quot;,1&lt;br /&gt;
44,&amp;quot;2010-02-10 10:14:38&amp;quot;,1&lt;br /&gt;
44,&amp;quot;2010-02-10 11:09:27&amp;quot;,1&lt;br /&gt;
45,&amp;quot;2009-08-26 03:50:29&amp;quot;,1&lt;br /&gt;
45,&amp;quot;2009-08-26 04:12:19&amp;quot;,0&lt;br /&gt;
46,&amp;quot;2010-01-27 16:37:18&amp;quot;,1&lt;br /&gt;
47,&amp;quot;2010-05-08 08:27:17&amp;quot;,1&lt;br /&gt;
48,&amp;quot;2009-09-07 08:35:14&amp;quot;,1&lt;br /&gt;
48,&amp;quot;2009-09-07 08:45:07&amp;quot;,0&lt;br /&gt;
48,&amp;quot;2009-09-07 09:17:25&amp;quot;,1&lt;br /&gt;
48,&amp;quot;2009-09-07 09:20:27&amp;quot;,1&lt;br /&gt;
49,&amp;quot;2009-05-12 18:36:58&amp;quot;,1&lt;br /&gt;
49,&amp;quot;2009-05-12 18:39:00&amp;quot;,0&lt;br /&gt;
50,&amp;quot;2010-06-08 03:38:24&amp;quot;,0&lt;br /&gt;
51,&amp;quot;2009-09-28 22:50:35&amp;quot;,1&lt;br /&gt;
51,&amp;quot;2009-09-28 22:51:57&amp;quot;,1&lt;br /&gt;
51,&amp;quot;2009-09-28 22:57:56&amp;quot;,1&lt;br /&gt;
51,&amp;quot;2009-09-28 23:21:40&amp;quot;,1&lt;br /&gt;
51,&amp;quot;2009-09-28 23:22:44&amp;quot;,1&lt;br /&gt;
51,&amp;quot;2009-09-28 23:32:08&amp;quot;,1&lt;br /&gt;
52,&amp;quot;2010-01-18 07:00:47&amp;quot;,0&lt;br /&gt;
53,&amp;quot;2010-06-12 15:14:33&amp;quot;,1&lt;br /&gt;
53,&amp;quot;2010-06-12 15:53:35&amp;quot;,1&lt;br /&gt;
53,&amp;quot;2010-06-12 15:55:52&amp;quot;,1&lt;br /&gt;
53,&amp;quot;2010-06-12 16:00:52&amp;quot;,0&lt;br /&gt;
54,&amp;quot;2009-11-23 00:31:02&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-11-28 15:32:31&amp;quot;,0&lt;br /&gt;
54,&amp;quot;2009-11-28 15:35:56&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-05 21:45:05&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-05 21:46:01&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-05 21:56:08&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-05 22:14:37&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-13 22:49:03&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-13 22:50:25&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-15 03:02:51&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-16 03:52:21&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-16 04:06:22&amp;quot;,0&lt;br /&gt;
54,&amp;quot;2009-12-25 17:37:59&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-27 19:32:13&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-29 20:15:00&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-29 20:17:50&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-29 21:27:27&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-29 21:28:40&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-29 21:36:50&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-30 03:26:48&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-30 03:40:57&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-30 03:57:24&amp;quot;,0&lt;br /&gt;
54,&amp;quot;2009-12-30 04:18:55&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-30 04:20:51&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2009-12-31 21:02:58&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-01-05 02:42:43&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-06 21:03:24&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-06 21:40:14&amp;quot;,0&lt;br /&gt;
54,&amp;quot;2010-03-09 03:53:02&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-09 04:04:29&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-14 17:31:18&amp;quot;,0&lt;br /&gt;
54,&amp;quot;2010-03-14 17:54:39&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-14 18:13:13&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-20 20:44:19&amp;quot;,0&lt;br /&gt;
54,&amp;quot;2010-03-20 20:47:28&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-20 20:48:59&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-20 20:54:35&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-20 21:20:34&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-21 18:07:23&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-28 21:51:14&amp;quot;,0&lt;br /&gt;
54,&amp;quot;2010-03-28 22:35:13&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-03-28 22:43:13&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-04-11 16:40:58&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-04-17 17:10:31&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-04-23 02:19:12&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-04-25 22:08:57&amp;quot;,1&lt;br /&gt;
54,&amp;quot;2010-04-25 22:31:10&amp;quot;,1&lt;br /&gt;
55,&amp;quot;2009-12-02 22:12:03&amp;quot;,1&lt;br /&gt;
56,&amp;quot;2009-05-06 10:23:50&amp;quot;,1&lt;br /&gt;
57,&amp;quot;2009-10-10 10:49:31&amp;quot;,1&lt;br /&gt;
57,&amp;quot;2009-10-10 10:53:25&amp;quot;,0&lt;br /&gt;
57,&amp;quot;2009-10-18 11:24:42&amp;quot;,1&lt;br /&gt;
57,&amp;quot;2009-10-18 11:27:00&amp;quot;,1&lt;br /&gt;
57,&amp;quot;2009-10-18 11:32:51&amp;quot;,0&lt;br /&gt;
58,&amp;quot;2010-06-12 07:10:20&amp;quot;,1&lt;br /&gt;
59,&amp;quot;2010-02-06 03:58:10&amp;quot;,1&lt;br /&gt;
59,&amp;quot;2010-02-06 04:02:07&amp;quot;,0&lt;br /&gt;
59,&amp;quot;2010-02-06 04:45:47&amp;quot;,0&lt;br /&gt;
60,&amp;quot;2009-05-28 03:43:38&amp;quot;,1&lt;br /&gt;
60,&amp;quot;2009-05-28 04:01:18&amp;quot;,1&lt;br /&gt;
61,&amp;quot;2010-01-17 20:11:49&amp;quot;,1&lt;br /&gt;
61,&amp;quot;2010-01-17 20:58:01&amp;quot;,1&lt;br /&gt;
62,&amp;quot;2009-06-08 11:11:41&amp;quot;,1&lt;br /&gt;
63,&amp;quot;2009-02-27 18:17:51&amp;quot;,0&lt;br /&gt;
64,&amp;quot;2009-08-28 15:03:57&amp;quot;,1&lt;br /&gt;
65,&amp;quot;2009-07-27 15:42:16&amp;quot;,1&lt;br /&gt;
65,&amp;quot;2009-07-27 16:24:38&amp;quot;,1&lt;br /&gt;
66,&amp;quot;2009-12-01 22:20:47&amp;quot;,1&lt;br /&gt;
66,&amp;quot;2009-12-01 23:35:47&amp;quot;,1&lt;br /&gt;
67,&amp;quot;2009-10-28 17:29:48&amp;quot;,0&lt;br /&gt;
67,&amp;quot;2009-10-28 17:35:57&amp;quot;,0&lt;br /&gt;
68,&amp;quot;2009-09-14 15:57:20&amp;quot;,0&lt;br /&gt;
68,&amp;quot;2009-09-14 16:02:32&amp;quot;,1&lt;br /&gt;
68,&amp;quot;2009-09-14 16:09:46&amp;quot;,1&lt;br /&gt;
68,&amp;quot;2009-09-16 15:39:51&amp;quot;,1&lt;br /&gt;
68,&amp;quot;2009-09-18 16:17:04&amp;quot;,0&lt;br /&gt;
68,&amp;quot;2009-09-21 16:05:17&amp;quot;,0&lt;br /&gt;
68,&amp;quot;2009-09-21 22:27:37&amp;quot;,1&lt;br /&gt;
69,&amp;quot;2010-06-04 15:33:21&amp;quot;,1&lt;br /&gt;
70,&amp;quot;2009-08-16 06:42:41&amp;quot;,1&lt;br /&gt;
70,&amp;quot;2009-08-17 23:23:21&amp;quot;,0&lt;br /&gt;
70,&amp;quot;2009-08-26 00:23:21&amp;quot;,1&lt;br /&gt;
71,&amp;quot;2010-05-09 19:27:32&amp;quot;,0&lt;br /&gt;
72,&amp;quot;2009-10-29 06:51:42&amp;quot;,0&lt;br /&gt;
72,&amp;quot;2009-10-29 06:53:59&amp;quot;,0&lt;br /&gt;
73,&amp;quot;2009-09-15 04:53:27&amp;quot;,1&lt;br /&gt;
74,&amp;quot;2009-11-10 15:31:14&amp;quot;,1&lt;br /&gt;
74,&amp;quot;2009-11-10 16:07:48&amp;quot;,1&lt;br /&gt;
74,&amp;quot;2009-11-10 16:35:12&amp;quot;,1&lt;br /&gt;
75,&amp;quot;2009-07-20 14:09:35&amp;quot;,1&lt;br /&gt;
76,&amp;quot;2010-01-12 02:43:17&amp;quot;,0&lt;br /&gt;
76,&amp;quot;2010-01-12 03:34:01&amp;quot;,1&lt;br /&gt;
77,&amp;quot;2009-09-24 09:40:29&amp;quot;,1&lt;br /&gt;
78,&amp;quot;2009-03-07 06:10:43&amp;quot;,0&lt;br /&gt;
79,&amp;quot;2009-09-10 08:31:18&amp;quot;,0&lt;br /&gt;
79,&amp;quot;2009-09-10 09:01:12&amp;quot;,1&lt;br /&gt;
79,&amp;quot;2009-09-10 09:39:12&amp;quot;,0&lt;br /&gt;
80,&amp;quot;2009-05-19 23:38:13&amp;quot;,0&lt;br /&gt;
80,&amp;quot;2009-05-19 23:53:04&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-12 09:03:09&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-12 09:49:12&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-12 10:04:07&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-12 10:28:54&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-13 09:29:02&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-13 09:41:59&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-13 09:44:12&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-13 10:13:07&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-14 06:44:01&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-14 06:48:04&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-14 06:52:36&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-18 09:17:43&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-20 09:27:03&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-20 09:43:31&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-20 10:06:43&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-25 08:38:21&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-26 10:23:28&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-26 10:33:32&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-26 10:35:40&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-26 10:36:07&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-26 10:46:16&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-27 06:34:05&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-27 06:35:58&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-27 06:37:51&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-27 06:47:52&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-05-27 06:52:00&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-27 09:20:36&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-05-31 06:57:54&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-02 16:34:12&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-08 10:00:36&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-08 10:44:41&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-08 11:04:10&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-08 11:13:55&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-10 08:41:23&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-11 18:52:34&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-12 06:35:33&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-06-12 06:47:13&amp;quot;,0&lt;br /&gt;
81,&amp;quot;2010-06-12 06:48:28&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-14 06:04:35&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-14 06:23:48&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-15 09:10:38&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-15 09:17:04&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-16 06:33:53&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-16 06:37:01&amp;quot;,1&lt;br /&gt;
81,&amp;quot;2010-06-16 08:57:40&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-12 22:53:37&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-12 22:55:49&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-12 23:00:18&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-12 23:44:37&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-13 00:00:49&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-13 00:16:32&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-13 00:40:24&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-13 00:46:56&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-14 17:35:31&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-14 18:02:44&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-14 18:03:56&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-14 19:15:14&amp;quot;,0&lt;br /&gt;
82,&amp;quot;2010-06-14 20:13:52&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-14 20:24:37&amp;quot;,0&lt;br /&gt;
82,&amp;quot;2010-06-14 20:28:51&amp;quot;,0&lt;br /&gt;
82,&amp;quot;2010-06-14 20:35:43&amp;quot;,1&lt;br /&gt;
82,&amp;quot;2010-06-14 21:26:36&amp;quot;,0&lt;br /&gt;
82,&amp;quot;2010-06-14 21:30:10&amp;quot;,1&lt;br /&gt;
83,&amp;quot;2009-09-08 00:49:45&amp;quot;,1&lt;br /&gt;
84,&amp;quot;2009-12-12 20:37:05&amp;quot;,1&lt;br /&gt;
84,&amp;quot;2009-12-12 20:39:11&amp;quot;,1&lt;br /&gt;
85,&amp;quot;2009-12-01 10:35:23&amp;quot;,0&lt;br /&gt;
86,&amp;quot;2009-01-12 14:48:01&amp;quot;,1&lt;br /&gt;
86,&amp;quot;2009-01-12 14:49:03&amp;quot;,1&lt;br /&gt;
86,&amp;quot;2009-01-17 09:15:45&amp;quot;,1&lt;br /&gt;
86,&amp;quot;2009-01-17 09:25:49&amp;quot;,1&lt;br /&gt;
87,&amp;quot;2009-08-11 16:31:59&amp;quot;,1&lt;br /&gt;
88,&amp;quot;2009-11-12 22:54:18&amp;quot;,0&lt;br /&gt;
88,&amp;quot;2009-11-12 23:14:50&amp;quot;,1&lt;br /&gt;
88,&amp;quot;2009-11-12 23:19:19&amp;quot;,1&lt;br /&gt;
88,&amp;quot;2009-11-12 23:29:54&amp;quot;,1&lt;br /&gt;
88,&amp;quot;2009-11-12 23:30:18&amp;quot;,1&lt;br /&gt;
89,&amp;quot;2009-08-18 00:14:45&amp;quot;,1&lt;br /&gt;
90,&amp;quot;2010-05-23 19:20:51&amp;quot;,0&lt;br /&gt;
91,&amp;quot;2009-08-25 15:52:34&amp;quot;,0&lt;br /&gt;
92,&amp;quot;2009-03-05 20:45:22&amp;quot;,1&lt;br /&gt;
93,&amp;quot;2010-01-13 03:58:53&amp;quot;,0&lt;br /&gt;
94,&amp;quot;2010-01-23 13:07:19&amp;quot;,0&lt;br /&gt;
95,&amp;quot;2009-11-18 23:32:30&amp;quot;,1&lt;br /&gt;
96,&amp;quot;2010-05-22 11:09:38&amp;quot;,1&lt;br /&gt;
97,&amp;quot;2010-02-03 08:23:51&amp;quot;,1&lt;br /&gt;
97,&amp;quot;2010-02-03 12:29:38&amp;quot;,1&lt;br /&gt;
97,&amp;quot;2010-02-03 12:32:34&amp;quot;,1&lt;br /&gt;
97,&amp;quot;2010-02-03 13:11:21&amp;quot;,1&lt;br /&gt;
97,&amp;quot;2010-05-09 14:13:26&amp;quot;,1&lt;br /&gt;
97,&amp;quot;2010-05-09 17:32:11&amp;quot;,0&lt;br /&gt;
97,&amp;quot;2010-05-10 06:07:16&amp;quot;,0&lt;br /&gt;
97,&amp;quot;2010-05-12 07:25:05&amp;quot;,1&lt;br /&gt;
97,&amp;quot;2010-05-12 07:26:49&amp;quot;,1&lt;br /&gt;
98,&amp;quot;2010-04-24 06:11:15&amp;quot;,0&lt;br /&gt;
99,&amp;quot;2010-01-10 19:54:03&amp;quot;,1&lt;br /&gt;
100,&amp;quot;2010-06-06 14:06:28&amp;quot;,0&lt;br /&gt;
101,&amp;quot;2009-07-13 14:39:24&amp;quot;,0&lt;br /&gt;
102,&amp;quot;2009-06-04 19:30:10&amp;quot;,1&lt;br /&gt;
103,&amp;quot;2009-10-12 18:33:22&amp;quot;,0&lt;br /&gt;
103,&amp;quot;2009-10-12 18:55:35&amp;quot;,1&lt;br /&gt;
103,&amp;quot;2009-10-12 18:58:02&amp;quot;,1&lt;br /&gt;
104,&amp;quot;2009-09-07 18:35:21&amp;quot;,1&lt;br /&gt;
105,&amp;quot;2009-08-11 05:56:12&amp;quot;,1&lt;br /&gt;
105,&amp;quot;2009-08-17 05:47:36&amp;quot;,0&lt;br /&gt;
105,&amp;quot;2009-08-17 06:00:45&amp;quot;,0&lt;br /&gt;
106,&amp;quot;2010-04-11 20:33:34&amp;quot;,1&lt;br /&gt;
106,&amp;quot;2010-04-11 20:37:39&amp;quot;,1&lt;br /&gt;
107,&amp;quot;2009-08-31 14:47:41&amp;quot;,1&lt;br /&gt;
108,&amp;quot;2009-10-10 17:55:30&amp;quot;,1&lt;br /&gt;
108,&amp;quot;2009-10-10 17:57:24&amp;quot;,0&lt;br /&gt;
109,&amp;quot;2009-06-16 18:16:17&amp;quot;,1&lt;br /&gt;
110,&amp;quot;2010-01-15 17:04:34&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-06 16:54:29&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-06 17:03:41&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-11 14:30:02&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-16 19:22:29&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-16 19:32:42&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-16 20:05:54&amp;quot;,1&lt;br /&gt;
111,&amp;quot;2010-05-18 22:23:59&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-18 22:35:16&amp;quot;,1&lt;br /&gt;
111,&amp;quot;2010-05-19 20:01:17&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-19 20:04:09&amp;quot;,1&lt;br /&gt;
111,&amp;quot;2010-05-19 20:06:50&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-19 20:12:08&amp;quot;,0&lt;br /&gt;
111,&amp;quot;2010-05-25 01:56:57&amp;quot;,1&lt;br /&gt;
111,&amp;quot;2010-05-26 01:22:26&amp;quot;,1&lt;br /&gt;
111,&amp;quot;2010-05-26 18:53:26&amp;quot;,1&lt;br /&gt;
111,&amp;quot;2010-05-28 01:17:50&amp;quot;,1&lt;br /&gt;
111,&amp;quot;2010-05-28 01:21:29&amp;quot;,1&lt;br /&gt;
111,&amp;quot;2010-05-28 01:27:14&amp;quot;,1&lt;br /&gt;
112,&amp;quot;2010-01-20 20:34:18&amp;quot;,1&lt;br /&gt;
113,&amp;quot;2009-08-01 12:33:02&amp;quot;,1&lt;br /&gt;
113,&amp;quot;2009-08-14 07:18:06&amp;quot;,0&lt;br /&gt;
114,&amp;quot;2009-03-31 08:16:42&amp;quot;,1&lt;br /&gt;
115,&amp;quot;2009-09-19 09:09:48&amp;quot;,1&lt;br /&gt;
116,&amp;quot;2009-03-19 03:59:06&amp;quot;,0&lt;br /&gt;
117,&amp;quot;2009-12-24 07:05:44&amp;quot;,1&lt;br /&gt;
118,&amp;quot;2010-02-02 16:51:12&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-06-07 10:09:56&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-07 10:44:51&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-06-07 10:46:09&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-06-07 17:45:17&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-07 17:56:09&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-08 16:44:36&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-08 19:28:13&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-08 19:37:15&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-06-09 19:51:33&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-12 16:32:41&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-06-12 16:43:36&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-12 17:15:36&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-12 17:16:46&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-12 17:22:08&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-12 17:34:43&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-06-13 08:42:56&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-13 22:08:56&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-14 11:24:38&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-14 11:45:19&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-14 18:39:40&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-14 20:27:00&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-14 20:44:14&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-16 15:30:44&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-21 06:16:04&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-21 10:23:20&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-06-21 10:45:29&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-22 17:23:15&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-22 17:34:13&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-22 17:43:54&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-06-23 09:40:32&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-23 10:17:34&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-23 14:36:52&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-23 15:12:57&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-23 15:13:49&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-06-23 15:18:43&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-04 09:13:02&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-04 09:15:20&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-04 09:34:30&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-04 18:14:56&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-06 15:21:23&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-07-07 16:56:14&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-07 17:35:49&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-07 18:50:52&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-07 19:02:22&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-09 16:53:20&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-09 16:55:43&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-09 17:06:48&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-09 17:13:55&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-07-10 06:46:19&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-10 07:19:04&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-07-10 15:46:18&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-10 15:59:13&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-10 17:01:09&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-14 13:08:43&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-14 13:19:13&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-07-15 18:32:33&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-07-18 20:04:19&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-18 20:09:14&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-07-18 20:10:46&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-18 20:11:33&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-18 20:14:34&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-18 20:23:34&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-07-18 20:26:46&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-18 20:31:57&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-19 09:24:00&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-19 09:34:13&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-23 19:42:57&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-07-30 19:06:33&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-03 19:44:44&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-03 19:46:38&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-03 21:15:14&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-04 16:06:45&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-04 16:07:36&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-04 16:08:04&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-04 16:09:53&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-05 18:37:27&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-06 18:07:40&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-06 18:14:19&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-08-06 18:24:16&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-06 18:58:01&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-08-06 19:03:50&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-09 19:37:23&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-09 19:41:34&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-09 22:42:06&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-16 18:27:37&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-16 19:41:25&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-16 19:43:35&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-21 19:48:46&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-21 20:04:15&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-21 20:09:02&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-21 20:37:49&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-08-28 18:43:45&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-09-20 19:20:29&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-09-20 19:27:12&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-03 13:14:43&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-03 13:18:30&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-03 13:19:26&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-03 13:25:10&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-10-03 13:41:25&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-10-03 20:01:56&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-03 20:05:43&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-05 19:55:01&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-05 20:19:27&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-12 18:49:14&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-10-29 16:06:10&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-29 16:08:48&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-29 16:38:29&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-30 16:27:08&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-30 16:31:23&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-30 16:47:21&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-10-30 16:51:00&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-10-30 16:56:55&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-10-31 12:58:10&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-11-01 08:57:34&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-11-01 09:09:18&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-11-01 09:15:20&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-11-01 09:23:31&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-12-10 15:59:09&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2009-12-27 09:45:33&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-12-27 09:57:40&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-12-27 18:23:09&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2009-12-27 19:02:15&amp;quot;,0&lt;br /&gt;
119,&amp;quot;2010-02-06 09:01:09&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2010-02-06 09:17:36&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2010-02-06 09:19:22&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2010-02-06 09:46:05&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2010-02-06 09:47:23&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2010-02-06 10:08:18&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2010-02-06 10:11:53&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2010-02-06 10:42:57&amp;quot;,1&lt;br /&gt;
119,&amp;quot;2010-02-06 10:56:21&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-08 13:27:25&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-08 13:45:41&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-08 15:23:18&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-08 15:38:39&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-08 15:42:52&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-10 07:09:08&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-10 07:30:54&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-10 07:38:33&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-10 07:57:28&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-10 08:08:40&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-11 03:47:18&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-11 04:09:05&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-14 03:10:52&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-14 03:56:33&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-14 03:57:20&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-14 03:58:46&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-14 04:00:15&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-14 04:04:00&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-14 04:04:23&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-14 04:04:33&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 01:03:48&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 01:04:43&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 01:09:38&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-15 01:13:27&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 01:18:52&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-15 01:29:42&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 05:29:42&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 05:32:07&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 05:43:39&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-15 06:38:20&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 16:20:10&amp;quot;,0&lt;br /&gt;
120,&amp;quot;2009-11-15 16:21:43&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 16:22:32&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 16:34:08&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 16:39:41&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 20:02:01&amp;quot;,1&lt;br /&gt;
120,&amp;quot;2009-11-15 20:06:20&amp;quot;,0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Category:Events&amp;diff=11795</id>
		<title>Category:Events</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Category:Events&amp;diff=11795"/>
		<updated>2010-06-17T06:55:47Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;!-- Note that this page uses transclusion. Content between the &amp;quot;onlyinclude&amp;quot; tags below will be pushed to the main page --&amp;gt;&lt;br /&gt;
Official, Semi-Official, one-off and other events at the Noisebridge space.&lt;br /&gt;
&lt;br /&gt;
=Event Calendar=&lt;br /&gt;
Not all events make it onto this calendar. Many events are only announced on the Discussion or Announcements [[Mailinglist | mailing lists]], on [[IRC]], or in person at [[:Category:Meeting_Notes | Tuesday meetings]]. Best of all, Noisebridge is about people getting together at the space in San Francisco to do stuff... in person. Some events just happen.  Pay attention!&lt;br /&gt;
&lt;br /&gt;
Event posters are encouraged to crosspost to the Google Calendar. View the  [http://www.google.com/calendar/embed?src=vo3i3c0qtjnkjr2ojasd0ftt8s%40group.calendar.google.com&amp;amp;ctz=America/Los_Angeles Google Calendar], view the [http://www.google.com/calendar/feeds/vo3i3c0qtjnkjr2ojasd0ftt8s%40group.calendar.google.com/public/basic Google Calendar in XML], or the [http://www.google.com/calendar/ical/vo3i3c0qtjnkjr2ojasd0ftt8s%40group.calendar.google.com/public/basic.ics Google Calendar in ical] format.&lt;br /&gt;
&lt;br /&gt;
To post Google Calendar entries for your event, contact a Noisebridge member for access.&lt;br /&gt;
&lt;br /&gt;
(Wouldn&#039;t it be great if there were a gCal mediawiki plugin so crossposting wasn&#039;t needed? Do you know of a good one? Help us!) &amp;lt;- working on this, need to upgrade MediaWiki in order to use some plugins.&lt;br /&gt;
&amp;lt;!-- Items inside this &amp;quot;onlyinclude&amp;quot; tag will be pushed to the main page --&amp;gt;&amp;lt;onlyinclude&amp;gt;&lt;br /&gt;
=== Upcoming Events &amp;lt;small&amp;gt;[https://www.noisebridge.net/index.php?title=Category:Events&amp;amp;action=edit&amp;amp;section=2 edit]&amp;lt;/small&amp;gt; ===&lt;br /&gt;
* &#039;&#039;&#039;Thurs June 17, 20:00&#039;&#039;&#039; - [[Five Minutes of Fame]] - an evening of &amp;quot;five&amp;quot; minute talks&lt;br /&gt;
* &#039;&#039;&#039;Thurs June 24, 18:00-22:00&#039;&#039;&#039; - [[Fuzzy_Chef_Cooking_Class|Fuzzy Chef Cooking Class]]&lt;br /&gt;
&lt;br /&gt;
=== Recurring Events &amp;lt;small&amp;gt;[https://www.noisebridge.net/index.php?title=Category:Events&amp;amp;action=edit&amp;amp;section=3 edit]&amp;lt;/small&amp;gt; ===&lt;br /&gt;
&amp;lt;!-- Large turnout events should be written in &#039;&#039;&#039;bold&#039;&#039;&#039;. --&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;Monday&#039;&#039;&#039;&lt;br /&gt;
** Anytime [https://www.noisebridge.net/wiki/House_Keeping#Trash_and_Recycling Trash Night]  - Don&#039;t forget to take out the trash for Tuesday morning!&lt;br /&gt;
** 18:30 [[PyClass]] - Learn how to program using the Python programming language.&lt;br /&gt;
** &#039;&#039;&#039;19:00 [[Circuit Hacking Mondays]]&#039;&#039;&#039; - Learn to solder! Mitch will bring kits to make cool, hackable things that you can bring home after you make them.  Bring your own projects to hack!&lt;br /&gt;
** &#039;&#039;&#039;18:00 [[iPhone OS developer weekly meetup]]&#039;&#039;&#039; - we make teh applukashuns, joyn us 2 make dem 2! http://meetup.com/iphonedevsf&lt;br /&gt;
** 19:00 [[Linux System Administration Study Group]] - Study Linux admining in the Turing classroom.&lt;br /&gt;
** 19:00 1st and 3rd Mondays: the BACE Timebank group meets to help organize community mutual aid by trading in equal time credits, wherever there is space. For more info, email mira (at) sfbace.org; to join, go to timebank.sfbace.org&lt;br /&gt;
* &#039;&#039;&#039;Tuesday&#039;&#039;&#039;&lt;br /&gt;
** 15:00 [[Linux System Administration Study Group]] - Study Linux admining in the Turing classroom.&lt;br /&gt;
** 18:30 Bay Area Community Exchange Project Roundtable Meeting (third Tues. of every month) - discussion of alternative currencies in the back classroom.&lt;br /&gt;
** &#039;&#039;&#039;20:00 [[#Meetings|Noisebridge Weekly Meeting]]&#039;&#039;&#039; - Introducing new people and events to the space, general discussion, and decision making.&lt;br /&gt;
** 20:30 [[Spacebridge]] - Noisebridge&#039;s space program (project update meeting) &lt;br /&gt;
** 21:00 [[Machine Learning]] - Teach computers to learn stuff using artificial intelligence and other techniques.  (Formerly Wednesdays)&lt;br /&gt;
* &#039;&#039;&#039;Wednesday&#039;&#039;&#039;&lt;br /&gt;
** 18:00 [[LinuxDiscussion|Linux Discussion]] - Play with Linux in the Turing classroom.&lt;br /&gt;
** &#039;&#039;&#039;19:00 [[SCoW]]&#039;&#039;&#039; - Sewing, Crafting, Or Whatever! Come make cool stuff with geeks.&lt;br /&gt;
** &#039;&#039;&#039;19:00 [[NBShooters]]&#039;&#039;&#039; -  Digital Landscape and Artistic Photojournalism Class.  Bring your own Camera (BYOC).&lt;br /&gt;
** &#039;&#039;&#039;20:00 [[Gamebridge|Gamebridge Unityversity]]&#039;&#039;&#039; - Collab and learn to make video games with geeks, if it&#039;s your first night you will actually get to make a game!&lt;br /&gt;
* &#039;&#039;&#039;Thursday&#039;&#039;&#039;&lt;br /&gt;
** Anytime [https://www.noisebridge.net/wiki/House_Keeping#Trash_and_Recycling Trash Night]  - Don&#039;t forget to take out the trash for Friday morning!&lt;br /&gt;
** 19:30 [[Games]] - Play games with geeks.&lt;br /&gt;
** &#039;&#039;&#039;20:00 [[Five_Minutes_of_Fame | Five Minutes of Fame]]&#039;&#039;&#039; (3rd Thursdays)&lt;br /&gt;
** 20:00 [[Programming_for_Poets | Programming for Poets]] -  Gentle intro to programming using Processing (when no 5MoF)&lt;br /&gt;
* &#039;&#039;&#039;Friday&#039;&#039;&#039; &lt;br /&gt;
** 15:00 [[Linux System Administration Study Group]] - Study Linux admining in the Turing classroom. &lt;br /&gt;
** 19:00 [[Science, Engineering &amp;amp; Design Huddle]] - Weekly group to discuss design approach, share techniques, and solve any problem you may be having with your project(s).&lt;br /&gt;
** 20:00 [[Moving/2169 Mission/Buildout|2169 Buildout planning]] - Discussion &amp;amp; execution of how to renovate our new space.&lt;br /&gt;
** 18:00 [[RantMeet]] (1st Fridays) [http://www.rantmedia.ca Rant Media] is a global hacker/survival/indy media phyle that meets up around the world.&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Saturday&#039;&#039;&#039;&lt;br /&gt;
** 20:00 [[NSFW]] - Now Showing From the Web, last Saturday of the month. Share interesting videos you&#039;ve found on the web for the past month or bring content you made.&lt;br /&gt;
* &#039;&#039;&#039;Sunday&#039;&#039;&#039;&lt;br /&gt;
** 13:00 [[Cyborg Group|Cyborg Group / Sensebridge]] - Work on projects like artificial senses.&lt;br /&gt;
** 15:00 [[OpenEEG]] - how to read your mind.&lt;br /&gt;
** 15:00 [[KnotTyingWorkshop|Knot Tying Workshop]] - (1st Sunday of the month) - work on knotting projects and share ropework knowledge&lt;br /&gt;
** &#039;&#039;&#039;15:00 [[Go]]&#039;&#039;&#039; - Playing of the Go boardgame. On nice days we often take the boards to Dolores Park and play there.&lt;br /&gt;
** 15:00 [[Locks!]] - Lock sport, Sundays when there is demand. (See [[locks!]] for more information.)&lt;br /&gt;
** 17:00 [[Rsync Users Group]] - A twelve step program for those who have poor *nix habits.&lt;br /&gt;
** 18:00 [[Spacebridge]] - Noisebridge&#039;s space program&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;To be scheduled&#039;&#039;&#039;&lt;br /&gt;
** [[CNC Mill Workshop]] - Who wants to make stuff on the [[MaxNCMill]]?&lt;br /&gt;
** [[Math &amp;amp; Science Help]] - If you would like some math, science or engineering help, I&#039;m down to lend a hand.&lt;br /&gt;
&amp;lt;/onlyinclude&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Proposed Future Events and Classes ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
:19:00 [[German]] - Learn German, all levels. 7pm beginners, 8pm advanced. RSVP 24 hours in advance for the benefit of the instructor. Events ran May-November 2009 on Mondays. Currently on hiatus. Get on the mailing list.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Movie Night!]] - [[User:ThOMG|Thom]] wants to build community through nerdy sci-fi! (+Bill+Ted+Excellence++)&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Introduction to the AVR Microcontroller]] - [[User:Mightyohm|Jeff]] and [[User:Maltman23|Mitch]] are planning an introductory class for people wanting to make cool projects with AVRs.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Basic Chemistry Lab Techniques]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Cuddle Puddle for the Economy]] - Stress-hacking with informal massage exchange.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Milk and Cookies]] - Come read your favorite selections out loud. With Milk and Cookies (and yeah, probably beer too).&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Processing Workshop 2]] - [[User:Scmurray|Scott]] is interested in teaching this, and is busy thinking about what, where, when, why, and how.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Programming for Poets]] - For those who think they &amp;quot;can&#039;t program,&amp;quot; a gentle introduction via Processing&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;:  [[Hack your Hardware]] -- We call BS on &amp;quot;no user-serviceable parts inside&amp;quot;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Homebrew Instruction Class]] - The Wort (pt 1/3)&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Trip to Shooting Range]] - Field trip to a shooting range, to shoot guns.  Express interest at [[Trip to Shooting Range]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Surface Mount Soldering Workshop]] - Learn how to solder circuits with small surface-mount parts.  [[User:maltman23|Mitch Altman]] and Martin Bogomolni and others will show their tricks.  [[User:maltman23|Mitch]] will bring hackable kits that use surface-mount parts for you to solder.&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039; - [[Locksport and Lockpicking]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039; - [[Version control tutorial]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039; - [[Foreign language learning for rocket scientists]] - I&#039;m near-native (fool people when I try) in (French and) Japanese, and a pro trans/terpreter and will share my shortcuts (skill-order, vocab, speed/articulation, translation≅grammar). No expertise on tonal languages yet... so if you know how to remember tones or how tone-sandhi interacts with speed and/or how nuances of speaker attitude are expressed in them (what we do with rhythm/inflection/sentence-intonation and stress in Eng., and with particles and ??? in e.g. Cantonese) please chime in or call me (415-608-0564) so I can convey your wisdom. [also looking for a from-scratch Arabic partner]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Getting started with Arduino]]&lt;br /&gt;
:&#039;&#039;&#039;(TBD)&#039;&#039;&#039;: [[Distributed Databases]]&lt;br /&gt;
&lt;br /&gt;
= Past Events =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;June 5th, 12:00-19:00 - [[NoiseBridgeRehab]]&#039;&#039;&#039; - Help make the space more usable and accessible! Noisebridge needs your help!&lt;br /&gt;
* &#039;&#039;&#039;June 5th, 16:00-20:00 - [[Science For Juggalos]]&#039;&#039;&#039; - Science Fair in front of the Warfield Theater teaching magnetism to Juggalos&lt;br /&gt;
* &#039;&#039;&#039;June 6th, 15:00 - [[AVC Meetup]]&#039;&#039;&#039; - Entrepreneurial bonding &amp;amp; matchmaking&lt;br /&gt;
* &#039;&#039;&#039;June 9th, 21:00 - Your liver supports Noisebridge&#039;&#039;&#039; - Come to Elixir @ 16th &amp;amp; Guerrero anytime after 21:00 and drink, drink, drink! 50% of tips go to Noisebridge&lt;br /&gt;
* &#039;&#039;&#039;February 27th, 20:00 - [[Hacker EPROM]]&#039;&#039;&#039; - Noisebridge&#039;s first prom! Nice tie and a (robot) date required. We will have a DJ and punch.&lt;br /&gt;
* &#039;&#039;&#039;February 24th, 19:00, Wednesday - Joris Peels, of [http://www.shapeways.com Shapeways]&#039;&#039;&#039;, an expert on 3D printing, will give a [[ShaperwaysPresentation | talk and demonstration]] at Noisebridge!&lt;br /&gt;
* &#039;&#039;&#039;February 23rd, 18:00 - Cleaning day&#039;&#039;&#039; - Come and help clean Noisebridge, because everyone loves a clean hack space.&lt;br /&gt;
* &#039;&#039;&#039;February 12th, 21:00 - visit from Steve Jackson&#039;&#039;&#039;. Game designer [http://en.wikipedia.org/wiki/Steve_Jackson_%28US_game_designer%29 Steve Jackson], founder of Steve Jackson Games, will visit Noisebridge.&lt;br /&gt;
* &#039;&#039;&#039;January 27th, 18:00-20:00 - [[beatrixjar event|Circuit Bending Workshop]]&#039;&#039;&#039; - [http://www.beatrixjar.com/ Beatrix*JAR] (contact [[User:Gpvillamil|Gian Pablo]] for more info)&lt;br /&gt;
* &#039;&#039;&#039;January 27th, 20:00-22:00 - [[beatrixjar event|Circuit Bending Performance]]&#039;&#039;&#039; - [http://www.beatrixjar.com/ Beatrix*JAR] - &amp;quot;Celebrate a night of new sound that will change your idea of music forever!&amp;quot;&lt;br /&gt;
* &#039;&#039;&#039;January 25th, 19:30 - [[Bag Porn]]&#039;&#039;&#039; - What&#039;s in your bag?&lt;br /&gt;
* &#039;&#039;&#039;January 20th, 19:00-21:00 - [http://groups.google.com/group/bacat/about Bay Categories &amp;amp; Types]&#039;&#039;&#039; - Categories, monoids, monads, functors and more! Held in the Alonzo Church classroom.&lt;br /&gt;
* &#039;&#039;&#039;January 20th, 19:00 - [[User Experience Book Club SF]]&#039;&#039;&#039; - Our book this month is &amp;quot;A Theory of Fun for Game Design&amp;quot; by Raph Koster - http://is.gd/6sEqw (meets in Turing)&lt;br /&gt;
* &#039;&#039;&#039;January 21st, 20:00 - [[Five Minutes of Fame]]&#039;&#039;&#039; - Monthly set of lightning talks on diverse topics&lt;br /&gt;
* &#039;&#039;&#039;January 22nd, 17:00 - [[CleaningParty| Cleaning Party]]&#039;&#039;&#039; - Come help clean up Noisebridge! Awsum fun!&lt;br /&gt;
* &#039;&#039;&#039;January 14th, 16th, and 17th, 1:00-???&#039;&#039;&#039; - Build Out day for the kitchen/bathroom/laundry; bring yourself and a good attitude, and learn a few things as well&lt;br /&gt;
* &#039;&#039;&#039;January 15th, 18:00 - [[CNC_Mill_Workshop]]&#039;&#039;&#039; - Learn to use the CNC mill for 2D engraving and circuit board routing&lt;br /&gt;
* Thursdays 17:00 [[ASL Group|American Sign Language]] - Learn how to talk without using your voice (or just come chat in ASL). &amp;lt;small&amp;gt;[http://whenisgood.net/noisebridge/asl/generic click to reschedule]&amp;lt;/small&amp;gt;&lt;br /&gt;
* &#039;&#039;&#039;November 18th, 19:30&#039;&#039;&#039; - [[Dorkbot_2009_11_18|Dorkbot]]&lt;br /&gt;
* &#039;&#039;&#039;November 19th, 18:00&#039;&#039;&#039; - [[Mesh meetup]]&lt;br /&gt;
* &#039;&#039;&#039;November 19th, 20:00&#039;&#039;&#039; - [[Five Minutes of Fame]]&lt;br /&gt;
* &#039;&#039;&#039;November 20th, 18:00&#039;&#039;&#039; - Loud Objects [http://www.flickr.com/photos/createdigitalmedia/3428249036/ Noise Toy workshop].&lt;br /&gt;
* &#039;&#039;&#039;November 20th, 20:00&#039;&#039;&#039; - Performance by [http://www.loudobjects.com/ Loud Objects], (featuring Tristan Perich and Lesley Flanigan) and [http://www.myspace.com/jibkidder Jib Kidder].&lt;br /&gt;
:&#039;&#039;&#039;2009-11-05&#039;&#039;&#039; - [http://www.server-sky.com/ Server Sky presentation: Internet and Computation in Orbit] by Keith Lofstrom&lt;br /&gt;
:&#039;&#039;&#039;2009-11-05&#039;&#039;&#039; - [[Mesh meetup]]&lt;br /&gt;
:&#039;&#039;&#039;2009-11-02&#039;&#039;&#039; - [[French]] book club meeting to discuss  [http://www.amazon.com/exec/obidos/tg/detail/-/2842612892/ref=ord_cart_shr?_encoding=UTF8&amp;amp;m=ATVPDKIKX0DER&amp;amp;v=glance Une Si Longue Lettre]&lt;br /&gt;
: &#039;&#039;&#039; October 1st, 18:00&#039;&#039;&#039; - [[Wireless_Mesh_Network_Meetup | Mesh wireless meetup]]&lt;br /&gt;
: &#039;&#039;&#039; October 1st, 19:00&#039;&#039;&#039; - [http://groups.google.com/group/bacat Bay Area Categories and Types]&lt;br /&gt;
: &#039;&#039;&#039;2009-10-03&#039;&#039;&#039; [[Year 1 Open Hacker House]]&lt;br /&gt;
:&#039;&#039;&#039;Friday&#039;&#039;&#039;: [[CrazyCryptoNight]] - Discussion of cryptography for beginners through experts. 6-???&lt;br /&gt;
:&#039;&#039;&#039;Sunday&#039;&#039;&#039; : [[OpenEEG | OpenEEG Hacking]] Sundays, at 3-5pm.&lt;br /&gt;
:&#039;&#039;&#039;Tuesday&#039;&#039;&#039;: [[Haskell/Haschool]] - Learn Haskell with Jason Dusek.  6PM - 7:30PM, from May until we&#039;re all experts.&lt;br /&gt;
:&#039;&#039;&#039;Wednesday&#039;&#039;&#039;: [[Adobe_Lightroom|Adobe Lightroom]] - Become a more organized photographer. Weekly class (mostly held off site).&lt;br /&gt;
:&#039;&#039;&#039;Thursday&#039;&#039;&#039;: [[Professional VFX Compositing With Adobe After Effects]] - Taught by [[User:SFSlim|Aaron Muszalski]]. 7:30PM - 10PM, most Thursdays in May &amp;amp; June &amp;amp; ? (click through dammit)&lt;br /&gt;
:&#039;&#039;&#039;2009-09-17&#039;&#039;&#039;: [[Five Minutes of Fame]] 3D Edition&lt;br /&gt;
:&#039;&#039;&#039;2009-09-17&#039;&#039;&#039;: [[Wireless Mesh Network Meetup | Mesh wireless meetup]]&lt;br /&gt;
:&#039;&#039;&#039;2009-08-20&#039;&#039;&#039;: [[Five Minutes of Fame]] One Dee Edition&lt;br /&gt;
:&#039;&#039;&#039;2009-07-16&#039;&#039;&#039;: [[Five Minutes of Fame]] Zero Dee&lt;br /&gt;
:&#039;&#039;&#039;2009-07-02 - 2009-07-05&#039;&#039;&#039;: [http://toorcamp.org Toorcamp]&lt;br /&gt;
:&#039;&#039;&#039;2009-07-01&#039;&#039;&#039;: Noisedroid meeting to discuss location logging on Android platform (and other stuff too, I&#039;m sure)&lt;br /&gt;
:&#039;&#039;&#039;2009-06-30&#039;&#039;&#039;: [[Powerbocking Class|Powerbocking class]]&lt;br /&gt;
:&#039;&#039;&#039;2009-06-30&#039;&#039;&#039;: &amp;quot;Suing Telemarketers for Fun and Profit&amp;quot; (Toorcamp talk preview)&lt;br /&gt;
:&#039;&#039;&#039;2009-06-28&#039;&#039;&#039;: &amp;quot;Meditation for Hackers&amp;quot; (Toorcamp workshop preview)&lt;br /&gt;
:&#039;&#039;&#039;2009-06-18&#039;&#039;&#039;: [[Five Minutes of Fame]]&lt;br /&gt;
:&#039;&#039;&#039;2009-06-15&#039;&#039;&#039;: [[Eagle Workshop]]  Session two of the Eagle CAD workshop.&lt;br /&gt;
:&#039;&#039;&#039;2009-06-13&#039;&#039;&#039;: [[RoboGames 2009]] Noisebridge had a booth staffed by volunteers, great fun!&lt;br /&gt;
:&#039;&#039;&#039;2009-05-21&#039;&#039;&#039;: [[Five Minutes of Fame]]&lt;br /&gt;
:&#039;&#039;&#039;2009-04-27&#039;&#039;&#039;: [[EagleCAD workshop]] -- learn to use this CAD tool for printed circuit board design&lt;br /&gt;
:&#039;&#039;&#039;2009-04-16&#039;&#039;&#039;: [[Five Minutes of Fame]] April showers &amp;amp; flowers edition&lt;br /&gt;
:&#039;&#039;&#039;2009-04-11&#039;&#039;&#039;: [[RFID Hacking]] weekend workshop  (this event moved from the original March date)&lt;br /&gt;
:&#039;&#039;&#039;2009-04-05&#039;&#039;&#039;: [[First aid and CPR class]] Learning how to not only not die, but also reduce scarring!&lt;br /&gt;
:&#039;&#039;&#039;2009-04-03&#039;&#039;&#039;: [[Sudo pop]] 2PM and on. Making the first batch of a Noisebridge label yerba mate-niated rootbrew, gratis and DIY&lt;br /&gt;
:&#039;&#039;&#039;2009-03-26&#039;&#039;&#039;: [[OpenEEG | OpenEEG Hacking]] first meet up for this new group: 8 pm&lt;br /&gt;
:&#039;&#039;&#039;2009-03-19&#039;&#039;&#039;: [[Five Minutes of Fame]]&lt;br /&gt;
:&#039;&#039;&#039;2009-03-12&#039;&#039;&#039;: [[OpenBTS and GSM]] talk by David Burgess&lt;br /&gt;
:&#039;&#039;&#039;2009-02-14&#039;&#039;&#039;: [[Open Heart Workshop]] Valentine&#039;s Day blinkyheart soldering party! &lt;br /&gt;
:&#039;&#039;&#039;2009-02-13&#039;&#039;&#039;: [[Time-t_Party|&amp;lt;tt&amp;gt;time_t&amp;lt;/tt&amp;gt; Party]] to celebrate 1,234,567,890 seconds since the Unix epoch.&lt;br /&gt;
:&#039;&#039;&#039;2009-02-09&#039;&#039;&#039;: [[Spanish learning at 8:30]]&lt;br /&gt;
:&#039;&#039;&#039;2009-02-05&#039;&#039;&#039;: [[PGP Key Workshop]]&lt;br /&gt;
:&#039;&#039;&#039;2009-01-31&#039;&#039;&#039;: [[Locksport and Lockpicking]]&lt;br /&gt;
:&#039;&#039;&#039;2008-12-27&#039;&#039;&#039;: [[25C3]] Chaos Computer Congress in Berlin&lt;br /&gt;
:&#039;&#039;&#039;2008-12-20 &amp;amp; 21&#039;&#039;&#039;: [[Creme Brulee]] Workshop on creating a French dessert, with bonus propane torch.&lt;br /&gt;
:&#039;&#039;&#039;2008-12-17 20:00&#039;&#039;&#039;: [[Machine Learning]] Birds-of-a-feather&lt;br /&gt;
:&#039;&#039;&#039;2008-11-24&#039;&#039;&#039;: [[Circuit Hacking Monday]] circuit design workshop&lt;br /&gt;
:&#039;&#039;&#039;2008-11-21, 7pm&#039;&#039;&#039;:[[Milk and Cookies]] -- [[User:Dmolnar|David Molnar]] hosts Milk and Cookies at 83C. Bring a short 5-7minute thing to read to others. Bring a potluck cookie/snack/drink if you like. David will bring milk and cookies.&lt;br /&gt;
:&#039;&#039;&#039;2008-11-17, 7:30pm&#039;&#039;&#039;: [[Basic Bicycle Maintain|Basic Bicycle Maintenance]] - [[User:rubin110|Rubin]] and [[User:rigel|rigel]] hate it when we see a bike that isn&#039;t maintained. Screechy chains and clacking derailleurs can go to hell. Basic bike tune-up, sharing the smarts on simple things you can do at home to make your ride suck a whole lot less.&lt;br /&gt;
:&#039;&#039;&#039;2008-11-16, 5:00pm&#039;&#039;&#039;: [[RepRap Soldering Party]] - help assemble RepRap!  RSVPs required on wiki! [[User:Adi|adi]]&lt;br /&gt;
:&#039;&#039;&#039;2008-11-16, 3:00pm&#039;&#039;&#039;: [[Oscilloscopes]] - Learn how to use this versatile tool to test electronic circuits.  Maximum 6 slots, please sign up ahead of time! [[User:dstaff|dstaff]]&lt;br /&gt;
:&#039;&#039;&#039;2008-10-31&#039;&#039;&#039;: [[Halloween Open House]] - NoiseBridge&#039;s own [[PPPC]] threw an awesome open house/halloween gala. Post pictures if you got &#039;em!&lt;br /&gt;
:&#039;&#039;&#039;2008-10-25&#039;&#039;&#039;: [[Soldering Workshop]] and Pumpkin Hackin&#039; - Learn to solder for total newbies (or learn to solder better!), including surface mount. Additionally, carve your halloween pumpkins and enjoy some experimental pumpkin pie and/or soup.&lt;br /&gt;
:&#039;&#039;&#039;2008-10-07&#039;&#039;&#039;: (tuesday before meeting) - Etch a circuit board. I&#039;ll be trying a photo resist etching and a basic printed mask etching. This is step 1/3 for a project called &amp;quot;annoying USB thingie&amp;quot; which will execute pre-defined keystrokes by sneaking a tiny USB dongle onto a victim^h^h^h^h^h buddy&#039;s computer.&lt;br /&gt;
:&#039;&#039;&#039;2008-09-13&#039;&#039;&#039;: [[Processing Workshop]] — Learn this very easy-to-use programming language! - [[Processing Workshop Report]]&lt;br /&gt;
:&#039;&#039;&#039;2008-02-16&#039;&#039;&#039;: [[Brain Machine Workshop|Brain Machine Making Workshop]]: Our first hardware sprint!&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/moa&amp;diff=11789</id>
		<title>Machine Learning/moa</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/moa&amp;diff=11789"/>
		<updated>2010-06-17T03:14:10Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Setup Instructions==&lt;br /&gt;
* Download and unzip http://thomaslotze.com/kdd/moa_prep.tgz&lt;br /&gt;
OR&lt;br /&gt;
* Create a directory to run your moa programs from; we&#039;ll assume it is ~/moa&lt;br /&gt;
* Download the moa release .tar.gz file from http://sourceforge.net/projects/moa-datastream/ and extract it&lt;br /&gt;
** copy moa.jar into ~/moa&lt;br /&gt;
* Download the weka release .zip file from http://sourceforge.net/projects/weka/ and extract it&lt;br /&gt;
** copy weka.jar into ~/moa&lt;br /&gt;
* Download http://jroller.com/resources/m/maxim/sizeofag.jar and copy it into ~/moa&lt;br /&gt;
&lt;br /&gt;
==Training MOA models==&lt;br /&gt;
* Your data will need to be in [http://www.cs.waikato.ac.nz/~ml/weka/arff.html ARFF format]&lt;br /&gt;
* To compare models, you can run prequential evaluation with different classifiers and look at their reported performance; for example,&lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;EvaluatePrequential -l NaiveBayes -s (ArffFileStream -f atrain.arff -c -1) -O amodel_bayes.moa&amp;quot;&lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;EvaluatePrequential -l HoeffdingTree -s (ArffFileStream -f atrain.arff -c -1) -O amodel_hoeffding.moa&amp;quot;&lt;br /&gt;
* To actually generate the final model, you can run a command line like the following: &lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;LearnModel -l NaiveBayes -s (ArffFileStream -f atrain.arff -c -1) -O amodel_bayes.moa&amp;quot;&lt;br /&gt;
&lt;br /&gt;
==Generating MOA model predictions==&lt;br /&gt;
To generate predictions for a test set, your test set must be in ARFF format, with the same columns as the training data (including the output class; I just set this to all 0&#039;s).&lt;br /&gt;
&lt;br /&gt;
To do this, you will also need the moa_personal.jar file in the same directory as your other jar files; you can get all the jar files needed from http://thomaslotze.com/kdd/jarfiles.tgz&lt;br /&gt;
&lt;br /&gt;
You can then run the following (after generating a model using the above steps)&lt;br /&gt;
 java -cp .:moa_personal.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;EvaluateModel -e BasicLoggingClassificationPerformanceEvaluator -m file:amodel_bayes.moa -s (ArffFileStream -f atest.arff -c -1)&amp;quot; &amp;gt; a_bayes_predicted.txt&lt;br /&gt;
&lt;br /&gt;
This generates a comma-separated file, which contains the item number as the first column and the probability of class 1 (in our case, cfa=1) as the second column.&lt;br /&gt;
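If you want to work with the predictions in Python afterwards, a small loader is enough; &amp;lt;tt&amp;gt;read_predictions&amp;lt;/tt&amp;gt; below is just an illustrative helper, not part of MOA:&lt;br /&gt;

```python
import csv

def read_predictions(path):
    # Illustrative helper (not part of MOA): load the two-column output
    # file into a dict of {item number: probability of class 1}.
    with open(path) as f:
        return {int(item): float(prob) for item, prob in csv.reader(f)}
```

For example, read_predictions(&amp;quot;a_bayes_predicted.txt&amp;quot;) would give you the per-item probabilities from the command above.&lt;br /&gt;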
&lt;br /&gt;
Thomas is going to develop the evaluator to be more general and robust, and hopefully submit it back for inclusion in the main MOA trunk.  Right now, it will only work for examples with two classes.&lt;br /&gt;
&lt;br /&gt;
==Other Resources==&lt;br /&gt;
* MOA site: http://www.cs.waikato.ac.nz/~abifet/MOA/&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/Kaggle_HIV&amp;diff=11708</id>
		<title>Machine Learning/Kaggle HIV</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/Kaggle_HIV&amp;diff=11708"/>
		<updated>2010-06-10T04:37:10Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;http://kaggle.com/hivprogression&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;http://kaggle.com/hivprogression&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11707</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11707"/>
		<updated>2010-06-10T04:37:00Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Meeting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 6/16/2010 @ 8:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Hadoop tutorial, How to use MOA, discussion of [[Machine Learning/Kaggle HIV | HIV competition]]&lt;br /&gt;
*Presenter: Vikram, Thomas, group&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: gaussian distribution, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free-Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout] a Hadoop cluster based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka] a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Possible Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[Online Optimization &amp;amp; Machine Learning Toolkit]]&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11706</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11706"/>
		<updated>2010-06-10T04:36:18Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 6/16/2010 @ 8:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: How to use MOA, discussion of HIV competition&lt;br /&gt;
*Presenter: Thomas, group&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
*[[Machine Learning/Kaggle HIV | HIV]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: gaussian distribution, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free-Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout] a Hadoop cluster based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka] a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Possible Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[Online Optimization &amp;amp; Machine Learning Toolkit]]&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/Hadoop&amp;diff=11704</id>
		<title>Machine Learning/Hadoop</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/Hadoop&amp;diff=11704"/>
		<updated>2010-06-10T04:05:21Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;===About===&lt;br /&gt;
* Google had so much data that &#039;&#039;reading&#039;&#039; it from disk took a long time, let alone processing it&lt;br /&gt;
** So they needed to parallelize everything, even disk access&lt;br /&gt;
** Make the processing local to where the data is, to avoid network issues&lt;br /&gt;
* Parallelization is hard/error-prone&lt;br /&gt;
** Want to have a &amp;quot;shared-nothing&amp;quot; architecture&lt;br /&gt;
** Functional programming&lt;br /&gt;
* Map&lt;br /&gt;
Runs the function on each item in the list and returns the list of results&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def map(func, list):&lt;br /&gt;
  return [func(item) for item in list]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def twice(num):&lt;br /&gt;
  return num*2&lt;br /&gt;
&lt;br /&gt;
map(twice, [1, 2, 3])  # returns [2, 4, 6]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* Reduce&lt;br /&gt;
Takes a function of two arguments and a list, and iterates through the list, accumulating a running result&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def reduce(func, list):&lt;br /&gt;
  acc = func(list[0], list[1])&lt;br /&gt;
  for item in list[2:]:&lt;br /&gt;
    acc = func(acc, item)&lt;br /&gt;
  return acc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Examples/Actual===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def map(key,value):&lt;br /&gt;
  # process&lt;br /&gt;
  emit(another_key, another_value)&lt;br /&gt;
def reduce(key, values):&lt;br /&gt;
  # process the key and all values associated with it&lt;br /&gt;
  emit(something)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Average&lt;br /&gt;
** keys are line numbers, values are the numbers on those lines&lt;br /&gt;
** file:&lt;br /&gt;
*** 1  (1,2)&lt;br /&gt;
*** 4  (2,4)&lt;br /&gt;
*** 5  (3,5)&lt;br /&gt;
*** 6  (4,6)&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def map(key,value):&lt;br /&gt;
  emit(&amp;quot;exist&amp;quot;,1)&lt;br /&gt;
  emit(&amp;quot;x&amp;quot;,value)&lt;br /&gt;
def reduce(key, values):&lt;br /&gt;
  # sum the counts (key &amp;quot;exist&amp;quot;) or the values (key &amp;quot;x&amp;quot;);&lt;br /&gt;
  # the average is then sum(&amp;quot;x&amp;quot;) / sum(&amp;quot;exist&amp;quot;)&lt;br /&gt;
  emit(key, sum(values))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
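Putting the pieces together, here is a self-contained sketch (plain Python, no Hadoop) of the average computation, using the (line number, value) pairs shown above; the grouping function stands in for the framework&#039;s shuffle phase:&lt;br /&gt;

```python
from collections import defaultdict

def map_avg(key, value):
    # One record -> a count under key "exist" and the value itself under key "x".
    return [("exist", 1), ("x", value)]

def shuffle(pairs):
    # Stand-in for the framework's shuffle: group emitted values by key.
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_sum(key, values):
    return (key, sum(values))

records = {1: 2, 2: 4, 3: 5, 4: 6}  # line number -> value, as in the file above
emitted = [pair for k, v in records.items() for pair in map_avg(k, v)]
sums = dict(reduce_sum(k, vs) for k, vs in shuffle(emitted).items())
average = sums["x"] / sums["exist"]  # 17 / 4 = 4.25
```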
&lt;br /&gt;
===Tutorials===&lt;br /&gt;
* http://www.cloudera.com/videos/introduction_to_pig&lt;br /&gt;
&lt;br /&gt;
===How to Debug===&lt;br /&gt;
* To debug a streaming Hadoop process, cat your source file, pipe it to the mapper, then to sort, then to the reducer&lt;br /&gt;
** Ex: &amp;lt;tt&amp;gt;cat princess_bride.txt | scripts/word-count/mapper.py | sort | scripts/word-count/reducer.py&amp;lt;/tt&amp;gt;&lt;br /&gt;
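The word-count scripts referenced above aren&#039;t reproduced here; a minimal sketch of what such a mapper/reducer pair might look like (folded into one file for illustration; in practice the two halves live in separate scripts that read stdin and print to stdout):&lt;br /&gt;

```python
def map_words(lines):
    # Mapper half: emit "word<TAB>1" for every word on every input line.
    return [f"{word}\t1" for line in lines for word in line.split()]

def reduce_counts(pairs):
    # Reducer half: assumes input sorted by word (that is what `sort`
    # provides in the debug pipeline); sums the count over each run
    # of identical words and emits "word<TAB>total".
    out, current, total = [], None, 0
    for pair in pairs:
        word, count = pair.split("\t")
        if word != current:
            if current is not None:
                out.append(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        out.append(f"{current}\t{total}")
    return out

# Mimics the debug pipeline: cat file | mapper | sort | reducer
counts = reduce_counts(sorted(map_words(["the cat the", "cat"])))
```

Running the two halves over a file and piping through sort in between gives the same result as the cat/mapper/sort/reducer pipeline above.&lt;br /&gt;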
&lt;br /&gt;
===Tools===&lt;br /&gt;
* Hadoop&lt;br /&gt;
* Hive&lt;br /&gt;
* Pig: A high-level language that compiles down to MapReduce programs&lt;br /&gt;
* MapReduce on Amazon (?)&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11703</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11703"/>
		<updated>2010-06-10T04:03:18Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Next Meeting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 6/16/2010 @ 8:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: How to use MOA, discussion of HIV competition&lt;br /&gt;
*Presenter: Thomas, group&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**Clustering: PCA, k-Means, Expectation-Maximization&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: gaussian distribution, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDK&#039;s&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout] a Hadoop cluster based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka] a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Possible Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[Online Optimization &amp;amp; Machine Learning Toolkit]]&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-06-02]] -- Final official meeting before KDD submission deadline&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-26]] -- Clustering, KDD Data Reduction&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-23]] -- Unofficial meetup to nail down KDD cup problem set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-19]] -- Presentation on Hadoop and MapReduce&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=KDD_Competition_2010&amp;diff=11650</id>
		<title>KDD Competition 2010</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=KDD_Competition_2010&amp;diff=11650"/>
		<updated>2010-06-07T00:35:30Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We&#039;re interested in working on the KDD Competition as a way to focus our machine learning exploration -- and maybe even find some interesting aspects of the data!  If you&#039;re interested, drop us a note, show up at a weekly Machine Learning meeting, and we&#039;ll use this space to keep track of our ideas.&lt;br /&gt;
&lt;br /&gt;
==Resources==&lt;br /&gt;
* [[Machine Learning]]&lt;br /&gt;
* [https://pslcdatashop.web.cmu.edu/KDDCup/rules_data_format.jsp KDD Rules and Data Format]&lt;br /&gt;
* [http://cran.r-project.org/ R language]&lt;br /&gt;
* [http://www.csie.ntu.edu.tw/~cjlin/libsvm/ libsvm]&lt;br /&gt;
* [http://www.cs.waikato.ac.nz/ml/weka/ Weka]&lt;br /&gt;
* [http://www.kdnuggets.com/datasets/competitions.html List of other competitions in which we could engage]&lt;br /&gt;
* [[Machine Learning/Hadoop | Hadoop]]&lt;br /&gt;
* [http://lucene.apache.org/mahout/ Mahout -- machine learning libraries for Hadoop]&lt;br /&gt;
* [http://www.cloudera.com/videos/introduction_to_pig So-so intro to Pig Video]&lt;br /&gt;
* [http://s3.amazonaws.com/awsVideos/AmazonElasticMapReduce/ElasticMapReduce-PigTutorial.html An AWESOME intro to Pig on Elastic Map Reduce!]&lt;br /&gt;
* [http://hadoop.apache.org/pig/ Pig language]&lt;br /&gt;
* [http://hadoop.apache.org/pig/docs/r0.3.0/piglatin.html Pig Latin Manual]&lt;br /&gt;
* [http://www.cloudera.com/ Cloudera -- see videos for Hadoop intro]&lt;br /&gt;
* [http://github.com/voberoi/hadoop-mrutils Vikram&#039;s awesome Hadoop/EC2 scripts]&lt;br /&gt;
* [https://www.noisebridge.net/mailman/listinfo/ml Our mailing list]&lt;br /&gt;
* [http://www.s3fox.net/ S3Fox]&lt;br /&gt;
* [[Machine_Learning/SqliteImport | Importing data into Sqlite]] for SQL&#039;ing the data&lt;br /&gt;
* [[Machine_Learning/OmniscopeVisualization | Visualizing Sqlite data in Omniscope]] for understanding the data&lt;br /&gt;
* [http://swarmfinancial.com/dumps.zip Datasets for Thomas to merge with Chance!!!]&lt;br /&gt;
&lt;br /&gt;
==TODOs==&lt;br /&gt;
&lt;br /&gt;
* Vikram -- will create a guide for Mahout setup &lt;br /&gt;
* Thomas -- Attempt clustering skills (subskills, traced skills and rules) using Mahout&lt;br /&gt;
** put together a [[Machine_Learning/kdd_sample | perl script]] which will take random samples from the data, for working on smaller instances&lt;br /&gt;
** put together a [[Machine_Learning/kdd_r | simple R script]] for loading the data&lt;br /&gt;
* Andy --  define features for sub-problems (student iq, step difficulty); Do remaining feature transforms: Replace step name with unique step name; remove given features; add features: step success chance, student IQ, complexity&lt;br /&gt;
* Erin -- &lt;br /&gt;
* Paul -- Create overview of the data: histograms, notable features etc. Visualization? &lt;br /&gt;
&lt;br /&gt;
== Notes ==&lt;br /&gt;
* For the KDD submission: zip the submission file on OS X from the command line; otherwise you&#039;ll get complaints about the __MACOSX file. E.g.:  zip asdf.zip algebra_2008_2009_submission.txt&lt;br /&gt;
* We will need to make sure we don&#039;t get disqualified for people belonging to multiple teams! Do not sign up anybody else for the competition without asking first.&lt;br /&gt;
&lt;br /&gt;
== Ideas == &lt;br /&gt;
* Add new features by computing their values from existing columns -- e.g. correlation between skills based on their co-occurrence within problems. Could use a decision tree to define boundaries for, e.g., a new &amp;quot;good student, medium student, bad student&amp;quot; feature&lt;br /&gt;
* Dimensionality reduction -- transform into numerical values appropriate for consumption by SVM&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Who we are ==&lt;br /&gt;
* Andy; Machine Learning&lt;br /&gt;
* Paul; Machine Learning&lt;br /&gt;
* Thomas; Statistics&lt;br /&gt;
* Erin; Maths&lt;br /&gt;
* Vikram; Hadoop&lt;br /&gt;
(insert your name/contact info/expertise here)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== How to run Weka (quick &#039;n very dirty tutorial) == &lt;br /&gt;
* Download and install Weka&lt;br /&gt;
* Get your KDD data &amp;amp; preprocess it: &lt;br /&gt;
This command takes the first 1000 lines of the given training data set and converts them into a .csv file.&lt;br /&gt;
Attention: in the last sed command, you need to type a literal tab in place of the long whitespace. In the OS X terminal, do that by pressing CONTROL+V and then Tab. (Copying and pasting the command below won&#039;t work, since the whitespace is interpreted as plain spaces.)&lt;br /&gt;
 head -n 1000 algebra_2006_2007_train.txt | sed -e &#039;s/[&amp;quot;,]/ /g&#039; | sed &#039;s/       /,/g&#039; &amp;gt; algebra_2006_2007_train_1kFormatted.csv&lt;br /&gt;
* The screencasts linked below show how to do these steps:&lt;br /&gt;
* In Weka&#039;s Explorer, remove some unwanted attributes (I leave this up to your judgment), inspect the dataset. &lt;br /&gt;
* Then you can run a ML algorithm over it, e.g. Neural Networks to predict the student performance.&lt;br /&gt;
* [http://swarmfinancial.com/screencasts/nb/kddWekaUsage1.swf Screencast1]&lt;br /&gt;
* [http://swarmfinancial.com/screencasts/nb/kddWekaUsage2.swf Screencast2]&lt;br /&gt;
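The head/sed conversion above can also be done as a short script, which sidesteps the literal-tab problem; a Python sketch under the same assumptions (first 1000 lines, quotes and commas blanked out, tabs become commas; function and file names are illustrative):&lt;br /&gt;

```python
import csv

def tsv_head_to_csv(in_path, out_path, n=1000):
    # take the first n lines of a tab-separated file and write them
    # as CSV, first blanking quotes and commas as the sed command does
    with open(in_path) as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for i, line in enumerate(src):
            if i >= n:
                break
            cleaned = line.rstrip("\n").replace('"', " ").replace(",", " ")
            writer.writerow(cleaned.split("\t"))
```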
&lt;br /&gt;
== How to run libSVM ==&lt;br /&gt;
* See the notes at [[Machine Learning/SVM]]&lt;br /&gt;
&lt;br /&gt;
== How to run MOA ==&lt;br /&gt;
* See the notes at [[Machine Learning/moa]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/moa&amp;diff=11649</id>
		<title>Machine Learning/moa</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/moa&amp;diff=11649"/>
		<updated>2010-06-07T00:34:39Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Generating MOA model predictions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Setup Instructions==&lt;br /&gt;
* Create a directory to run your moa programs from; we&#039;ll assume it is ~/moa&lt;br /&gt;
* Download the moa release .tar.gz file from http://sourceforge.net/projects/moa-datastream/ and extract it&lt;br /&gt;
** copy moa.jar into ~/moa&lt;br /&gt;
* Download the weka release .zip file from http://sourceforge.net/projects/weka/ and extract it&lt;br /&gt;
** copy weka.jar into ~/moa&lt;br /&gt;
* Download http://jroller.com/resources/m/maxim/sizeofag.jar and copy it into ~/moa&lt;br /&gt;
&lt;br /&gt;
==Training MOA models==&lt;br /&gt;
* Your data will need to be in [http://www.cs.waikato.ac.nz/~ml/weka/arff.html ARFF format]&lt;br /&gt;
* To evaluate the performance of different models, you can run varying prequential classifiers and look at their performance; for example,&lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;EvaluatePrequential -l NaiveBayes -s (ArffFileStream -f atrain.arff -c -1) -O amodel_bayes.moa&amp;quot;&lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;EvaluatePrequential -l HoeffdingTree -s (ArffFileStream -f atrain.arff -c -1) -O amodel_hoeffding.moa&amp;quot;&lt;br /&gt;
* To actually generate the final model, you can run a command line like the following: &lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;LearnModel -l NaiveBayes -s (ArffFileStream -f atrain.arff -c -1) -O amodel_bayes.moa&amp;quot;&lt;br /&gt;
&lt;br /&gt;
==Generating MOA model predictions==&lt;br /&gt;
To generate predictions for a test set, you will need your test set to be in ARFF format, with the same columns as the training data (including output class; I just set this to all-0&#039;s)&lt;br /&gt;
&lt;br /&gt;
To do this, you will also need the moa_personal.jar file in the same directory as your other jar files; you can get all the jar files needed from http://thomaslotze.com/kdd/jarfiles.tgz&lt;br /&gt;
&lt;br /&gt;
You can then run the following (after generating a model using the above steps)&lt;br /&gt;
 java -cp .:moa_personal.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;EvaluateModel -e BasicLoggingClassificationPerformanceEvaluator -m file:amodel_bayes.moa -s (ArffFileStream -f atest.arff -c -1)&amp;quot; &amp;gt; a_bayes_predicted.txt&lt;br /&gt;
&lt;br /&gt;
This generates a comma-separated file, which contains the item number as the first column and the probability of class 1 (in our case, cfa=1) as the second column&lt;br /&gt;
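Reading that output back for later use is straightforward; a sketch (function name and path handling are illustrative) mapping item number to the predicted probability of cfa=1:&lt;br /&gt;

```python
import csv

def load_predictions(path):
    # each row: item number, probability of class 1 (here, cfa = 1)
    with open(path) as f:
        return {int(row[0]): float(row[1]) for row in csv.reader(f)}
```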
&lt;br /&gt;
Thomas is going to develop the evaluator to be more general and robust, and hopefully submit it back for inclusion in the main MOA trunk.  Right now, it will only work for examples with two classes.&lt;br /&gt;
&lt;br /&gt;
==Other Resources==&lt;br /&gt;
* MOA site: http://www.cs.waikato.ac.nz/~abifet/MOA/&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/moa&amp;diff=11648</id>
		<title>Machine Learning/moa</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/moa&amp;diff=11648"/>
		<updated>2010-06-07T00:30:32Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Training MOA models */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Setup Instructions==&lt;br /&gt;
* Create a directory to run your moa programs from; we&#039;ll assume it is ~/moa&lt;br /&gt;
* Download the moa release .tar.gz file from http://sourceforge.net/projects/moa-datastream/ and extract it&lt;br /&gt;
** copy moa.jar into ~/moa&lt;br /&gt;
* Download the weka release .zip file from http://sourceforge.net/projects/weka/ and extract it&lt;br /&gt;
** copy weka.jar into ~/moa&lt;br /&gt;
* Download http://jroller.com/resources/m/maxim/sizeofag.jar and copy it into ~/moa&lt;br /&gt;
&lt;br /&gt;
==Training MOA models==&lt;br /&gt;
* Your data will need to be in [http://www.cs.waikato.ac.nz/~ml/weka/arff.html ARFF format]&lt;br /&gt;
* To evaluate the performance of different models, you can run varying prequential classifiers and look at their performance; for example,&lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;EvaluatePrequential -l NaiveBayes -s (ArffFileStream -f atrain.arff -c -1) -O amodel_bayes.moa&amp;quot;&lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;EvaluatePrequential -l HoeffdingTree -s (ArffFileStream -f atrain.arff -c -1) -O amodel_hoeffding.moa&amp;quot;&lt;br /&gt;
* To actually generate the final model, you can run a command line like the following: &lt;br /&gt;
 java -cp .:moa.jar:weka.jar -javaagent:sizeofag.jar moa.DoTask &amp;quot;LearnModel -l NaiveBayes -s (ArffFileStream -f atrain.arff -c -1) -O amodel_bayes.moa&amp;quot;&lt;br /&gt;
&lt;br /&gt;
==Generating MOA model predictions==&lt;br /&gt;
&lt;br /&gt;
==Other Resources==&lt;br /&gt;
* MOA site: http://www.cs.waikato.ac.nz/~abifet/MOA/&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/moa&amp;diff=11612</id>
		<title>Machine Learning/moa</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/moa&amp;diff=11612"/>
		<updated>2010-06-06T00:00:53Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Starting to create a moa page, to be linked in from Machine learning when ready&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Setup Instructions==&lt;br /&gt;
* Create a directory to run your moa programs from; we&#039;ll assume it is ~/moa&lt;br /&gt;
* Download the moa release .tar.gz file from http://sourceforge.net/projects/moa-datastream/ and extract it&lt;br /&gt;
** copy moa.jar into ~/moa&lt;br /&gt;
* Download the weka release .zip file from http://sourceforge.net/projects/weka/ and extract it&lt;br /&gt;
** copy weka.jar into ~/moa&lt;br /&gt;
* Download http://jroller.com/resources/m/maxim/sizeofag.jar and copy it into ~/moa&lt;br /&gt;
&lt;br /&gt;
==Training MOA models==&lt;br /&gt;
&lt;br /&gt;
==Generating MOA model predictions==&lt;br /&gt;
&lt;br /&gt;
==Other Resources==&lt;br /&gt;
* MOA site: http://www.cs.waikato.ac.nz/~abifet/MOA/&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-05-26&amp;diff=11399</id>
		<title>Machine Learning Meetup Notes: 2010-05-26</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-05-26&amp;diff=11399"/>
		<updated>2010-05-27T05:09:26Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Andy gave overview of where we&#039;re at with KDD data&lt;br /&gt;
*Mike S gave presentation:&lt;br /&gt;
**Gaussian Mixture Models&lt;br /&gt;
**k-means clustering&lt;br /&gt;
**very basic expectation-maximization&lt;br /&gt;
*Brainstorming session on how to reduce skill set column&lt;br /&gt;
**Tom tried to quantify opportunity per skills per row as high dimensional vector&lt;br /&gt;
*Brainstorming on how to reduce other data and compute new features for the KDD Dataset&lt;br /&gt;
**Tom will apply k-means clustering of skills (or steps), for data reduction&lt;br /&gt;
**Andy will compute new features: unique step/problem id, student IQ (avg. correct), step challenge/difficulty (avg correct), step complexity (# skills required)&lt;br /&gt;
**Mike will use self-organizing maps to reduce skills&lt;br /&gt;
**Paul will visualize/summarize the data, to provide understanding and insight&lt;br /&gt;
**Mike will set up an FTP server for people to transfer their enormous datasets&lt;br /&gt;
**Theo will use some Weka classifiers to produce a classification method for the data&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-05-23&amp;diff=11388</id>
		<title>Machine Learning Meetup Notes: 2010-05-23</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning_Meetup_Notes:_2010-05-23&amp;diff=11388"/>
		<updated>2010-05-27T03:52:46Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Erin, Theo and Andy came together to define KDD machine learning problem definition.&lt;br /&gt;
&lt;br /&gt;
We decided to remove (-) and add (+) the following features:&lt;br /&gt;
* Row, (only used for submission, not for ML algorithms)&lt;br /&gt;
* Anon Student Id,&lt;br /&gt;
* Problem Hierarchy,&lt;br /&gt;
* Problem Name,&lt;br /&gt;
* Problem View,&lt;br /&gt;
* - Step Name,&lt;br /&gt;
* + unique step name (step name+problem name)&lt;br /&gt;
* - Step Start Time,&lt;br /&gt;
* - First Transaction Time,&lt;br /&gt;
* - Correct Transaction Time,&lt;br /&gt;
* - Step End Time,&lt;br /&gt;
* - Step Duration (sec),&lt;br /&gt;
* - Correct Step Duration (sec),&lt;br /&gt;
* - Error Step Duration (sec),&lt;br /&gt;
*  Correct First Attempt,&lt;br /&gt;
* - Incorrects,&lt;br /&gt;
* - Hints,&lt;br /&gt;
* - Corrects,&lt;br /&gt;
* - KC(...),&lt;br /&gt;
* - Opportunity(...)&lt;br /&gt;
* + set of superskills (either boolean or opportunity value) (superskills = clustered skills)&lt;br /&gt;
* + step success chance (% of successes total for this unique stepname)&lt;br /&gt;
* + student &amp;quot;IQ&amp;quot; (% successful answers by this student)&lt;br /&gt;
* + complexity (number of skills required for this step)&lt;br /&gt;
* + frequency of skills (e.g. discretize into low/medium/high frequency -- reasoning: infrequently tested skills may be harder/easier)&lt;br /&gt;
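Two of the proposed features, step success chance and student IQ, are per-group success rates; a plain-Python sketch over (student id, unique step name, correct first attempt) triples (this field layout is an assumption, not from the notes):&lt;br /&gt;

```python
from collections import defaultdict

def derived_features(rows):
    # rows: (student_id, unique_step_name, correct_first_attempt) triples
    step_tot, step_ok = defaultdict(int), defaultdict(int)
    stu_tot, stu_ok = defaultdict(int), defaultdict(int)
    for student, step, correct in rows:
        step_tot[step] += 1
        step_ok[step] += correct
        stu_tot[student] += 1
        stu_ok[student] += correct
    # step success chance: fraction of correct first attempts per unique step
    step_chance = {s: step_ok[s] / step_tot[s] for s in step_tot}
    # student IQ: fraction of correct first attempts per student
    student_iq = {s: stu_ok[s] / stu_tot[s] for s in stu_tot}
    return step_chance, student_iq
```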
&lt;br /&gt;
== Submission datasets naming convention: ==&lt;br /&gt;
* &amp;quot;bridge&amp;quot;&lt;br /&gt;
* &amp;quot;algebra&amp;quot; -&amp;gt; also has KC rules model (need to orthogonalize them as well, TODO Erin)&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Multi-algorithm idea: ==&lt;br /&gt;
* After we discussed the features above, we had the idea to use multiple algorithms for predicting our output variable (probability of success on the first try for this student-step): one algorithm predicting student success, one for step difficulty, perhaps additional ones... then have an aggregate function learn the overall success probability.  Since we have about 3000 steps per student, we should have enough data to train a model for each student. &lt;br /&gt;
&lt;br /&gt;
* We agreed that the accepted features above generally make sense to us;  to define the set of features for the individual multi-algo problems is yet TODO.&lt;br /&gt;
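As a minimal illustration of the aggregate step: the notes propose learning this function, but even a fixed weighted average shows the shape (the weight and names are illustrative assumptions):&lt;br /&gt;

```python
def aggregate(p_student, p_step, w=0.5):
    # combine the per-student and per-step sub-model probabilities
    # into one success probability for the student-step pair
    return w * p_student + (1 - w) * p_step

p = aggregate(0.5, 1.0)  # 0.5*0.5 + 0.5*1.0 = 0.75
```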
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Andy&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/kdd_sample&amp;diff=11327</id>
		<title>Machine Learning/kdd sample</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/kdd_sample&amp;diff=11327"/>
		<updated>2010-05-23T03:08:17Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
# get a random subsample of students from the training set&lt;br /&gt;
&lt;br /&gt;
use strict;&lt;br /&gt;
use warnings;&lt;br /&gt;
&lt;br /&gt;
use Getopt::Long;&lt;br /&gt;
use File::Basename;&lt;br /&gt;
&lt;br /&gt;
my $numItems=1000;&lt;br /&gt;
my $method=&amp;quot;random&amp;quot;;&lt;br /&gt;
my $type=&amp;quot;students&amp;quot;;&lt;br /&gt;
my $help=&amp;quot;&amp;quot;;&lt;br /&gt;
&lt;br /&gt;
GetOptions (&#039;numitems=s&#039; =&amp;gt; \$numItems,&lt;br /&gt;
			&#039;method=s&#039; =&amp;gt; \$method,&lt;br /&gt;
			&#039;type=s&#039; =&amp;gt; \$type,&lt;br /&gt;
			&#039;h&#039; =&amp;gt; \$help);&lt;br /&gt;
&lt;br /&gt;
my $inputFile=shift(@ARGV);&lt;br /&gt;
if (not($inputFile)) {&lt;br /&gt;
	$help=1;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
my $progname=basename($0);&lt;br /&gt;
&lt;br /&gt;
if ($help) {&lt;br /&gt;
	print &amp;quot;This program will sample a tab-separated txt file of students.\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;It can be used to get all examples per student (for a number of students).\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;Basic usage:\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;$progname &amp;lt;input file&amp;gt;\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;Full usage:\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;$progname [-numitems &amp;lt;number of items&amp;gt;] [-method &amp;lt;&#039;random&#039;|&#039;first&#039;&amp;gt;] [-type &amp;lt;&#039;students&#039;&amp;gt;] &amp;lt;input file&amp;gt;\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;Examples:\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;$progname algebra_2008_2009_train.txt\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;  by default, will create a sample of 1000 random students (all examples on those students)\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;$progname -numitems 20000 algebra_2008_2009_train.txt\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;  create a sample of 20000 random students\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;$progname -type students -method first algebra_2008_2009_train.txt\n&amp;quot;;&lt;br /&gt;
	print &amp;quot;  create a sample of the first 1000 students\n&amp;quot;;&lt;br /&gt;
	exit(0);&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
print &amp;quot;Type &#039;$progname -h&#039; to get the help\n&amp;quot;;&lt;br /&gt;
my $directory=&amp;quot;download&amp;quot;;&lt;br /&gt;
if (not(-e $directory)) {&lt;br /&gt;
	$directory=&amp;quot;.&amp;quot;;&lt;br /&gt;
}&lt;br /&gt;
my $outputFile=&amp;quot;${inputFile}_sample_${numItems}_${method}_${type}.csv&amp;quot;;&lt;br /&gt;
print &amp;quot;Getting $numItems $method $type, putting in $outputFile\n&amp;quot;;&lt;br /&gt;
&lt;br /&gt;
# get the list of possible ids&lt;br /&gt;
my $sourceIdFile=&amp;quot;&amp;quot;;&lt;br /&gt;
my $idIndex=1;&lt;br /&gt;
my %names=();&lt;br /&gt;
my %sourceIds=();&lt;br /&gt;
my @sourceIds=();&lt;br /&gt;
if ($type eq &amp;quot;students&amp;quot;) {&lt;br /&gt;
	$sourceIdFile=&amp;quot;$directory/studentinfo.csv&amp;quot;;&lt;br /&gt;
	if (not (-e $sourceIdFile)) {&lt;br /&gt;
		open INPUT, $inputFile;&lt;br /&gt;
		open OUTPUT, &amp;quot;&amp;gt;$sourceIdFile&amp;quot;;&lt;br /&gt;
		while(defined(my $line = &amp;lt;INPUT&amp;gt;)) {&lt;br /&gt;
			chomp($line);&lt;br /&gt;
			my @values=split(&amp;quot;\t&amp;quot;,$line);&lt;br /&gt;
			my $id = $values[$idIndex];&lt;br /&gt;
			if (not(defined($sourceIds{$id}))) {&lt;br /&gt;
				print OUTPUT &amp;quot;$id\n&amp;quot;;&lt;br /&gt;
			}&lt;br /&gt;
			$sourceIds{$id} = 1;&lt;br /&gt;
		}&lt;br /&gt;
		close OUTPUT;&lt;br /&gt;
		close INPUT;&lt;br /&gt;
		@sourceIds = keys %sourceIds;&lt;br /&gt;
	} else {&lt;br /&gt;
		open INPUT, $sourceIdFile;&lt;br /&gt;
		while (defined(my $line=&amp;lt;INPUT&amp;gt;)) {&lt;br /&gt;
			chomp($line);&lt;br /&gt;
			push @sourceIds, $line;&lt;br /&gt;
		}&lt;br /&gt;
		close INPUT;&lt;br /&gt;
	}&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
# get the list of ids to pull&lt;br /&gt;
my %idsWanted=();&lt;br /&gt;
my $numFound=0;&lt;br /&gt;
while ($numFound &amp;lt; $numItems) {&lt;br /&gt;
	my $id=1;&lt;br /&gt;
	if ($method eq &amp;quot;first&amp;quot;) {&lt;br /&gt;
		$id=shift(@sourceIds);&lt;br /&gt;
	} else {&lt;br /&gt;
		my $index=int(rand(scalar(@sourceIds)));&lt;br /&gt;
		$id=$sourceIds[$index];&lt;br /&gt;
		# remove that id from the source array&lt;br /&gt;
		splice(@sourceIds,$index,1)&lt;br /&gt;
	}&lt;br /&gt;
	$idsWanted{$id}=1;&lt;br /&gt;
	$numFound++;&lt;br /&gt;
}&lt;br /&gt;
print &amp;quot;Pulling $type ids (found &amp;quot; . scalar(keys %idsWanted) . &amp;quot;):\n&amp;quot;;&lt;br /&gt;
#my @sortedIds=sort(keys(%idsWanted));&lt;br /&gt;
print &amp;quot;This could take a while...\n&amp;quot;;&lt;br /&gt;
&lt;br /&gt;
# go through the list and pull those lines&lt;br /&gt;
open INPUT, $inputFile;&lt;br /&gt;
open OUTPUT, &amp;quot;&amp;gt;$outputFile&amp;quot;;&lt;br /&gt;
# check first line for header&lt;br /&gt;
my $line=&amp;lt;INPUT&amp;gt;;&lt;br /&gt;
chomp($line);&lt;br /&gt;
if ($line =~ /Student Id/) {&lt;br /&gt;
	print OUTPUT &amp;quot;$line\n&amp;quot;;&lt;br /&gt;
} else {&lt;br /&gt;
	print &amp;quot;Er...no header...\n$line\n&amp;quot;;&lt;br /&gt;
	my @values=split(/\t/,$line);&lt;br /&gt;
	if ($idsWanted{$values[$idIndex]}) {&lt;br /&gt;
		print OUTPUT &amp;quot;$line\n&amp;quot;;&lt;br /&gt;
	}&lt;br /&gt;
}&lt;br /&gt;
my $lineNum=1;&lt;br /&gt;
# now go through the rest of the lines&lt;br /&gt;
while (defined(my $line=&amp;lt;INPUT&amp;gt;)) {&lt;br /&gt;
	chomp($line);&lt;br /&gt;
	my @values=split(/\t/,$line);&lt;br /&gt;
	if ($idsWanted{$values[$idIndex]}) {&lt;br /&gt;
		print OUTPUT &amp;quot;$line\n&amp;quot;;&lt;br /&gt;
	}&lt;br /&gt;
	if ($lineNum % 100000 == 0) {&lt;br /&gt;
		my $percent=100 * $lineNum/8918055;&lt;br /&gt;
		print &amp;quot;...line $lineNum ($percent %): &amp;quot; . $values[1] . &amp;quot;\n&amp;quot;;&lt;br /&gt;
	}&lt;br /&gt;
	$lineNum++;&lt;br /&gt;
}&lt;br /&gt;
close OUTPUT;&lt;br /&gt;
close INPUT;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Do the same for the test file&lt;br /&gt;
my $test_input_file = $inputFile;&lt;br /&gt;
$test_input_file =~ s/train/test/;&lt;br /&gt;
my $output_test_file = &amp;quot;${test_input_file}_sample_${numItems}_${method}_${type}.csv&amp;quot;;&lt;br /&gt;
open INPUT, $test_input_file;&lt;br /&gt;
open OUTPUT, &amp;quot;&amp;gt;$output_test_file&amp;quot;;&lt;br /&gt;
# check first line for header&lt;br /&gt;
$line=&amp;lt;INPUT&amp;gt;;&lt;br /&gt;
chomp($line);&lt;br /&gt;
if ($line =~ /Student Id/) {&lt;br /&gt;
	print OUTPUT &amp;quot;$line\n&amp;quot;;&lt;br /&gt;
} else {&lt;br /&gt;
	print &amp;quot;Er...no header...\n$line\n&amp;quot;;&lt;br /&gt;
	my @values=split(/\t/,$line);&lt;br /&gt;
	if ($idsWanted{$values[$idIndex]}) {&lt;br /&gt;
		print OUTPUT &amp;quot;$line\n&amp;quot;;&lt;br /&gt;
	}&lt;br /&gt;
}&lt;br /&gt;
$lineNum=1;&lt;br /&gt;
# now go through the rest of the lines&lt;br /&gt;
while (defined(my $line=&amp;lt;INPUT&amp;gt;)) {&lt;br /&gt;
	chomp($line);&lt;br /&gt;
	my @values=split(/\t/,$line);&lt;br /&gt;
	if ($idsWanted{$values[$idIndex]}) {&lt;br /&gt;
		print OUTPUT &amp;quot;$line\n&amp;quot;;&lt;br /&gt;
	}&lt;br /&gt;
	if ($lineNum % 100000 == 0) {&lt;br /&gt;
		my $percent=100 * $lineNum/508913;&lt;br /&gt;
		print &amp;quot;...line $lineNum ($percent %): &amp;quot; . $values[1] . &amp;quot;\n&amp;quot;;&lt;br /&gt;
	}&lt;br /&gt;
	$lineNum++;&lt;br /&gt;
}&lt;br /&gt;
close OUTPUT;&lt;br /&gt;
close INPUT;&lt;br /&gt;
&lt;br /&gt;
exit(0);&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=KDD_Competition_2010&amp;diff=11320</id>
		<title>KDD Competition 2010</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=KDD_Competition_2010&amp;diff=11320"/>
		<updated>2010-05-22T05:46:48Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We&#039;re interested in working on the KDD Competition as a way to focus our machine learning exploration -- and maybe even find some interesting aspects of the data!  If you&#039;re interested, drop us a note, show up at a weekly Machine Learning meeting, and we&#039;ll use this space to keep track of our ideas.&lt;br /&gt;
&lt;br /&gt;
==Resources==&lt;br /&gt;
* [https://pslcdatashop.web.cmu.edu/KDDCup/rules_data_format.jsp KDD Rules and Data Format]&lt;br /&gt;
* [http://cran.r-project.org/ R language]&lt;br /&gt;
* [http://www.csie.ntu.edu.tw/~cjlin/libsvm/ libsvm]&lt;br /&gt;
* [http://www.cs.waikato.ac.nz/ml/weka/ Weka]&lt;br /&gt;
* [http://www.kdnuggets.com/datasets/competitions.html List of other competitions in which we could engage]&lt;br /&gt;
* [[Machine Learning/Hadoop | Hadoop]]&lt;br /&gt;
* [http://lucene.apache.org/mahout/ Mahout -- machine learning libraries for Hadoop]&lt;br /&gt;
* [http://hadoop.apache.org/pig/ Pig language]&lt;br /&gt;
* [http://hadoop.apache.org/pig/docs/r0.3.0/piglatin.html Pig Latin Manual]&lt;br /&gt;
* [http://www.cloudera.com/ Cloudera -- see videos for Hadoop intro]&lt;br /&gt;
* [http://github.com/voberoi/hadoop-mrutils Vikram&#039;s awesome Hadoop/EC2 scripts]&lt;br /&gt;
* [https://www.noisebridge.net/mailman/listinfo/ml Our mailing list]&lt;br /&gt;
&lt;br /&gt;
==TODOs==&lt;br /&gt;
&lt;br /&gt;
* Vikram -- will help set up Hadoop for the rest of us &amp;amp; create a guide for Mahout setup&lt;br /&gt;
* Thomas -- will get libsvm working on the data and put together a &amp;quot;how to&amp;quot; guide for doing so&lt;br /&gt;
** put together a [[Machine_Learning/kdd_sample | perl script]] which will take random samples from the data, for working on smaller instances&lt;br /&gt;
** put together a [[Machine_Learning/kdd_r | simple R script]] for loading the data&lt;br /&gt;
* Andy -- will get Weka working on the data and put together a &amp;quot;how to&amp;quot; guide for doing so&lt;br /&gt;
* Erin -- will put the meeting notes of 5/19 on https://www.noisebridge.net/wiki/Machine_Learning; will work on data transformations and ways to create better representations of the data; will provide the orthogonalized data sets&lt;br /&gt;
&lt;br /&gt;
* We will need to make sure we don&#039;t get disqualified for people belonging to multiple teams! Do not sign up anybody else for the competition without asking first.&lt;br /&gt;
&lt;br /&gt;
== Notes ==&lt;br /&gt;
* For KDD submission: zip the submission file on OS X from the command line; zipping from the Finder adds a __MACOSX entry that the submission checker will complain about.  E.g.:  zip asdf.zip algebra_2008_2009_submission.txt&lt;br /&gt;
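If you&#039;d rather script the zipping step, Python&#039;s standard zipfile module also produces an archive containing only the file you add, with no __MACOSX metadata entries (a sketch; the file names are just the examples from above):&lt;br /&gt;

```python
import zipfile

# Stand-in submission file so the sketch is self-contained; in practice
# this file comes out of your model.
with open("algebra_2008_2009_submission.txt", "w") as f:
    f.write("Row\tCorrect First Attempt\n1\t0\n")

# Create a zip containing only the submission file itself: unlike
# zipping from the Finder, no __MACOSX entries are added.
with zipfile.ZipFile("asdf.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("algebra_2008_2009_submission.txt")

# The archive holds exactly one entry.
with zipfile.ZipFile("asdf.zip") as zf:
    print(zf.namelist())
```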
&lt;br /&gt;
&lt;br /&gt;
== Ideas == &lt;br /&gt;
* Add new features by computing their values from existing columns -- e.g. correlation between skills based on their co-occurrence within problems. We could use a decision tree to define boundaries for a new feature such as &amp;quot;good student, medium student, bad student&amp;quot;&lt;br /&gt;
* Dimensionality reduction -- transform into numerical values appropriate for consumption by SVM&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Who we are ==&lt;br /&gt;
* Andy; Machine Learning&lt;br /&gt;
* Thomas; Statistics&lt;br /&gt;
* Erin; Maths&lt;br /&gt;
* Vikram; Hadoop&lt;br /&gt;
(insert your name/contact info/expertise here)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== How to run Weka (quick &#039;n dirty tutorial) == &lt;br /&gt;
* Download and install Weka&lt;br /&gt;
* Get your KDD data&lt;br /&gt;
* Preprocess your data: the following command takes the first 1000 lines of the given training data set and converts them into a .csv file&lt;br /&gt;
* Attention: in the last sed command you need to replace the long run of whitespace with a literal tab.  In the OS X Terminal, you type that by pressing CONTROL+V and then Tab. (Copying and pasting the command below won&#039;t work, since the tab is pasted as spaces)&lt;br /&gt;
* head -n 1000 algebra_2006_2007_train.txt | sed -e &#039;s/[&amp;quot;,]/ /g&#039; | sed &#039;s/       /,/g&#039; &amp;gt; algebra_2006_2007_train_1kFormatted.csv&lt;br /&gt;
* The screencasts linked below walk you through these steps&lt;br /&gt;
* In Weka&#039;s Explorer, remove some unwanted attributes (I leave this up to your judgment) and inspect the dataset&lt;br /&gt;
* Then you can run an ML algorithm over it, e.g. a neural network to predict student performance&lt;br /&gt;
* [http://swarmfinancial.com/screencasts/nb/kddWekaUsage1.swf Screencast1]&lt;br /&gt;
* [http://swarmfinancial.com/screencasts/nb/kddWekaUsage2.swf Screencast2]&lt;br /&gt;
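If fighting with the literal tab in sed gets tiresome, the same preprocessing can be sketched in Python (a hedged stand-in for the sed pipeline above; the demo file name is hypothetical -- point it at algebra_2006_2007_train.txt for the real thing):&lt;br /&gt;

```python
import csv
import re

def preprocess(in_path, out_path, n_lines=1000):
    """Take the first n_lines of a tab-separated file, blank out quotes
    and commas inside fields (mirroring the sed commands above), and
    write the result as a comma-separated .csv file."""
    with open(in_path) as fin, open(out_path, "w", newline="") as fout:
        writer = csv.writer(fout)
        for i, line in enumerate(fin):
            if i >= n_lines:
                break
            fields = [re.sub(r'[",]', " ", f)
                      for f in line.rstrip("\n").split("\t")]
            writer.writerow(fields)

# Tiny stand-in input so the sketch is self-contained.
with open("train_demo.txt", "w") as f:
    f.write('Row\tStep Name\n1\t"FOIL, step 1"\n')

preprocess("train_demo.txt", "train_demo.csv")
print(open("train_demo.csv").read())
```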
&lt;br /&gt;
== How to run SVM ==&lt;br /&gt;
* See the notes at [[Machine Learning/SVM]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/SVM&amp;diff=11319</id>
		<title>Machine Learning/SVM</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/SVM&amp;diff=11319"/>
		<updated>2010-05-22T05:46:31Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Running SVM on the Data */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Downloading and Installing LibSVM ==&lt;br /&gt;
* To run LibSVM, you will want Python.  If you don&#039;t have it installed, you can download and install it from [http://www.python.org/download/].  Either of the current stable versions should work.&lt;br /&gt;
&lt;br /&gt;
* You will also need to download and install LibSVM itself, which you can do by downloading the zip file from [http://www.csie.ntu.edu.tw/~cjlin/cgi-bin/libsvm.cgi?+http://www.csie.ntu.edu.tw/~cjlin/libsvm+zip].  Windows users should copy the libsvm.dll file from the /windows directory into their C:\Windows\System32 directory.  Mac/Linux users should be able to simply cd into the libsvm directory and run&lt;br /&gt;
 make&lt;br /&gt;
then add the libsvm directory to your PATH.&lt;br /&gt;
&lt;br /&gt;
* If you can, install [http://www.gnuplot.info/ gnuplot].  If you have trouble with this (as I did), you can use the modified files with the gnuplot dependency stripped out.  Simply replace tools/easy.py with [[Machine Learning/easy.py]] and tools/grid.py with [[Machine Learning/grid.py]].&lt;br /&gt;
&lt;br /&gt;
== Converting the Data ==&lt;br /&gt;
As with most (if not all) data problems, choosing and formatting the data is the most time-consuming step but also one of the most important.&lt;br /&gt;
&lt;br /&gt;
One approach to reducing the data is to take a subset: choose a random subset of the students and keep only the lines that belong to them.  Thomas&#039; perl script [[Machine_Learning/kdd_sample | sample_training.pl]] does this for both the training and test sets; run it like so:&lt;br /&gt;
 perl sample_training.pl -numitems 100 ~/kdd/algebra_2008_2009_train.txt&lt;br /&gt;
(assuming your data is located in ~/kdd)&lt;br /&gt;
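The idea of the sampling script can be sketched in Python as well (an assumption-laden sketch, not the script itself: the student-id column index and the tab-separated layout are taken from the KDD format described above, and the function name is hypothetical):&lt;br /&gt;

```python
import random

def sample_by_student(in_path, out_path, num_students, id_col=1, seed=0):
    """Keep only the rows of a tab-separated file whose student id falls
    in a random subset of num_students students; the header row is kept."""
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    with open(in_path) as f:
        header = f.readline()
        rows = [line.split("\t") for line in f]
    students = sorted({r[id_col] for r in rows})
    wanted = set(rng.sample(students, min(num_students, len(students))))
    with open(out_path, "w") as out:
        out.write(header)
        for r in rows:
            if r[id_col] in wanted:
                out.write("\t".join(r))

# Tiny stand-in data file (the real input would be
# algebra_2008_2009_train.txt).
with open("demo_train.txt", "w") as f:
    f.write("Row\tAnon Student Id\tCorrect First Attempt\n")
    f.write("1\ts1\t0\n2\ts2\t1\n3\ts1\t1\n4\ts3\t0\n")

sample_by_student("demo_train.txt", "demo_sample.txt", num_students=2)
```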
&lt;br /&gt;
For SVM, ultimately we need to format the data in two files: a training file and a test file.  Each of these will have a numeric class and several numeric predictors.  The general format is as follows:&lt;br /&gt;
 &amp;amp;lt;class&amp;amp;gt; 1:&amp;amp;lt;value&amp;amp;gt; 2:&amp;amp;lt;value&amp;amp;gt; 3:&amp;amp;lt;value&amp;amp;gt; ...&lt;br /&gt;
with an entry (1:, 2:, 3:,...) for each numeric predictor.  For example,&lt;br /&gt;
 0 1:0 2:0 3:0 4:0 5:0 6:1 7:0 8:0 9:0 10:0 11:0 12:0 13:0 14:0 15:0 16:0 17:0 18:0&lt;br /&gt;
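A minimal sketch of emitting one such line from a list of numeric predictor values (note that feature indices in this format count from 1; the helper name is just illustrative):&lt;br /&gt;

```python
def libsvm_line(label, values):
    """Format a numeric label and a list of numeric predictors as one
    line of libsvm-style input; feature indices start at 1."""
    feats = " ".join("%d:%g" % (i + 1, v) for i, v in enumerate(values))
    return "%g %s" % (label, feats)

# One row of 18 binary flags with only the 6th flag set, reproducing
# the example line above.
values = [0] * 18
values[5] = 1
print(libsvm_line(0, values))
```

LibSVM also accepts a sparse variant of this format in which zero-valued features are simply omitted, which keeps the files much smaller when most flags are 0.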
&lt;br /&gt;
Thomas created a [[Machine Learning/convert_features.pl | perl script]] to take a training set and convert it (and the corresponding test set) into the correct format, using &amp;quot;correct on first attempt&amp;quot; as the output class and converting the student and problem ids into a series of binary flag variables (one per student and per problem, indicating whether the row refers to that student or that problem).  However, this results in a fairly obscene number of predictor variables, even on a stripped-down dataset, so there is almost certainly a better way.  But if you don&#039;t have one, you can download this script and run&lt;br /&gt;
 perl convert_features.pl ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv&lt;br /&gt;
&lt;br /&gt;
Assuming your data files are in ~/kdd, this will generate output files ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv_converted.txt and ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv_converted.t in the appropriate format.&lt;br /&gt;
&lt;br /&gt;
== Running SVM on the Data ==&lt;br /&gt;
* cd into your libsvm installation&#039;s tools directory and run the following command (assuming your training and test files are in ~/kdd and named appropriately):&lt;br /&gt;
 python easy.py ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv_converted.txt ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv_converted.t | tee output.txt&lt;br /&gt;
&lt;br /&gt;
* If you have many predictor variables, this will take a long time.  Prohibitively long, probably.&lt;br /&gt;
&lt;br /&gt;
* This will automatically scale your training and test data and iteratively search over the parameter space for penalty parameter c and kernel parameters, using cross-validation in order to find the best fit for the training data.&lt;br /&gt;
&lt;br /&gt;
* It will generate an output for each item in the test set with a 0/1 classification.  Apparently LibSVM can also output real values between 0 and 1 (probability estimates, depending on confidence), but we haven&#039;t yet investigated doing this.&lt;br /&gt;
&lt;br /&gt;
== Other references ==&lt;br /&gt;
* [http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf The basic guide] for LibSVM&lt;br /&gt;
* [http://www.csie.ntu.edu.tw/~cjlin/libsvm/ LibSVM site]&lt;br /&gt;
* [[Machine_Learning_Meetup_Notes:_2010-04-28 | Noisebridge ML talk on SVMs]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/easy.py&amp;diff=11318</id>
		<title>Machine Learning/easy.py</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/easy.py&amp;diff=11318"/>
		<updated>2010-05-22T05:46:07Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;&amp;lt;pre&amp;gt; #!/usr/bin/env python  import sys import os from subprocess import *  if len(sys.argv) &amp;lt;= 1: 	print(&amp;#039;Usage: %s training_file [testing_file]&amp;#039; % sys.argv[0]) 	raise SystemExi…&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/usr/bin/env python&lt;br /&gt;
&lt;br /&gt;
import sys&lt;br /&gt;
import os&lt;br /&gt;
from subprocess import *&lt;br /&gt;
&lt;br /&gt;
if len(sys.argv) &amp;lt;= 1:&lt;br /&gt;
	print(&#039;Usage: %s training_file [testing_file]&#039; % sys.argv[0])&lt;br /&gt;
	raise SystemExit&lt;br /&gt;
&lt;br /&gt;
# svm, grid, and gnuplot executable files&lt;br /&gt;
&lt;br /&gt;
is_win32 = (sys.platform == &#039;win32&#039;)&lt;br /&gt;
if not is_win32:&lt;br /&gt;
	svmscale_exe = &amp;quot;../svm-scale&amp;quot;&lt;br /&gt;
	svmtrain_exe = &amp;quot;../svm-train&amp;quot;&lt;br /&gt;
	svmpredict_exe = &amp;quot;../svm-predict&amp;quot;&lt;br /&gt;
	grid_py = &amp;quot;./grid.py&amp;quot;&lt;br /&gt;
else:&lt;br /&gt;
        # example for windows&lt;br /&gt;
	svmscale_exe = r&amp;quot;..\windows\svm-scale.exe&amp;quot;&lt;br /&gt;
	svmtrain_exe = r&amp;quot;..\windows\svm-train.exe&amp;quot;&lt;br /&gt;
	svmpredict_exe = r&amp;quot;..\windows\svm-predict.exe&amp;quot;&lt;br /&gt;
	grid_py = r&amp;quot;.\grid.py&amp;quot;&lt;br /&gt;
&lt;br /&gt;
assert os.path.exists(svmscale_exe),&amp;quot;svm-scale executable not found&amp;quot;&lt;br /&gt;
assert os.path.exists(svmtrain_exe),&amp;quot;svm-train executable not found&amp;quot;&lt;br /&gt;
assert os.path.exists(svmpredict_exe),&amp;quot;svm-predict executable not found&amp;quot;&lt;br /&gt;
assert os.path.exists(grid_py),&amp;quot;grid.py not found&amp;quot;&lt;br /&gt;
&lt;br /&gt;
train_pathname = sys.argv[1]&lt;br /&gt;
assert os.path.exists(train_pathname),&amp;quot;training file not found&amp;quot;&lt;br /&gt;
file_name = os.path.split(train_pathname)[1]&lt;br /&gt;
scaled_file = file_name + &amp;quot;.scale&amp;quot;&lt;br /&gt;
model_file = file_name + &amp;quot;.model&amp;quot;&lt;br /&gt;
range_file = file_name + &amp;quot;.range&amp;quot;&lt;br /&gt;
&lt;br /&gt;
if len(sys.argv) &amp;gt; 2:&lt;br /&gt;
	test_pathname = sys.argv[2]&lt;br /&gt;
	file_name = os.path.split(test_pathname)[1]&lt;br /&gt;
	assert os.path.exists(test_pathname),&amp;quot;testing file not found&amp;quot;&lt;br /&gt;
	scaled_test_file = file_name + &amp;quot;.scale&amp;quot;&lt;br /&gt;
	predict_test_file = file_name + &amp;quot;.predict&amp;quot;&lt;br /&gt;
&lt;br /&gt;
cmd = &#039;%s -s &amp;quot;%s&amp;quot; &amp;quot;%s&amp;quot; &amp;gt; &amp;quot;%s&amp;quot;&#039; % (svmscale_exe, range_file, train_pathname, scaled_file)&lt;br /&gt;
print(&#039;Scaling training data...&#039;)&lt;br /&gt;
Popen(cmd, shell = True, stdout = PIPE).communicate()	&lt;br /&gt;
&lt;br /&gt;
cmd = &#039;%s -svmtrain &amp;quot;%s&amp;quot; &amp;quot;%s&amp;quot;&#039; % (grid_py, svmtrain_exe, scaled_file)&lt;br /&gt;
print(&#039;Cross validation...&#039;)&lt;br /&gt;
f = Popen(cmd, shell = True, stdout = PIPE).stdout&lt;br /&gt;
&lt;br /&gt;
line = &#039;&#039;&lt;br /&gt;
while True:&lt;br /&gt;
	last_line = line&lt;br /&gt;
	line = f.readline()&lt;br /&gt;
	if not line: break&lt;br /&gt;
c,g,rate = map(float,last_line.split())&lt;br /&gt;
&lt;br /&gt;
print(&#039;Best c=%s, g=%s CV rate=%s&#039; % (c,g,rate))&lt;br /&gt;
&lt;br /&gt;
cmd = &#039;%s -c %s -g %s &amp;quot;%s&amp;quot; &amp;quot;%s&amp;quot;&#039; % (svmtrain_exe,c,g,scaled_file,model_file)&lt;br /&gt;
print(&#039;Training...&#039;)&lt;br /&gt;
Popen(cmd, shell = True, stdout = PIPE).communicate()&lt;br /&gt;
&lt;br /&gt;
print(&#039;Output model: %s&#039; % model_file)&lt;br /&gt;
if len(sys.argv) &amp;gt; 2:&lt;br /&gt;
	cmd = &#039;%s -r &amp;quot;%s&amp;quot; &amp;quot;%s&amp;quot; &amp;gt; &amp;quot;%s&amp;quot;&#039; % (svmscale_exe, range_file, test_pathname, scaled_test_file)&lt;br /&gt;
	print(&#039;Scaling testing data...&#039;)&lt;br /&gt;
	Popen(cmd, shell = True, stdout = PIPE).communicate()	&lt;br /&gt;
&lt;br /&gt;
	cmd = &#039;%s &amp;quot;%s&amp;quot; &amp;quot;%s&amp;quot; &amp;quot;%s&amp;quot;&#039; % (svmpredict_exe, scaled_test_file, model_file, predict_test_file)&lt;br /&gt;
	print(&#039;Testing...&#039;)&lt;br /&gt;
	Popen(cmd, shell = True).communicate()	&lt;br /&gt;
&lt;br /&gt;
	print(&#039;Output prediction: %s&#039; % predict_test_file)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/grid.py&amp;diff=11317</id>
		<title>Machine Learning/grid.py</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/grid.py&amp;diff=11317"/>
		<updated>2010-05-22T05:45:19Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;&amp;lt;pre&amp;gt; #!/usr/bin/env python    import os, sys, traceback import getpass from threading import Thread from subprocess import *  if(sys.hexversion &amp;lt; 0x03000000): 	import Queue else…&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/usr/bin/env python&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
import os, sys, traceback&lt;br /&gt;
import getpass&lt;br /&gt;
from threading import Thread&lt;br /&gt;
from subprocess import *&lt;br /&gt;
&lt;br /&gt;
if(sys.hexversion &amp;lt; 0x03000000):&lt;br /&gt;
	import Queue&lt;br /&gt;
else:&lt;br /&gt;
	import queue as Queue&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# svmtrain and gnuplot executable&lt;br /&gt;
&lt;br /&gt;
is_win32 = (sys.platform == &#039;win32&#039;)&lt;br /&gt;
if not is_win32:&lt;br /&gt;
       svmtrain_exe = &amp;quot;../svm-train&amp;quot;&lt;br /&gt;
       gnuplot_exe = &amp;quot;/usr/bin/gnuplot&amp;quot;&lt;br /&gt;
else:&lt;br /&gt;
       # example for windows&lt;br /&gt;
       svmtrain_exe = r&amp;quot;..\windows\svm-train.exe&amp;quot;&lt;br /&gt;
       gnuplot_exe = r&amp;quot;c:\tmp\gnuplot\bin\pgnuplot.exe&amp;quot;&lt;br /&gt;
&lt;br /&gt;
# global parameters and their default values&lt;br /&gt;
&lt;br /&gt;
fold = 5&lt;br /&gt;
c_begin, c_end, c_step = -5,  15, 2&lt;br /&gt;
g_begin, g_end, g_step =  3, -15, -2&lt;br /&gt;
global dataset_pathname, dataset_title, pass_through_string&lt;br /&gt;
global out_filename, png_filename&lt;br /&gt;
&lt;br /&gt;
# experimental&lt;br /&gt;
&lt;br /&gt;
telnet_workers = []&lt;br /&gt;
ssh_workers = []&lt;br /&gt;
nr_local_worker = 1&lt;br /&gt;
&lt;br /&gt;
# process command line options, set global parameters&lt;br /&gt;
def process_options(argv=sys.argv):&lt;br /&gt;
&lt;br /&gt;
    global fold&lt;br /&gt;
    global c_begin, c_end, c_step&lt;br /&gt;
    global g_begin, g_end, g_step&lt;br /&gt;
    global dataset_pathname, dataset_title, pass_through_string&lt;br /&gt;
    global svmtrain_exe, gnuplot_exe, gnuplot, out_filename, png_filename&lt;br /&gt;
    &lt;br /&gt;
    usage = &amp;quot;&amp;quot;&amp;quot;\&lt;br /&gt;
Usage: grid.py [-log2c begin,end,step] [-log2g begin,end,step] [-v fold] &lt;br /&gt;
[-svmtrain pathname] [-gnuplot pathname] [-out pathname] [-png pathname]&lt;br /&gt;
[additional parameters for svm-train] dataset&amp;quot;&amp;quot;&amp;quot;&lt;br /&gt;
&lt;br /&gt;
    if len(argv) &amp;lt; 2:&lt;br /&gt;
        print(usage)&lt;br /&gt;
        sys.exit(1)&lt;br /&gt;
&lt;br /&gt;
    dataset_pathname = argv[-1]&lt;br /&gt;
    dataset_title = os.path.split(dataset_pathname)[1]&lt;br /&gt;
    out_filename = &#039;%s.out&#039; % dataset_title&lt;br /&gt;
    png_filename = &#039;%s.png&#039; % dataset_title&lt;br /&gt;
    pass_through_options = []&lt;br /&gt;
&lt;br /&gt;
    i = 1&lt;br /&gt;
    while i &amp;lt; len(argv) - 1:&lt;br /&gt;
        if argv[i] == &amp;quot;-log2c&amp;quot;:&lt;br /&gt;
            i = i + 1&lt;br /&gt;
            (c_begin,c_end,c_step) = map(float,argv[i].split(&amp;quot;,&amp;quot;))&lt;br /&gt;
        elif argv[i] == &amp;quot;-log2g&amp;quot;:&lt;br /&gt;
            i = i + 1&lt;br /&gt;
            (g_begin,g_end,g_step) = map(float,argv[i].split(&amp;quot;,&amp;quot;))&lt;br /&gt;
        elif argv[i] == &amp;quot;-v&amp;quot;:&lt;br /&gt;
            i = i + 1&lt;br /&gt;
            fold = argv[i]&lt;br /&gt;
        elif argv[i] in (&#039;-c&#039;,&#039;-g&#039;):&lt;br /&gt;
            print(&amp;quot;Option -c and -g are renamed.&amp;quot;)&lt;br /&gt;
            print(usage)&lt;br /&gt;
            sys.exit(1)&lt;br /&gt;
        elif argv[i] == &#039;-svmtrain&#039;:&lt;br /&gt;
            i = i + 1&lt;br /&gt;
            svmtrain_exe = argv[i]&lt;br /&gt;
        elif argv[i] == &#039;-gnuplot&#039;:&lt;br /&gt;
            i = i + 1&lt;br /&gt;
            gnuplot_exe = argv[i]&lt;br /&gt;
        elif argv[i] == &#039;-out&#039;:&lt;br /&gt;
            i = i + 1&lt;br /&gt;
            out_filename = argv[i]&lt;br /&gt;
        elif argv[i] == &#039;-png&#039;:&lt;br /&gt;
            i = i + 1&lt;br /&gt;
            png_filename = argv[i]&lt;br /&gt;
        else:&lt;br /&gt;
            pass_through_options.append(argv[i])&lt;br /&gt;
        i = i + 1&lt;br /&gt;
&lt;br /&gt;
    pass_through_string = &amp;quot; &amp;quot;.join(pass_through_options)&lt;br /&gt;
    assert os.path.exists(svmtrain_exe),&amp;quot;svm-train executable not found&amp;quot; &lt;br /&gt;
    assert os.path.exists(dataset_pathname),&amp;quot;dataset not found&amp;quot;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
def range_f(begin,end,step):&lt;br /&gt;
    # like range, but works on non-integer too&lt;br /&gt;
    seq = []&lt;br /&gt;
    while True:&lt;br /&gt;
        if step &amp;gt; 0 and begin &amp;gt; end: break&lt;br /&gt;
        if step &amp;lt; 0 and begin &amp;lt; end: break&lt;br /&gt;
        seq.append(begin)&lt;br /&gt;
        begin = begin + step&lt;br /&gt;
    return seq&lt;br /&gt;
&lt;br /&gt;
def permute_sequence(seq):&lt;br /&gt;
    n = len(seq)&lt;br /&gt;
    if n &amp;lt;= 1: return seq&lt;br /&gt;
&lt;br /&gt;
    mid = int(n/2)&lt;br /&gt;
    left = permute_sequence(seq[:mid])&lt;br /&gt;
    right = permute_sequence(seq[mid+1:])&lt;br /&gt;
&lt;br /&gt;
    ret = [seq[mid]]&lt;br /&gt;
    while left or right:&lt;br /&gt;
        if left: ret.append(left.pop(0))&lt;br /&gt;
        if right: ret.append(right.pop(0))&lt;br /&gt;
&lt;br /&gt;
    return ret&lt;br /&gt;
&lt;br /&gt;
def redraw(db,best_param,tofile=False):&lt;br /&gt;
    if len(db) == 0: return&lt;br /&gt;
    begin_level = round(max(x[2] for x in db)) - 3&lt;br /&gt;
    step_size = 0.5&lt;br /&gt;
&lt;br /&gt;
    best_log2c,best_log2g,best_rate = best_param&lt;br /&gt;
&lt;br /&gt;
    &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    &lt;br /&gt;
    db.sort(key = lambda x:(x[0], -x[1]))&lt;br /&gt;
&lt;br /&gt;
    prevc = db[0][0]&lt;br /&gt;
    for line in db:&lt;br /&gt;
        if prevc != line[0]:&lt;br /&gt;
            prevc = line[0]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
def calculate_jobs():&lt;br /&gt;
    c_seq = permute_sequence(range_f(c_begin,c_end,c_step))&lt;br /&gt;
    g_seq = permute_sequence(range_f(g_begin,g_end,g_step))&lt;br /&gt;
    nr_c = float(len(c_seq))&lt;br /&gt;
    nr_g = float(len(g_seq))&lt;br /&gt;
    i = 0&lt;br /&gt;
    j = 0&lt;br /&gt;
    jobs = []&lt;br /&gt;
&lt;br /&gt;
    while i &amp;lt; nr_c or j &amp;lt; nr_g:&lt;br /&gt;
        if i/nr_c &amp;lt; j/nr_g:&lt;br /&gt;
            # increase C resolution&lt;br /&gt;
            line = []&lt;br /&gt;
            for k in range(0,j):&lt;br /&gt;
                line.append((c_seq[i],g_seq[k]))&lt;br /&gt;
            i = i + 1&lt;br /&gt;
            jobs.append(line)&lt;br /&gt;
        else:&lt;br /&gt;
            # increase g resolution&lt;br /&gt;
            line = []&lt;br /&gt;
            for k in range(0,i):&lt;br /&gt;
                line.append((c_seq[k],g_seq[j]))&lt;br /&gt;
            j = j + 1&lt;br /&gt;
            jobs.append(line)&lt;br /&gt;
    return jobs&lt;br /&gt;
&lt;br /&gt;
class WorkerStopToken:  # used to notify the worker to stop&lt;br /&gt;
        pass&lt;br /&gt;
&lt;br /&gt;
class Worker(Thread):&lt;br /&gt;
    def __init__(self,name,job_queue,result_queue):&lt;br /&gt;
        Thread.__init__(self)&lt;br /&gt;
        self.name = name&lt;br /&gt;
        self.job_queue = job_queue&lt;br /&gt;
        self.result_queue = result_queue&lt;br /&gt;
    def run(self):&lt;br /&gt;
        while True:&lt;br /&gt;
            (cexp,gexp) = self.job_queue.get()&lt;br /&gt;
            if cexp is WorkerStopToken:&lt;br /&gt;
                self.job_queue.put((cexp,gexp))&lt;br /&gt;
                # print &#039;worker %s stop.&#039; % self.name&lt;br /&gt;
                break&lt;br /&gt;
            try:&lt;br /&gt;
                rate = self.run_one(2.0**cexp,2.0**gexp)&lt;br /&gt;
                if rate is None: raise RuntimeError(&amp;quot;get no rate&amp;quot;)&lt;br /&gt;
            except:&lt;br /&gt;
                # we failed, let others do that and we just quit&lt;br /&gt;
            &lt;br /&gt;
                traceback.print_exception(sys.exc_info()[0], sys.exc_info()[1], sys.exc_info()[2])&lt;br /&gt;
                &lt;br /&gt;
                self.job_queue.put((cexp,gexp))&lt;br /&gt;
                print(&#039;worker %s quit.&#039; % self.name)&lt;br /&gt;
                break&lt;br /&gt;
            else:&lt;br /&gt;
                self.result_queue.put((self.name,cexp,gexp,rate))&lt;br /&gt;
&lt;br /&gt;
class LocalWorker(Worker):&lt;br /&gt;
    def run_one(self,c,g):&lt;br /&gt;
        cmdline = &#039;%s -c %s -g %s -v %s %s %s&#039; % \&lt;br /&gt;
          (svmtrain_exe,c,g,fold,pass_through_string,dataset_pathname)&lt;br /&gt;
        result = Popen(cmdline,shell=True,stdout=PIPE).stdout&lt;br /&gt;
        for line in result.readlines():&lt;br /&gt;
            if str(line).find(&amp;quot;Cross&amp;quot;) != -1:&lt;br /&gt;
                return float(line.split()[-1][0:-1])&lt;br /&gt;
&lt;br /&gt;
class SSHWorker(Worker):&lt;br /&gt;
    def __init__(self,name,job_queue,result_queue,host):&lt;br /&gt;
        Worker.__init__(self,name,job_queue,result_queue)&lt;br /&gt;
        self.host = host&lt;br /&gt;
        self.cwd = os.getcwd()&lt;br /&gt;
    def run_one(self,c,g):&lt;br /&gt;
        cmdline = &#039;ssh -x %s &amp;quot;cd %s; %s -c %s -g %s -v %s %s %s&amp;quot;&#039; % \&lt;br /&gt;
          (self.host,self.cwd,&lt;br /&gt;
           svmtrain_exe,c,g,fold,pass_through_string,dataset_pathname)&lt;br /&gt;
        result = Popen(cmdline,shell=True,stdout=PIPE).stdout&lt;br /&gt;
        for line in result.readlines():&lt;br /&gt;
            if str(line).find(&amp;quot;Cross&amp;quot;) != -1:&lt;br /&gt;
                return float(line.split()[-1][0:-1])&lt;br /&gt;
&lt;br /&gt;
class TelnetWorker(Worker):&lt;br /&gt;
    def __init__(self,name,job_queue,result_queue,host,username,password):&lt;br /&gt;
        Worker.__init__(self,name,job_queue,result_queue)&lt;br /&gt;
        self.host = host&lt;br /&gt;
        self.username = username&lt;br /&gt;
        self.password = password        &lt;br /&gt;
    def run(self):&lt;br /&gt;
        import telnetlib&lt;br /&gt;
        self.tn = tn = telnetlib.Telnet(self.host)&lt;br /&gt;
        tn.read_until(&amp;quot;login: &amp;quot;)&lt;br /&gt;
        tn.write(self.username + &amp;quot;\n&amp;quot;)&lt;br /&gt;
        tn.read_until(&amp;quot;Password: &amp;quot;)&lt;br /&gt;
        tn.write(self.password + &amp;quot;\n&amp;quot;)&lt;br /&gt;
&lt;br /&gt;
        # XXX: how to know whether login is successful?&lt;br /&gt;
        tn.read_until(self.username)&lt;br /&gt;
        # &lt;br /&gt;
        print(&#039;login ok&#039;, self.host)&lt;br /&gt;
        tn.write(&amp;quot;cd &amp;quot;+os.getcwd()+&amp;quot;\n&amp;quot;)&lt;br /&gt;
        Worker.run(self)&lt;br /&gt;
        tn.write(&amp;quot;exit\n&amp;quot;)               &lt;br /&gt;
    def run_one(self,c,g):&lt;br /&gt;
        cmdline = &#039;%s -c %s -g %s -v %s %s %s&#039; % \&lt;br /&gt;
          (svmtrain_exe,c,g,fold,pass_through_string,dataset_pathname)&lt;br /&gt;
        result = self.tn.write(cmdline+&#039;\n&#039;)&lt;br /&gt;
        (idx,matchm,output) = self.tn.expect([&#039;Cross.*\n&#039;])&lt;br /&gt;
        for line in output.split(&#039;\n&#039;):&lt;br /&gt;
            if str(line).find(&amp;quot;Cross&amp;quot;) != -1:&lt;br /&gt;
                return float(line.split()[-1][0:-1])&lt;br /&gt;
&lt;br /&gt;
def main():&lt;br /&gt;
&lt;br /&gt;
    # set parameters&lt;br /&gt;
&lt;br /&gt;
    process_options()&lt;br /&gt;
&lt;br /&gt;
    # put jobs in queue&lt;br /&gt;
&lt;br /&gt;
    jobs = calculate_jobs()&lt;br /&gt;
    job_queue = Queue.Queue(0)&lt;br /&gt;
    result_queue = Queue.Queue(0)&lt;br /&gt;
&lt;br /&gt;
    for line in jobs:&lt;br /&gt;
        for (c,g) in line:&lt;br /&gt;
            job_queue.put((c,g))&lt;br /&gt;
&lt;br /&gt;
    job_queue._put = job_queue.queue.appendleft&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    # fire telnet workers&lt;br /&gt;
&lt;br /&gt;
    if telnet_workers:&lt;br /&gt;
        nr_telnet_worker = len(telnet_workers)&lt;br /&gt;
        username = getpass.getuser()&lt;br /&gt;
        password = getpass.getpass()&lt;br /&gt;
        for host in telnet_workers:&lt;br /&gt;
            TelnetWorker(host,job_queue,result_queue,&lt;br /&gt;
                     host,username,password).start()&lt;br /&gt;
&lt;br /&gt;
    # fire ssh workers&lt;br /&gt;
&lt;br /&gt;
    if ssh_workers:&lt;br /&gt;
        for host in ssh_workers:&lt;br /&gt;
            SSHWorker(host,job_queue,result_queue,host).start()&lt;br /&gt;
&lt;br /&gt;
    # fire local workers&lt;br /&gt;
&lt;br /&gt;
    for i in range(nr_local_worker):&lt;br /&gt;
        LocalWorker(&#039;local&#039;,job_queue,result_queue).start()&lt;br /&gt;
&lt;br /&gt;
    # gather results&lt;br /&gt;
&lt;br /&gt;
    done_jobs = {}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    result_file = open(out_filename, &#039;w&#039;)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
    db = []&lt;br /&gt;
    best_rate = -1&lt;br /&gt;
    best_c1,best_g1 = None,None&lt;br /&gt;
&lt;br /&gt;
    for line in jobs:&lt;br /&gt;
        for (c,g) in line:&lt;br /&gt;
            while (c, g) not in done_jobs:&lt;br /&gt;
                (worker,c1,g1,rate) = result_queue.get()&lt;br /&gt;
                done_jobs[(c1,g1)] = rate&lt;br /&gt;
                result_file.write(&#039;%s %s %s\n&#039; %(c1,g1,rate))&lt;br /&gt;
                result_file.flush()&lt;br /&gt;
                if (rate &amp;gt; best_rate) or (rate==best_rate and g1==best_g1 and c1&amp;lt;best_c1):&lt;br /&gt;
                    best_rate = rate&lt;br /&gt;
                    best_c1,best_g1=c1,g1&lt;br /&gt;
                    best_c = 2.0**c1&lt;br /&gt;
                    best_g = 2.0**g1&lt;br /&gt;
                print(&amp;quot;[%s] %s %s %s (best c=%s, g=%s, rate=%s)&amp;quot; % \&lt;br /&gt;
		    (worker,c1,g1,rate, best_c, best_g, best_rate))&lt;br /&gt;
            db.append((c,g,done_jobs[(c,g)]))&lt;br /&gt;
        redraw(db,[best_c1, best_g1, best_rate])&lt;br /&gt;
        redraw(db,[best_c1, best_g1, best_rate],True)&lt;br /&gt;
&lt;br /&gt;
    job_queue.put((WorkerStopToken,None))&lt;br /&gt;
    print(&amp;quot;%s %s %s&amp;quot; % (best_c, best_g, best_rate))&lt;br /&gt;
main()&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/SVM&amp;diff=11316</id>
		<title>Machine Learning/SVM</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/SVM&amp;diff=11316"/>
		<updated>2010-05-22T05:44:14Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: Created page with &amp;#039;== Downloading and Installing LibSVM == * To run LibSVM, you will want Python.  If you don&amp;#039;t have it installed, you can download and install it from [http://www.python.org/downlo…&amp;#039;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Downloading and Installing LibSVM ==&lt;br /&gt;
* To run LibSVM, you will want Python.  If you don&#039;t have it installed, you can download and install it from [http://www.python.org/download/].  Either of the current stable versions should work.&lt;br /&gt;
&lt;br /&gt;
* You will also need to download and install LibSVM itself, which you can do by downloading the zip file from [http://www.csie.ntu.edu.tw/~cjlin/cgi-bin/libsvm.cgi?+http://www.csie.ntu.edu.tw/~cjlin/libsvm+zip].  Windows users should copy the libsvm.dll file from the /windows directory into their C:\Windows\System32 directory.  Mac/Linux users should be able to simply cd into the libsvm directory and run&lt;br /&gt;
 make&lt;br /&gt;
then add the libsvm directory to your PATH.&lt;br /&gt;
&lt;br /&gt;
* If you can, install [http://www.gnuplot.info/ gnuplot].  If you have trouble with this (as I did), you can use the modified files with the gnuplot dependency stripped out.  Simply replace tools/easy.py with [[Machine Learning/easy.py]] and tools/grid.py with [[Machine Learning/grid.py]].&lt;br /&gt;
&lt;br /&gt;
== Converting the Data ==&lt;br /&gt;
As with most (if not all) data problems, choosing and formatting the data is the most time-consuming step but also one of the most important.&lt;br /&gt;
&lt;br /&gt;
One approach for reducing the data is to take a subset: you can use Thomas&#039; perl script to sample the training and test sets by choosing a random subset of the students and including only the lines that involve them.  You can do this with the perl script [[Machine_Learning/kdd_sample | sample_training.pl]], by running:&lt;br /&gt;
 perl sample_training.pl -numitems 100 ~/kdd/algebra_2008_2009_train.txt&lt;br /&gt;
(assuming your data is located in ~/kdd)&lt;br /&gt;
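The sampling idea can be sketched in Python as well.  This is a hypothetical illustration, not the original sample_training.pl: the function name, the tab delimiter, and the student id sitting in column 1 are all assumptions.

```python
import random

# Hypothetical sketch of the sampling idea behind sample_training.pl:
# keep only rows whose student id falls in a random subset of students.
# Assumes tab-separated rows with the student id in column 1 (0-indexed).
def sample_students(rows, num_students, seed=0):
    random.seed(seed)  # fixed seed so the sample is reproducible
    students = sorted({row.split("\t")[1] for row in rows})
    keep = set(random.sample(students, min(num_students, len(students))))
    return [row for row in rows if row.split("\t")[1] in keep]
```

The set of student ids is sorted before sampling so that, with a fixed seed, the same subset comes back on every run.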
&lt;br /&gt;
For SVM, ultimately we need to format the data in two files: a training file and a test file.  Each of these will have a numeric class and several numeric predictors.  The general format is as follows:&lt;br /&gt;
 &amp;amp;lt;class&amp;amp;gt; 1:&amp;amp;lt;value&amp;amp;gt; 2:&amp;amp;lt;value&amp;amp;gt; 3:&amp;amp;lt;value&amp;amp;gt; ...&lt;br /&gt;
with an entry (1:, 2:, 3:,...) for each numeric predictor.  For example,&lt;br /&gt;
 0 1:0 2:0 3:0 4:0 5:0 6:1 7:0 8:0 9:0 10:0 11:0 12:0 13:0 14:0 15:0 16:0 17:0 18:0&lt;br /&gt;
&lt;br /&gt;
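As an illustrative sketch of producing this format (the helper name here is an assumption, not part of LibSVM or the conversion script):

```python
# Hypothetical helper: format one labeled feature vector as a
# libsvm-style line, "<class> 1:<value> 2:<value> ...".
# Feature indices in this format are 1-based.
def to_libsvm_line(label, features):
    pairs = " ".join(f"{i}:{v}" for i, v in enumerate(features, start=1))
    return f"{label} {pairs}"

line = to_libsvm_line(0, [0, 0, 1, 0])
# line == "0 1:0 2:0 3:1 4:0"
```

Note that libsvm's format is sparse, so zero-valued entries may also simply be omitted; the example above writes them out explicitly, matching the sample line in the text.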
Thomas created a [[Machine Learning/convert_features.pl | perl script]] to take a training set and convert it (and the corresponding test set) into the correct format by using &amp;quot;correct on first attempt&amp;quot; as the output class and converting student and problem id into a series of binary flag variables (one for each student and problem, indicating whether the observation involves that student or problem).  However, this results in a fairly obscene number of predictor variables, even on a stripped-down dataset.  So there is almost certainly a better way.  But if you don&#039;t have one, you can download this script and run&lt;br /&gt;
 perl convert_features.pl ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv&lt;br /&gt;
&lt;br /&gt;
Assuming your data files are in ~/kdd, this will generate the output files ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv_converted.txt and ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv_converted.t in the appropriate format.&lt;br /&gt;
&lt;br /&gt;
== Running SVM on the Data ==&lt;br /&gt;
* cd into your libsvm installation&#039;s tools directory and run the following command (assuming your training and test files are in ~/kdd and named appropriately):&lt;br /&gt;
 python easy.py ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv_converted.txt ~/kdd/algebra_2008_2009_train.txt_sample_100_random_students.csv_converted.t&lt;br /&gt;
&lt;br /&gt;
* If you have many predictor variables, this will take a long time.  Prohibitively long, probably.&lt;br /&gt;
&lt;br /&gt;
* This will automatically scale your training and test data and iteratively search over the parameter space for penalty parameter c and kernel parameters, using cross-validation in order to find the best fit for the training data.&lt;br /&gt;
&lt;br /&gt;
* It will generate an output for each item in the test set with a 0/1 classification.  LibSVM can also output probability estimates between 0 and 1 (training and predicting with the -b 1 option), but we haven&#039;t yet investigated doing this.&lt;br /&gt;
&lt;br /&gt;
== Other references ==&lt;br /&gt;
* [http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf The basic guide] for LibSVM&lt;br /&gt;
* [http://www.csie.ntu.edu.tw/~cjlin/libsvm/ LibSVM site]&lt;br /&gt;
* [[Machine_Learning_Meetup_Notes:_2010-04-28 | Noisebridge ML talk on SVMs]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11315</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11315"/>
		<updated>2010-05-22T05:42:15Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 5/26/2010 @ 8:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Work on the KDD data set (further work using Hadoop, Neural Networks from Mike)&lt;br /&gt;
*Presenter: Group/Mike&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression (Mike S volunteered to teach)&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**Clustering/PCA&lt;br /&gt;
**k-Means Clustering&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: gaussian distribution, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Possible Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[Online Optimization &amp;amp; Machine Learning Toolkit]]&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning/Hadoop&amp;diff=11254</id>
		<title>Machine Learning/Hadoop</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning/Hadoop&amp;diff=11254"/>
		<updated>2010-05-20T04:14:06Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;===About===&lt;br /&gt;
* Google had so much data that merely &#039;&#039;reading&#039;&#039; it from disk took a long time, let alone processing it&lt;br /&gt;
** So they needed to parallelize everything, even disk access&lt;br /&gt;
** Make the processing local to where the data is, to avoid network issues&lt;br /&gt;
* Parallelization is hard/error-prone&lt;br /&gt;
** Want to have a &amp;quot;shared-nothing&amp;quot; architecture&lt;br /&gt;
** Functional programming&lt;br /&gt;
* Map&lt;br /&gt;
Runs the function on each item in the list and returns the list of results&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def map(func, list):&lt;br /&gt;
  return [func(item) for item in list]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def twice(num):&lt;br /&gt;
  return num*2&lt;br /&gt;
map(twice, [1, 2, 3])  # returns [2, 4, 6]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* Reduce&lt;br /&gt;
Takes a function (which takes two arguments) and a list, and iterates through the list, accumulating a single result&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def reduce(func, list):&lt;br /&gt;
  a = func(list[0], list[1])&lt;br /&gt;
  for item in list[2:]:&lt;br /&gt;
    a = func(a, item)&lt;br /&gt;
  return a&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
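The map and reduce sketches above can be exercised as plain Python.  This is a local illustration only (renamed my_map/my_reduce to avoid shadowing Python's builtins), not the Hadoop API:

```python
# Self-contained versions of the map/reduce sketches above.
def my_map(func, items):
    return [func(item) for item in items]

def my_reduce(func, items):
    acc = func(items[0], items[1])  # seed with the first pair
    for item in items[2:]:
        acc = func(acc, item)       # fold the rest in, one at a time
    return acc

doubled = my_map(lambda n: n * 2, [1, 2, 3])          # [2, 4, 6]
total = my_reduce(lambda a, b: a + b, [1, 2, 3, 4])   # 10
```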
&lt;br /&gt;
===Examples/Actual===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def map(key,value):&lt;br /&gt;
  # process&lt;br /&gt;
  emit(another_key, another_value)&lt;br /&gt;
def reduce(key, values):&lt;br /&gt;
  # process the key and all values associated with it&lt;br /&gt;
  emit(something)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Average&lt;br /&gt;
** keys are line numbers, values are what&#039;s in it&lt;br /&gt;
** file:&lt;br /&gt;
*** 1  (1,2)&lt;br /&gt;
*** 4  (2,4)&lt;br /&gt;
*** 5  (3,5)&lt;br /&gt;
*** 6  (4,6)&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
def map(key,value):&lt;br /&gt;
  emit(&amp;quot;exist&amp;quot;,1)&lt;br /&gt;
  emit(&amp;quot;x&amp;quot;,value)&lt;br /&gt;
def reduce(key, values):&lt;br /&gt;
  # sum all values for this key; the average is then x / exist&lt;br /&gt;
  emit(key, sum(values))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
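To see how the average falls out, here is a minimal local simulation of that job, assuming the reducer sums values per key.  This runs in one process (no Hadoop); the phase names and the sample values [2, 4, 5, 6] are illustrative:

```python
from collections import defaultdict

# map phase: emit ("exist", 1) and ("x", value) for each record
def map_phase(records):
    for value in records:
        yield ("exist", 1)
        yield ("x", value)

# reduce phase: group the emitted pairs by key, then sum each group
def reduce_phase(pairs):
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: sum(values) for key, values in grouped.items()}

totals = reduce_phase(map_phase([2, 4, 5, 6]))
average = totals["x"] / totals["exist"]  # 17 / 4 = 4.25
```

The "exist" key counts the records and the "x" key totals their values, so a single division after the reduce gives the average.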
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Tutorials===&lt;br /&gt;
* http://www.cloudera.com/videos/introduction_to_pig&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Tools===&lt;br /&gt;
* Hadoop&lt;br /&gt;
* Hive&lt;br /&gt;
* Pig: A high-level language for compiling down to MapReduce programs&lt;br /&gt;
* MapReduce on Amazon (?)&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
	<entry>
		<id>https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11221</id>
		<title>Machine Learning</title>
		<link rel="alternate" type="text/html" href="https://replica.wiki.extremist.software/index.php?title=Machine_Learning&amp;diff=11221"/>
		<updated>2010-05-18T16:17:17Z</updated>

		<summary type="html">&lt;p&gt;ThomasLotze: /* Presentations and other Materials */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Next Meeting===&lt;br /&gt;
&lt;br /&gt;
*When: Wednesday, 5/19/2010 @ 8:00pm&lt;br /&gt;
*Where: 2169 Mission St. (back corner classroom)&lt;br /&gt;
*Topic: Hadoop&lt;br /&gt;
*Presenter: Vikram&lt;br /&gt;
&lt;br /&gt;
=== Mailing List ===&lt;br /&gt;
&lt;br /&gt;
https://www.noisebridge.net/mailman/listinfo/ml&lt;br /&gt;
&lt;br /&gt;
=== Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[KDD Competition 2010]]&lt;br /&gt;
&lt;br /&gt;
=== Topics to Learn and Teach ===&lt;br /&gt;
&lt;br /&gt;
*Supervised Learning&lt;br /&gt;
**Linear Regression (Mike S volunteered to teach)&lt;br /&gt;
**Linear Discriminants&lt;br /&gt;
**Neural Nets/Radial Basis Functions&lt;br /&gt;
**Support Vector Machines&lt;br /&gt;
**Classifier Combination [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part6.pdf]&lt;br /&gt;
**A basic decision tree builder, recursive and using entropy metrics&lt;br /&gt;
&lt;br /&gt;
*Unsupervised Learning&lt;br /&gt;
**Clustering/PCA&lt;br /&gt;
**k-Means Clustering&lt;br /&gt;
**Graphical Modeling&lt;br /&gt;
**Generative Models: gaussian distribution, multinomial distributions, HMMs, Naive Bayes&lt;br /&gt;
&lt;br /&gt;
*Reinforcement Learning&lt;br /&gt;
**Temporal Difference Learning&lt;br /&gt;
&lt;br /&gt;
*Math, Probability &amp;amp; Statistics&lt;br /&gt;
**Metric spaces and what they mean&lt;br /&gt;
**Fundamentals of probabilities&lt;br /&gt;
**Decision Theory (Bayesian)&lt;br /&gt;
**Maximum Likelihood&lt;br /&gt;
**Bias/Variance Tradeoff, VC Dimension&lt;br /&gt;
**Bagging, Bootstrap, Jackknife [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part3.pdf]&lt;br /&gt;
**Information Theory: Entropy, Mutual Information, Gaussian Channels&lt;br /&gt;
**Estimation of Misclassification [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part5.pdf]&lt;br /&gt;
**No-Free Lunch Theorem [http://www.cedar.buffalo.edu/~srihari/CSE555/Chap9.Part1.pdf]&lt;br /&gt;
&lt;br /&gt;
*Machine Learning SDKs&lt;br /&gt;
** [http://opencv.willowgarage.com/documentation/index.html OpenCV] ML component (SVM, trees, etc)&lt;br /&gt;
**[http://lucene.apache.org/mahout/ Mahout], a Hadoop-based ML package.&lt;br /&gt;
**[http://www.cs.waikato.ac.nz/ml/weka/ Weka], a collection of data mining tools and machine learning algorithms.&lt;br /&gt;
&lt;br /&gt;
*Applications&lt;br /&gt;
** Collective Intelligence &amp;amp; Recommendation Engines&lt;br /&gt;
&lt;br /&gt;
=== Possible Projects ===&lt;br /&gt;
&lt;br /&gt;
*[[Online Optimization &amp;amp; Machine Learning Toolkit]]&lt;br /&gt;
&lt;br /&gt;
=== Presentations and other Materials ===&lt;br /&gt;
&lt;br /&gt;
* [[Awesome Machine Learning Applications]] -- A list of cool applications of ML&lt;br /&gt;
* [[Hands-on Machine Learning]], a presentation [[User:jbm|jbm]] gave on 2009-01-07.&lt;br /&gt;
* [http://www.youtube.com/user/StanfordUniversity#g/c/A89DCFA6ADACE599 Stanford Machine Learning online course videos]&lt;br /&gt;
* [[Media:Brief_statistics_slides.pdf]], a presentation given on statistics for the machine learning group&lt;br /&gt;
* [http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&amp;amp;discussionID=20096092&amp;amp;gid=77616&amp;amp;trk=EML_anet_qa_ttle-0Nt79xs2RVr6JBpnsJt7dBpSBA LinkedIn] discussion on good resources for data mining and predictive analytics&lt;br /&gt;
&lt;br /&gt;
=== Notes from Meetings ===&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-12]] -- Group workshop on KDD data set&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-05-05]] -- A Brief Tour of Statistics&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-28]] -- SVMs&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-21]] -- Linear Regression&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2010-04-14]] -- (re)Starting new Machine Learning Meetup!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-04-01]] -- Finally moving on up: fully-connected backpropagation networks.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-25]] -- We made perceptrons - added sigmoid, etc.&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-18]] -- We made perceptrons - linear function support!&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2009-03-11]] -- We made perceptrons!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-03-04 -- Josh gave a presentation on SVMs&lt;br /&gt;
&lt;br /&gt;
(time is missing!)&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-11 -- Josh gave a presentation on clustering, donuts!&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-02-04 -- Free-form hang out night, punch and pie&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-28 -- Praveen talked about Neural networks&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-21 -- Jean gave a quick overview of machine learning stuff&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-14 -- Ian gave a talk on k-Nearest Neighbor&lt;br /&gt;
&lt;br /&gt;
Machine Learning Meetup Notes: 2009-01-07 -- Josh did a quick intro to ML presentation&lt;br /&gt;
&lt;br /&gt;
[[Machine Learning Meetup Notes: 2008-12-17]]&lt;/div&gt;</summary>
		<author><name>ThomasLotze</name></author>
	</entry>
</feed>