CSCI4155/CSCS6505 (2015)

== Machine Learning 2015 ==
 
=== Instructors ===
 
Prof: Dr. Thomas Trappenberg (tt@cs.dal.ca)
  
 
Office: Room 4216 in Mona Campbell Building  
  
TA: Paul Hollensen (paulhollensen@gmail.com)
  
Office hours: By appointment (write email)
 
 
=== Course Description ===
This course is an introduction to machine learning, including its practical use and theoretical foundations. We will start by applying some pre-programmed algorithms in Matlab to get practical experience before unpacking some of the theory behind them. The course includes introductory reviews of the Matlab scientific programming language and of some mathematical concepts that we need for the discussions of the theory. The latter include the formalism for describing uncertainty (e.g. basic concepts of probability theory), representations of large data sets (e.g. vector and matrix formalism), and mathematical concepts behind some algorithms (e.g. gradients and optimization techniques).
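To give a first flavour of the "gradients and optimization techniques" mentioned above, here is a minimal Matlab sketch of plain gradient descent on a toy quadratic objective. The objective, step size, and variable names are purely illustrative and not part of the course material.

<pre>
% Illustration only: gradient descent on a toy quadratic objective
% f(w) = ||w - wstar||^2, which has its minimum at wstar.
wstar = [2; -1];                    % location of the minimum (made up for this example)
f     = @(w) sum((w - wstar).^2);   % objective function
gradf = @(w) 2 * (w - wstar);       % gradient of the objective
w     = zeros(2, 1);                % initial guess
eta   = 0.1;                        % learning rate (step size)
for step = 1:100
    w = w - eta * gradf(w);         % gradient descent update
end
disp(w)                             % should be close to wstar
disp(f(w))                          % should be close to 0
</pre>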
  
 
=== Course Textbook ===
  
There are many good textbooks on machine learning, some of which are listed below. We will not follow a single textbook, but I will provide lecture notes for most of the material.

Example textbooks for further studies:

Kevin Murphy; Machine Learning: A Probabilistic Perspective; MIT Press

Ethem Alpaydin; Introduction to Machine Learning; MIT Press

Trevor Hastie, Robert Tibshirani, and Jerome Friedman; The Elements of Statistical Learning: Data Mining, Inference, and Prediction; Springer

=== Assignments ===
  
Assignments are posted in the schedule below. Late assignments are not accepted.

=== Grading Scheme ===
  
Quizzes 50%, Assignments 50%
There will be additional or different assignments and/or quiz questions for the grad course.  
Some of the assignments may be group work, but you have to pass all individual components in order to pass the course.
  
=== List of toolboxes that might be useful ===
 
 
* Parallel Computing Toolbox
* Symbolic Math Toolbox
* Statistics and Machine Learning Toolbox
* Curve Fitting Toolbox
* Optimization Toolbox
* Neural Network Toolbox
* Signal Processing Toolbox
* Image Processing Toolbox
* Computer Vision System Toolbox
* Image Acquisition Toolbox
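If you are unsure which of these toolboxes your Matlab installation provides, the following sketch is one way to check. The output depends on your installation and license; the license feature name used below is the MathWorks name for the Statistics and Machine Learning Toolbox.

<pre>
% List all installed MathWorks products and toolboxes with their versions.
ver

% Programmatically test whether a particular toolbox is licensed, e.g. the
% Statistics and Machine Learning Toolbox ('Statistics_Toolbox' is its
% license feature name).
if license('test', 'Statistics_Toolbox')
    disp('Statistics and Machine Learning Toolbox is available.')
else
    disp('Statistics and Machine Learning Toolbox is NOT available.')
end
</pre>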
  
=== Schedule ===

This schedule is tentative and will likely change frequently.
Please check it frequently, as it provides links to relevant resources and assignments.
  
 
{| class="wikitable"
! Date !! Content !! Reference !! Assignment
|-
| Sept 10 || Intro ||  ||
|-
| Sept 15 || Overview || http://www.cs.ubc.ca/~murphyk/MLbook/pml-intro-22may12.pdf ||
|-
| Sept 17 || Matlab || http://www.mathworks.com/academia/student_center/tutorials/launchpad.html || [[media:A115.pdf|A1]]
|-
| Sept 22 || Supervised learning project 1: SVM || http://www.mathworks.com/help/stats/support-vector-machines-svm.html#bsr5b6n <br/> http://www.mathworks.com/help/stats/support-vector-machines-svm.html#bss0s6_-1 || [[media:A215.pdf|A2]] [[media:data1.mat|data1]]
|-
| Sept 24 || Basic math (matrices & gradients) || [[Media:math.pdf|math.pdf]] ||
|-
| Sept 29 || Supervised learning project 2: MLP || [[Media:MLP.pdf|MLP.pdf]] || [[media:A315.pdf|A3]]
|-
| Oct 1 || MLP (continued) || example program (online, component-wise): [[Media:mlpANDonlineComponent.m|mlpANDonlineComponent.m]] ||
|-
| Oct 6 || MLP (continued) || example program (batch, matrix): [[Media:mlpXORbatch.m|mlpXORbatch.m]] ||
|-
| Oct 8 || Classification 1 (SVM) || [[Media:SVM1.pdf|SVM1.pdf]] ||
|-
| Oct 13 || Deep Learning 1 || http://www.robots.ox.ac.uk/~vgg/practicals/cnn/index.html || [[media:A415.pdf|A4]]
|-
| Oct 15 || Deep Learning 2 ||  || Quiz 2 (MLP/SVM)
|-
| Oct 20 || Deep Learning 3 ||  ||
|-
| Oct 22 || Deep Learning 4 ||  ||
|-
| Oct 27 || Statistics || [[Media:Prob1.pdf|Prob1.pdf]] ||
|-
| Oct 29 || Stochastic modeling (regression and Maximum Likelihood) || [[Media:Regression1.pdf|Regression1.pdf]] ||
|-
| Nov 3 || Graphical models || [[Media:graphical1.pdf|graphical1.pdf]] https://code.google.com/p/bnt/ || [[media:A515.pdf|A5]]
|-
| Nov 5 || MDP || [[Media:MDPslides.pdf|MDPslides.pdf]] || Quiz 3 (Deep learning, CNN, Stochastic modeling)
|-
| Nov 10 || POMDP / TD(lambda) || [[Media:RL.pdf|RL.pdf]] || A6: Exercises 1-4 (+5 for 6505) from RL.pdf (Due Nov 20 by email with subject line A6)
|-
| Nov 12 || Study Day ||  ||
|-
| Nov 17 || Generative models, Discriminant Analysis, Naive Bayes || [[Media:Classification2.pdf|Classification2.pdf]] ||
|-
| Nov 19 || Unsupervised Learning (k-means, EM algorithm) || [[Media:Unsupervised1.pdf|Unsupervised1.pdf]] [[Media:ExpectationMaximization.m|ExpectationMaximization.m]] || Quiz 4 (RL)
|-
| Nov 24 || Unsupervised Learning (Dimensionality reduction, t-SNE, representational learning, DAE, RBM) || [[Media:Unsupervised2.pdf|Unsupervised2.pdf]] [[Media:DimensionalityReduction.pdf|DimensionalityReduction.pdf]] [[Media:RBMExample.zip|RBMExample.zip]] || [[Media:20newsgroups.zip|20newsgroups.zip]] [[media:A715.pdf|A7]]
|-
| Nov 26 ||  ||  ||
|-
| Dec 1 || Summary ||  ||
|-
| Dec 3 ||  ||  || Quiz 5
|}
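As a warm-up for the supervised learning projects above, here is a minimal Matlab sketch of training an SVM classifier with the Statistics and Machine Learning Toolbox function fitcsvm (available in recent Matlab releases; see the MathWorks links in the schedule). It uses the built-in fisheriris demo data rather than the course data set data1.mat, whose variable names are not assumed here, and it is an illustration only, not an assignment solution.

<pre>
% Illustration only: train and evaluate a linear SVM on two iris classes.
% Requires the Statistics and Machine Learning Toolbox (fitcsvm, predict).
load fisheriris                      % built-in demo data: meas (features), species (labels)
X = meas(51:end, 1:2);               % two features, rows 51-150 (two classes only)
y = species(51:end);                 % 'versicolor' vs 'virginica'

mdl  = fitcsvm(X, y, 'KernelFunction', 'linear');  % train a linear SVM
yhat = predict(mdl, X);                            % predictions on the training data

trainError = mean(~strcmp(yhat, y)); % fraction of misclassified training points
fprintf('Training error: %.2f\n', trainError);
</pre>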

== Academic Integrity & Plagiarism ==

(Based on the sample statement provided at http://academicintegrity.dal.ca. Written by Dr. Alex Brodsky.)

Please familiarize yourself with the university policy on Intellectual Honesty. Every suspected case will be reported.

At Dalhousie University, we respect the values of academic integrity: honesty, trust, fairness, responsibility and respect. As a student, adherence to the values of academic integrity and related policies is a requirement of being part of the academic community at Dalhousie University.

=== What does academic integrity mean? ===

Academic integrity means being honest in the fulfillment of your academic responsibilities, thus establishing mutual trust. Fairness is essential to the interactions of the academic community and is achieved through respect for the opinions and ideas of others. Violations of intellectual honesty are offensive to the entire academic community, not just to the individual faculty member and students in whose class an offence occurs. (See the Intellectual Honesty section of the University Calendar.)


=== How can you achieve academic integrity? ===

• Make sure you understand Dalhousie's policies on academic integrity.

• Give appropriate credit to the sources used in your assignments, such as written or oral work, computer codes/programs, artistic or architectural works, scientific projects, performances, web page designs, graphical representations, diagrams, videos, and images. Use RefWorks to keep track of your research and edit and format bibliographies in the citation style required by the instructor (http://www.library.dal.ca/How/RefWorks).

• Do not download the work of another from the Internet and submit it as your own.

• Do not submit work that has been completed through collaboration or previously submitted for another assignment without permission from your instructor.

• Do not write an examination or test for someone else.

• Do not falsify data or lab results.

These examples should be considered only as a guide and not an exhaustive list.


=== What will happen if an allegation of an academic offence is made against you? ===

I am required to report a suspected offence. The full process is outlined in the Discipline flow chart, which can be found at http://academicintegrity.dal.ca/Files/AcademicDisciplineProcess.pdf and includes the following:

1. Each Faculty has an Academic Integrity Officer (AIO) who receives allegations from instructors.

2. The AIO decides whether to proceed with the allegation and you will be notified of the process.

3. If the case proceeds, you will receive an INC (incomplete) grade until the matter is resolved.

4. If you are found guilty of an academic offence, a penalty will be assigned ranging from a warning to a suspension or expulsion from the University and can include a notation on your transcript, failure of the assignment or failure of the course. All penalties are academic in nature.


=== Where can you turn for help? ===

• If you are ever unsure about ANYTHING, contact me.

• The Academic Integrity website (http://academicintegrity.dal.ca) has links to policies, definitions, online tutorials, and tips on citing and paraphrasing.

• The Writing Centre provides assistance with proofreading, writing styles, and citations.

• Dalhousie Libraries have workshops, online tutorials, citation guides, the Assignment Calculator, RefWorks, etc.

• The Dalhousie Student Advocacy Service assists students with academic appeals and student discipline procedures.

• The Senate Office provides links to a list of Academic Integrity Officers, discipline flow chart, and Senate Discipline Committee.

== Request for special accommodation ==

Students may request accommodation as a result of barriers related to disability, religious obligation, or any characteristic under the Nova Scotia Human Rights Act. Students who require academic accommodation for either classroom participation or the writing of tests and exams should make their request to the Advising and Access Services Center (AASC) prior to or at the outset of the regular academic year. Please visit www.dal.ca/access for more information and to obtain the Request for Accommodation – Form A.

A note taker may be required as part of a student’s accommodation. There is an honorarium of $75/course/term (with some exceptions). If you are interested, please contact AASC at 494-2836 for more information.

Please note that your classroom may contain specialized accessible furniture and equipment. It is important that these items remain in the classroom, untouched, so that students who require their usage will be able to participate in the class.