Learning Log
Day-to-day reflections as I work my way through the curriculum. Find related IPython notebooks here.

Tue, 5/17—HN coverage of my sigmoid function notebook, working on a logistic regression implementation

Mon, 5/16—A notebook exploring why a sigmoid function is used in logistic regression and conditional probability HW

Thu, 5/12—More probability HW, making sense of the odds ratio underlying logistic regression

Wed, 5/11—Collaborative filtering via matrix factorization, more probability homework, scikit-learn version of the perceptron

Tue, 5/10—High-level understanding of Bayesian nonparametric models, more stats, and back to Python Machine Learning with stochastic gradient descent.

Mon, 5/9—Grokking probability fundamentals paying off, stats problem set progress

Thu, 5/5—Measure theory as it relates to cumulative distribution functions (CDFs), working on problem sets.

Wed, 5/4—Probability through the lens of measure theory

Tue, 5/3—Finally grokking random variables and going back to review the probability and stats curriculum.

Mon, 5/2—MathJax in posts and resolving confusion about random variables vs. probability functions

Fri, 4/29—Probability defined (and figuring out how to host IPython notebooks)

Thu, 4/28—Probability intro and simulating the birthday problem

Wed, 4/27—Statistical sampling, Designing studies

Wed, 3/30—Feature scaling to improve performance of gradient descent.

Tue, 3/29—Python ML book ch2 and improving on the perceptron using gradient descent.

Thu, 3/24—Getting familiar with pandas and plotting a scatter matrix of NBA player stats.

Wed, 3/23—Two variable exploratory data analysis (scatter plots)

Mon, 3/21—Basic exploratory data analysis and Simpson's Paradox

Fri, 3/18—Ch2 of Python ML book: implementing the perceptron algorithm.

Thu, 3/17—Getting into Python Machine Learning Book