Learning Log
Day-to-day reflections as I work my way through the curriculum. Find related IPython notebooks here.
- Mon, 7/25—Relating the Binomial to the sum of two Poisson random variables, and working on the automatic data science project
- Fri, 7/15—Back in the saddle with probability hw, initial steps on the automatic data science project
- Mon, 6/27—Samples as Random Variables, Sample Mean and Variance, and Automatic Data Science
- Thu, 6/23—Getting started with artificial neural nets and image classification, side project ideas
- Wed, 6/22—Understanding Expectation, Moments, and Variance with help from the transformation of random variables
- Mon, 6/20—scikit-learn Pipeline gotchas, k-fold cross-validation, hyperparameter tuning, and improving my score on Kaggle's Forest Cover Type competition
- Tue, 6/14—First attempt at Kaggle's Forest Cover Type competition, learning how slow SVMs can be
- Fri, 6/10—Dimensionality reduction with Principal Components Analysis
- Thu, 6/9—A notebook on inverse transform sampling
- Wed, 6/8—Working with multiple random variables: conditionals, marginals, transformations, and IID samples
- Mon, 6/6—Transforming random variables, joint and marginal distributions, and the Rule of the Lazy Statistician
- Thu, 6/2—Feature selection via an L1 regularization penalty, greedily removing the least impactful features, and random forests
- Wed, 6/1—Second attempt at Kaggle's Titanic data set, accuracy up to 78%, notes on preprocessing and Pandas
- Tue, 5/31—First attempt at Kaggle's starter competition classifying Titanic passenger survivorship
- Thu, 5/26—K-nearest neighbors, getting started with ch04 (data preprocessing) and Kaggle's Titanic data set
- Wed, 5/25—Discrete and continuous random variable review, and down the math rabbit hole
- Tue, 5/24—Random Variables again, regularization to combat high variance, and a tour of some classifiers in scikit-learn (SVMs, Decision Trees)
- Fri, 5/20—A conference-like day: high-level understanding of MCMC and connecting with a study buddy
- Thu, 5/19—Implementing logistic regression by swapping a new cost function into my previous single-layer neural network implementation
- Wed, 5/18—Probability Density Function hw problem and more math monk vids on conditional probability and independence