Certification Included · Lifetime access · 3 hours of content

**Description**

- Access 31 lectures & 3 hours of content 24/7
- Code your own logistic regression module in Python
- Complete a course project that predicts user actions on a website given user data
- Use Deep Learning for facial expression recognition
- Understand how to make data-driven decisions

The Lazy Programmer is a data scientist, big data engineer, and full stack software engineer. For his master's thesis he worked on brain-computer interfaces using machine learning. These assist non-verbal and non-mobile persons to communicate with their family and caregivers.

He has worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. He has created new big data pipelines using Hadoop/Pig/MapReduce, and created machine learning models to predict click-through rate, news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering and validated the results using A/B testing.

He has taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics for students attending universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from his web programming expertise. He does all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies he has used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases he has used MySQL, Postgres, Redis, MongoDB, and more.

**Details & Requirements**

- Length of time users can access this course: lifetime
- Access options: web streaming, mobile streaming
- Certification of completion not included
- Redemption deadline: redeem your code within 30 days of purchase
- Experience level required: all levels, but you must have some knowledge of calculus, linear algebra, probability, Python, and Numpy
- All code for this course is available for download *here*, in the directory logistic_regression_class

**Compatibility**

- Internet required

**Terms**

- Unredeemed licenses can be returned for store credit within 15 days of purchase. Once your license is redeemed, all sales are final.

- Introduction and Outline
- Introduction and Outline (4:02)
- Review of the classification problem (2:53)
- Introduction to the E-Commerce Course Project (8:53)
- What can classification be used for?

- Basics: What is linear classification? What's the relation to neural networks?
- Linear Classification (4:59)
- Biological inspiration - the neuron (3:36)
- How do we calculate the output of a neuron / logistic classifier? - Theory (4:18)
- How do we calculate the output of a neuron / logistic classifier? - Code (4:30)
- E-Commerce Course Project: Pre-Processing the Data (5:24)
- E-Commerce Course Project: Making Predictions (3:01)
- Feedforward
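
The "output of a neuron / logistic classifier" lectures above can be sketched in a few lines of NumPy. This is a minimal illustration, not the course's own code; the variable names (`forward`, `w`, `b`) are mine:

```python
import numpy as np

def sigmoid(a):
    # squash the linear activation into (0, 1) so it reads as P(Y=1|X)
    return 1 / (1 + np.exp(-a))

def forward(X, w, b):
    # logistic neuron output: sigmoid of the linear activation Xw + b
    return sigmoid(X.dot(w) + b)

# tiny example: 3 samples, 2 features
X = np.array([[1.0, 2.0], [0.5, -1.0], [-2.0, 0.3]])
w = np.array([0.4, -0.6])
b = 0.1
p = forward(X, w, b)  # one probability per sample
```

Each entry of `p` lies strictly between 0 and 1, which is what lets us interpret the output as a class probability.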

- Solving for the optimal weights
- A closed-form solution to the Bayes classifier (5:59)
- What do all these symbols mean? X, Y, N, D, L, J, P(Y=1|X), etc. (3:38)
- The cross-entropy error function - Theory (2:46)
- The cross-entropy error function - Code (4:53)
- Visualizing the linear discriminant / Bayes classifier / Gaussian clouds (2:28)
- Can we use squared error instead of cross-entropy for the error if we're doing classification?
- Maximizing the likelihood (6:34)
- Updating the weights using gradient descent - Theory (6:20)
- Updating the weights using gradient descent - Code (3:09)
- E-Commerce Course Project: Training the Logistic Model (6:47)
- Softmax
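
The cross-entropy and gradient-descent lectures in this section fit together like the sketch below: compute predictions, measure error with cross-entropy, then step the weights against the gradient. This is an illustrative toy in NumPy (the data and names like `learning_rate` are mine, not the course's):

```python
import numpy as np

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

def cross_entropy(T, Y):
    # mean negative log-likelihood of targets T under predictions Y
    return -np.mean(T * np.log(Y) + (1 - T) * np.log(1 - Y))

# toy separable data: class 0 clustered near (-2,-2), class 1 near (+2,+2)
np.random.seed(0)
X = np.vstack([np.random.randn(50, 2) - 2, np.random.randn(50, 2) + 2])
T = np.array([0] * 50 + [1] * 50)

w = np.random.randn(2)
b = 0.0
learning_rate = 0.1
for _ in range(100):
    Y = sigmoid(X.dot(w) + b)
    # gradient of cross-entropy w.r.t. w is X^T (Y - T); step against it
    w -= learning_rate * X.T.dot(Y - T) / len(T)
    b -= learning_rate * (Y - T).mean()

accuracy = np.mean((sigmoid(X.dot(w) + b) > 0.5) == T)
```

Note how the gradient-descent update falls out of maximizing the likelihood: the same `Y - T` term appears whether you derive it from cross-entropy or from the log-likelihood.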

- Practical concerns
- L2 Regularization - Theory (8:38)
- Regularization - Code (1:43)
- The donut problem (10:01)
- The XOR Problem (6:12)
- Neural Networks
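
The L2 regularization idea from this section amounts to one extra term in the weight update: add `lambda * w` to the gradient so large weights are penalized and shrunk toward zero. A minimal sketch (my own toy data; the penalty strength `l2` is a placeholder name):

```python
import numpy as np

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

# toy data labeled by whether x1 + x2 > 0 (boundary through the origin)
np.random.seed(1)
X = np.random.randn(100, 2)
T = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.random.randn(2)
learning_rate = 0.1
l2 = 0.1  # regularization strength (lambda)
for _ in range(200):
    Y = sigmoid(X.dot(w))
    # L2 regularization adds l2 * w to the gradient,
    # pulling the weights toward zero each step
    w -= learning_rate * (X.T.dot(Y - T) / len(T) + l2 * w)

accuracy = np.mean((sigmoid(X.dot(w)) > 0.5) == T)
```

The penalty trades a little training accuracy for smaller weights, which tends to generalize better when the data are noisy.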

- Checkpoint and applications: How to make sure you know your stuff
- Sentiment Analysis (5:13)
- Exercises + how to get good at this (2:48)

- Project: Facial Expression Recognition
- Facial Expression Recognition Problem Description (12:21)
- The class imbalance problem (6:01)
- Utilities walkthrough (5:45)
- Facial Expression Recognition in Code (10:41)
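
One common remedy for the class imbalance problem covered above is to oversample the minority class until the class counts match. A toy sketch of the idea (the 90/10 split is an invented example, not the course's facial-expression dataset):

```python
import numpy as np

# imbalanced toy data: 90 samples of class 0, only 10 of class 1
np.random.seed(2)
X = np.vstack([np.random.randn(90, 2), np.random.randn(10, 2) + 3])
T = np.array([0] * 90 + [1] * 10)

# oversample class 1 by repeating its rows until counts are equal
X1 = X[T == 1]
reps = (T == 0).sum() // (T == 1).sum()  # 9 copies here
X_bal = np.vstack([X[T == 0]] + [X1] * reps)
T_bal = np.concatenate([np.zeros(90), np.ones(90)])
```

Without a step like this, a classifier can score 90% accuracy by always predicting the majority class, which is why raw accuracy is misleading on imbalanced data.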

- Appendix
- How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:22)
- Gradient Descent Tutorial (4:30)
