
Posts

• Another Activation Function

The rectified linear function avoids two problems of the logistic/sigmoid activation function.
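As a minimal sketch (not from the post) of why ReLU helps: the sigmoid's gradient vanishes for large inputs and its computation needs an exponential, while ReLU's gradient is 1 for any positive input and costs a single comparison.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

# Sigmoid saturates: its derivative sigma(z) * (1 - sigma(z)) is nearly zero
# for large |z|, which slows learning ("vanishing gradients").
z = 10.0
sigmoid_grad = sigmoid(z) * (1 - sigmoid(z))   # tiny, roughly 4.5e-5

# ReLU passes gradients through unchanged for any positive input.
relu_grad = 1.0 if z > 0 else 0.0
```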
• Pooling Layers

Pooling layers "downsample" their inputs to highlight or summarize the features in a region.
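A small NumPy sketch of the idea (illustrative input values, not from the post): 2×2 max pooling keeps the strongest response in each region, halving each dimension.

```python
import numpy as np

def max_pool_2x2(x):
    """Downsample by taking the max of each non-overlapping 2x2 region."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

patch = np.array([[1, 2, 5, 6],
                  [3, 4, 7, 8],
                  [9, 0, 1, 2],
                  [1, 1, 3, 4]])
pooled = max_pool_2x2(patch)   # each output cell summarizes one 2x2 region
```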
• Convolutional Layers

Convolutional layers map the occurrence of a local feature across the entire input.
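A minimal sketch of that mapping (illustrative kernel and image, not from the post): sliding a small kernel over the input produces a feature map that fires wherever the local feature occurs. Deep-learning "convolution" layers usually compute this cross-correlation form.

```python
import numpy as np

def convolve2d_valid(image, kernel):
    """Slide the kernel over the image; each output value is the dot
    product of the kernel with the patch underneath it."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge detector: responds only where intensity changes left-to-right.
edge_kernel = np.array([[1.0, -1.0],
                        [1.0, -1.0]])
image = np.array([[0.0, 0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0, 1.0]])
feature_map = convolve2d_valid(image, edge_kernel)  # nonzero only at the edge
```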
• Neural Networks

An introduction to neural networks and associated terminology.
• A Multilayer Perceptron in Julia

A Julia program that uses a multi-layer perceptron to classify images from the MNIST dataset.
• A Multilayer Perceptron in Python

A Python program that uses a multi-layer perceptron to classify images from the MNIST dataset.
• Multi-Layer Perceptrons

Multi-layer perceptrons approach the multivariate logistic regression problem by learning the appropriate features at the same time as their coefficients.
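A tiny forward-pass sketch of that structure (illustrative, untrained weights): each hidden unit is itself a learned feature of the raw input, and the output layer runs logistic regression on those features; training adjusts both weight sets at once.

```python
import math

def relu(z):
    return max(0.0, z)

def mlp_forward(x, hidden_weights, output_weights):
    """One hidden layer of learned features, then a logistic output."""
    features = [relu(b + w * x) for b, w in hidden_weights]  # learned features
    z = output_weights[0] + sum(w * f
                                for w, f in zip(output_weights[1:], features))
    return 1.0 / (1.0 + math.exp(-z))                        # probability

# Illustrative weights: 2 hidden units (intercept, slope) and an output layer.
hidden = [(0.0, 1.0), (1.0, -1.0)]
out = [0.5, 2.0, -1.0]
p = mlp_forward(1.5, hidden, out)
```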
• Simple Computer Vision in Julia

A Julia program that uses stochastic mini-batch gradient descent to classify 28×28-pixel images from the MNIST dataset.
• Simple Computer Vision in Python

A Python program that uses stochastic mini-batch gradient descent to classify 28×28-pixel images from the MNIST dataset.
• Stochastic and Mini-Batch Gradient Descent

The cost function for stochastic gradient descent (SGD) considers the cost of a single randomly chosen example at each step; mini-batch gradient descent considers a small fraction of the examples.
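A minimal pure-Python sketch of the difference (illustrative data and hyperparameters, not from the posts): the same update rule, applied to batches of different sizes, for a one-parameter least-squares model.

```python
import random

# Toy data drawn from y = 2x, so the true weight is 2.
data = [(x, 2 * x) for x in [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]]

def minibatch_sgd(data, batch_size, lr=0.05, epochs=200, seed=0):
    """Each step uses the gradient of the cost on one random mini-batch,
    not the whole dataset; batch_size=1 is plain stochastic gradient descent."""
    rng = random.Random(seed)
    theta = 0.0
    examples = list(data)
    for _ in range(epochs):
        rng.shuffle(examples)
        for start in range(0, len(examples), batch_size):
            batch = examples[start:start + batch_size]
            # Gradient of mean squared error (1/m) * sum (theta*x - y)^2
            grad = sum(2 * (theta * x - y) * x for x, y in batch) / len(batch)
            theta -= lr * grad
    return theta

theta_sgd = minibatch_sgd(data, batch_size=1)    # one example per step
theta_mini = minibatch_sgd(data, batch_size=4)   # a fraction of the examples
```

Both variants home in on the same minimum; mini-batches simply trade noisier updates for cheaper steps.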
• Multiclass Logistic Regression in Julia—with TensorFlow

A Julia program that finds polynomial logistic models for three classes using TensorFlow.
• Multiclass Logistic Regression in Python—with TensorFlow

A Python program that finds polynomial logistic models for three classes using TensorFlow.
• Automatic Differentiation with TensorFlow

TensorFlow is a fast, efficient, open-source library that can automatically generate partial-derivative functions from the definition of a complex cost function.
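To see how derivatives can fall out of a function definition, here is a dual-number sketch of forward-mode automatic differentiation in plain Python (TensorFlow itself uses reverse mode, but the principle is the same: no symbolic algebra, no finite differences).

```python
class Dual:
    """A number carrying its own derivative alongside its value."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u v' + u' v
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)
    __rmul__ = __mul__

def cost(theta):
    return theta * theta + 3 * theta + 2   # J(theta) = theta^2 + 3*theta + 2

# Seed theta's derivative with 1.0; dJ/dtheta pops out of the arithmetic.
theta = Dual(2.0, 1.0)
result = cost(theta)   # result.value = J(2) = 12, result.deriv = 2*2 + 3 = 7
```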
• Scikit-learn for Logistic Regression

Scikit-learn is a popular open-source library with a fast, efficient implementation of logistic regression.
• Overfitting and Regularization

Regularization reduces the complexity of the model to help it generalize.
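A minimal sketch of one common form, L2 regularization (illustrative cost and coefficient values, not from the post): large coefficients now add to the cost, so the optimizer prefers simpler models.

```python
def l2_regularized_cost(theta, base_cost, lam):
    """Add lam * sum(theta_j^2) to the unregularized cost; the intercept
    theta[0] is conventionally left out of the penalty."""
    penalty = lam * sum(t * t for t in theta[1:])
    return base_cost + penalty

theta = [0.3, 4.0, -2.0]       # illustrative coefficients
unregularized = 1.25           # illustrative cost value
regularized = l2_regularized_cost(theta, unregularized, lam=0.1)
# 1.25 + 0.1 * (16 + 4) = 3.25
```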
• Feature Scaling Caution

Scaling inputs and features can help speed convergence with gradient descent.
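One common scaling, sketched in plain Python (illustrative feature values): z-score standardization gives each feature zero mean and unit variance, so the cost surface is less elongated and gradient descent tolerates a larger learning rate.

```python
def standardize(values):
    """Rescale a feature to zero mean and unit variance (z-scores)."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = var ** 0.5
    return [(v - mean) / std for v in values]

sizes = [800.0, 1200.0, 1500.0, 2100.0]   # e.g. house sizes in square feet
scaled = standardize(sizes)               # now comparable to other features
```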
• Multiclass Logistic Regression in Julia

A Julia program that finds polynomial logistic models for three classes. Interactive visualization of the results.
• Multiclass Logistic Regression in Python

A Python program that finds polynomial logistic models for three classes. Interactive visualization of the results.
• Logistic Regression with Multiple Classes

The logistic model can be extended to classify its input into one of several classes.
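One common way to make that extension is the softmax function, sketched here with illustrative per-class linear models (the intercepts and slopes below are made up): each class gets its own score, and softmax turns the scores into probabilities.

```python
import math

def softmax(scores):
    """Turn one score per class into probabilities that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]  # shift for stability
    total = sum(exps)
    return [e / total for e in exps]

# One linear model (intercept, slope) per class; illustrative values.
thetas = [(0.0, 1.0), (0.5, -1.0), (-0.5, 0.2)]
x = 2.0
scores = [b + w * x for b, w in thetas]
probs = softmax(scores)
predicted_class = max(range(len(probs)), key=probs.__getitem__)
```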
• Interactive Minimization of the Logistic Regression Cost Function

Adjust $$\theta_0$$ and $$\theta_1$$ to minimize the cost function.
• Visualization of Improved Cost Function for Logistic Regression

The new cost function has a convex shape suitable for gradient descent.
• A Better Cost Function for Logistic Regression

A cost function for logistic regression that works with gradient descent.
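The cost in question is cross-entropy; a minimal sketch with made-up example data shows it heavily penalizing confident wrong predictions while staying convex in the parameters, which is what makes it safe for gradient descent.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cross_entropy_cost(theta0, theta1, examples):
    """J(theta) = -(1/m) * sum[ y*log(h) + (1-y)*log(1-h) ],
    with h = sigmoid(theta0 + theta1*x). Convex in theta, unlike
    squared error applied to the sigmoid."""
    m = len(examples)
    total = 0.0
    for x, y in examples:
        h = sigmoid(theta0 + theta1 * x)
        total += y * math.log(h) + (1 - y) * math.log(1 - h)
    return -total / m

examples = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]  # illustrative data
well_fit = cross_entropy_cost(0.0, 3.0, examples)      # low cost
poorly_fit = cross_entropy_cost(0.0, -3.0, examples)   # predicts wrong class
```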
• Visualization of Cost Function for Logistic Regression

There is a problem with using gradient descent to minimize the cost function.
• Almost Entirely Nonlinear Regression

Logistic regression can be used for classification because it learns a nonlinear relationship between input and output.
• Interactive Minimization of the Cost Function

Adjust $$\theta_0$$, $$\theta_1$$ and $$\theta_2$$ to minimize the cost function.
• Even Less Linear Regression

Learning a polynomial relationship between multiple inputs and an output.
• Not Completely Linear Regression

Learning a linear relationship from multiple inputs to one output.
• Learning Rate Caution

How the choice of learning rate affects gradient descent, a standard, iterative method for minimizing the cost function.
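A short sketch of why the learning rate deserves caution (toy data, not from the post): the same gradient-descent update converges with a small learning rate and overshoots further each step, diverging, with one that is too large.

```python
def gradient_descent_step(theta, lr, examples):
    """One update of theta for the model h(x) = theta * x under
    mean-squared-error cost."""
    m = len(examples)
    grad = sum(2 * (theta * x - y) * x for x, y in examples) / m
    return theta - lr * grad

examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # y = 2x, optimum theta = 2

theta_ok, theta_big = 0.0, 0.0
for _ in range(50):
    theta_ok = gradient_descent_step(theta_ok, lr=0.05, examples=examples)
    theta_big = gradient_descent_step(theta_big, lr=0.25, examples=examples)
# theta_ok approaches 2; theta_big oscillates with growing amplitude.
```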
• Visualization of the Cost Function

An interactive visualization of the cost function for linear regression.
• Interactive Minimization of Cost Function

Adjust $$\theta_0$$ and $$\theta_1$$ to minimize the cost function.
• Simple Linear Regression

Simple linear regression is one of the simplest forms of supervised machine learning.
• Introduction to Machine Learning

An introduction to a series of posts on machine learning.