First Steps

Here you will learn the basic components for running a deep learning model.

To get up to speed on deep learning, check out the Deep Learning 101 blog post.

I also recommend configuring Theano to use the GPU, which vastly reduces training time.
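
For example, a minimal ~/.theanorc that targets the GPU looks like the following (device names depend on your Theano version and hardware; older Theano backends use device = gpu, while the newer gpuarray backend uses device = cuda):

[global]
device = gpu
floatX = float32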

OpenDeep relies on the creation of four classes:

  • Dataset: a dataset object acts as an interface to whatever data you train with. You can use the standard dataset wrappers provided (such as MNIST or CIFAR10), create your own from files, or even build one in memory by passing arrays from packages like NumPy.

  • Model: the model defines the computation you want to perform. Models take in Theano variables/expressions and output Theano expressions.

  • Loss: you must choose an appropriate loss (also known as cost) function that measures how well your model is performing. This function is minimized during training.

  • Optimizer: an optimizer takes a model, loss, and dataset, and trains the model's parameters using minibatches of examples from the dataset. Keeping the optimizer logically separate gives you flexibility in how models are trained, and makes it easy to continue training a model as new data arrives (see the sketch after this list).
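
Put together, the four pieces wire up like this. The snippet below is a minimal sketch (softmax regression on MNIST), and it assumes a standalone Softmax model accepts the same inputs=(shape, variable) convention that Dense uses in the MLP example in the next section:

# Minimal wiring of the four pieces: Dataset -> Model -> Loss -> Optimizer.
# A sketch; assumes standalone Softmax takes inputs=(shape, variable) like Dense.
from opendeep.data import MNIST
from opendeep.models import Softmax
from opendeep.optimization.loss import Neg_LL
from opendeep.optimization import AdaDelta
from theano.tensor import matrix, lvector

x = matrix('xs')   # symbolic input images
y = lvector('ys')  # symbolic integer class labels

data = MNIST(flatten=True)                                        # Dataset
model = Softmax(inputs=((None, 28*28), x), outputs=10,
                out_as_probs=False)                               # Model
loss = Neg_LL(inputs=model.p_y_given_x, targets=y, one_hot=False) # Loss
AdaDelta(model=model, loss=loss, dataset=data, epochs=10).train() # Optimizer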

Hello World: MLP on the MNIST dataset

A Multilayer Perceptron (MLP) is a simple feedforward neural network topped with a classification layer. In this example, we construct a model container, add a hidden layer and an output layer, and train the network to classify images of handwritten digits (the MNIST dataset).

from opendeep.models import Prototype, Dense, Softmax
from opendeep.models.utils import Noise
from opendeep.optimization.loss import Neg_LL
from opendeep.optimization import AdaDelta
from opendeep.data import MNIST
from theano.tensor import matrix, lvector

print("Getting data...")
# MNIST images are 28x28; flatten=True gives 784-dimensional input vectors.
data = MNIST(flatten=True)

print("Creating model...")
in_shape = (None, 28*28)  # (batch size, n_features); None means variable batch size
in_var = matrix('xs')     # symbolic Theano variable for the input images
mlp = Prototype()
mlp.add(Dense(inputs=(in_shape, in_var), outputs=512, activation='relu'))
mlp.add(Noise, noise='dropout', noise_level=0.5)
mlp.add(Softmax, outputs=10, out_as_probs=False)  # output class labels, not probabilities

print("Training...")
target_var = lvector('ys')  # symbolic variable for the integer class labels
loss = Neg_LL(inputs=mlp.models[-1].p_y_given_x, targets=target_var, one_hot=False)

optimizer = AdaDelta(model=mlp, loss=loss, dataset=data, epochs=10)
optimizer.train()

print("Predicting...")
predictions = mlp.run(data.test_inputs)

accuracy = float(sum(predictions == data.test_targets)) / len(data.test_targets)
print("Accuracy: %.4f" % accuracy)

Passing data from Numpy/Scipy/Pandas/Array

If you want to use your own data for training/validation/testing, you can pass any array-like object (it is cast to a numpy array internally) to a Dataset like so:

# imports
from opendeep.data import NumpyDataset
import numpy

# create some fake random data to demonstrate creating a Dataset
# train set
fake_train_inputs = numpy.random.uniform(0, 1, size=(100, 5))
fake_train_targets = numpy.random.binomial(n=1, p=0.5, size=100)
# valid set
fake_valid_inputs = numpy.random.uniform(0, 1, size=(30, 5))
fake_valid_targets = numpy.random.binomial(n=1, p=0.5, size=30)
# test set (showing that you can mix and match input types, as long as they can be cast to numpy arrays)
fake_test_inputs = [[0.1, 0.2, 0.3, 0.4, 0.5],
                    [0.9, 0.8, 0.7, 0.6, 0.5]]
fake_test_targets = [0, 1]

# create the dataset!
# note that everything except train_inputs is optional; that is the bare minimum for an unsupervised model.
data = NumpyDataset(train_inputs=fake_train_inputs, train_targets=fake_train_targets,
                    valid_inputs=fake_valid_inputs, valid_targets=fake_valid_targets,
                    test_inputs=fake_test_inputs, test_targets=fake_test_targets)
# now you can use the dataset normally when creating an optimizer, as the other tutorials show!
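
To close the loop, here is a sketch of training on the fake data above, following the same Prototype / Neg_LL / AdaDelta pattern as the MNIST example. The layer sizes are arbitrary choices; the 5 input features and 2 target classes match the arrays we just built:

from opendeep.models import Prototype, Dense, Softmax
from opendeep.optimization.loss import Neg_LL
from opendeep.optimization import AdaDelta
from theano.tensor import matrix, lvector

x = matrix('xs')
y = lvector('ys')

# tiny network: 5 input features -> 16 hidden units -> 2 classes
net = Prototype()
net.add(Dense(inputs=((None, 5), x), outputs=16, activation='relu'))
net.add(Softmax, outputs=2, out_as_probs=False)

loss = Neg_LL(inputs=net.models[-1].p_y_given_x, targets=y, one_hot=False)
AdaDelta(model=net, loss=loss, dataset=data, epochs=5).train()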

Summary

Congrats, you just:

  • set up a dataset (MNIST or an array from memory)
  • instantiated a container model and added layer models to form a neural network
  • trained it with an AdaDelta optimizer using negative log-likelihood cost
  • and predicted some outputs given inputs (and computed accuracy)!