Announcing quick experimentation with Prototype!
over 9 years ago by Markus Beissinger
Lots of progress has been made since the initial alpha. Today, I would like to announce the new Prototype class (opendeep.models.container.Prototype) - a container for quickly assembling layers into a model. It is similar to the Sequential container in Torch, but retains all of the modularity that the Model class provides!
You can create a Prototype in two main ways:
- Adding layers (or lists of layers) with the .add() method
- Initializing the Prototype with a list of its layers
These two methods are demonstrated below:
# Add one layer at a time
from opendeep.models.container import Prototype
from opendeep.models.single_layer.basic import BasicLayer, SoftmaxLayer
# create the Prototype
mlp = Prototype()
# add a fully-connected relu layer
mlp.add(BasicLayer(input_size=784, output_size=1000, activation='rectifier'))
# add a softmax classification layer
mlp.add(SoftmaxLayer(inputs_hook=(1000, mlp[-1].get_outputs()), output_size=10))
# that's it!
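The chaining that .add() performs can be illustrated with a minimal, framework-free sketch. This is not OpenDeep's actual implementation - the class and method bodies below are hypothetical - but it shows the container idea: layers are stored in order, and each layer consumes the previous layer's output, which Prototype does with Theano variables and inputs_hook.

```python
# Minimal sketch of a Sequential-style container (hypothetical, not OpenDeep's code).
# Each "layer" here is just a callable; add() appends it, and run() feeds
# each layer's output into the next.

class MiniPrototype:
    def __init__(self, layers=None):
        # accept an optional list of layers, like Prototype(layers=[...])
        self.layers = list(layers) if layers else []

    def add(self, layer):
        # append a single layer, or extend with a list of layers
        if isinstance(layer, list):
            self.layers.extend(layer)
        else:
            self.layers.append(layer)

    def __getitem__(self, idx):
        # support mlp[-1] style indexing, as in the .add() example above
        return self.layers[idx]

    def run(self, x):
        # pipe the input through every layer in order
        for layer in self.layers:
            x = layer(x)
        return x

# usage: two toy "layers" assembled both ways
double = lambda x: x * 2
inc = lambda x: x + 1

model = MiniPrototype()
model.add(double)
model.add(inc)
print(model.run(3))             # -> 7

model2 = MiniPrototype(layers=[double, inc])
print(model2.run(3))            # -> 7
```

Both construction styles produce the same pipeline, which is exactly the equivalence the two OpenDeep snippets demonstrate.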
# Add all the layers during initialization
from opendeep.models.container import Prototype
from opendeep.models.single_layer.basic import BasicLayer, SoftmaxLayer
# define the layers
relu = BasicLayer(input_size=784, output_size=1000, activation='rectifier')
softmax = SoftmaxLayer(inputs_hook=(1000, relu.get_outputs()), output_size=10)
# create the Prototype
mlp = Prototype(layers=[relu, softmax])
# that's it!
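For intuition about what these layers compute, here is a hedged numpy sketch of the relu-then-softmax forward pass with the same shapes as the snippets above (784 inputs, 1000 hidden units, 10 classes). numpy stands in for OpenDeep's symbolic Theano expressions, and the weights here are random placeholders, not trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy weights matching the shapes above: 784 -> 1000 -> 10
W1, b1 = rng.standard_normal((784, 1000)) * 0.01, np.zeros(1000)
W2, b2 = rng.standard_normal((1000, 10)) * 0.01, np.zeros(10)

def rectifier(z):
    # the 'rectifier' (relu) activation: max(0, z)
    return np.maximum(0.0, z)

def softmax(z):
    # subtract the row max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def forward(x):
    # fully-connected relu layer: 784 inputs -> 1000 outputs
    h = rectifier(x @ W1 + b1)
    # softmax classification layer: 1000 inputs -> 10 class probabilities
    return softmax(h @ W2 + b2)

probs = forward(rng.standard_normal((5, 784)))   # batch of 5 fake images
print(probs.shape)          # (5, 10)
print(probs.sum(axis=1))    # each row sums to 1
```

Each output row is a probability distribution over the 10 digit classes; the predicted class is simply the argmax of that row.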
Finally, here is a quick 'hello world' example of assembling a basic multilayer perceptron (MLP) and training it on the standard MNIST handwritten digit dataset (you can see more advanced uses in the Tutorials section of the developer hub):
# standard libraries
import logging
# third party libraries
from opendeep.log.logger import config_root_logger
from opendeep.models.container import Prototype
from opendeep.models.single_layer.basic import BasicLayer, SoftmaxLayer
from opendeep.optimization.adadelta import AdaDelta
from opendeep.data.standard_datasets.image.mnist import MNIST
from opendeep.data.dataset import TEST
# grab a log to output useful info, and configure our environment
log = logging.getLogger(__name__)
config_root_logger()
# define the model layers for the MNIST dataset
# create a fully-connected rectifier layer from the input to 1000 outputs
relu_layer1 = BasicLayer(input_size=784, output_size=1000, activation='rectifier')
# create a second fully-connected rectifier layer from the 1000 units to another 1000.
relu_layer2 = BasicLayer(inputs_hook=(1000, relu_layer1.get_outputs()), output_size=1000, activation='rectifier')
# create our output classification layer (softmax)
class_layer3 = SoftmaxLayer(inputs_hook=(1000, relu_layer2.get_outputs()), output_size=10)
# add the layers as a Prototype
mlp = Prototype(layers=[relu_layer1, relu_layer2, class_layer3])
# grab the dataset
mnist = MNIST()
# define the optimizer (adadelta is a good default)
optimizer = AdaDelta(model=mlp, dataset=mnist, n_epoch=10)
# train the model
optimizer.train()
# grab the first 25 test images
test_data = mnist.getDataByIndices(indices=range(25), subset=TEST)
# use the predict function!
yhat = mlp.predict(test_data)
# compare the predictions to the true test labels
log.info('-------')
log.info(str(yhat))
log.info(str(mnist.getLabelsByIndices(indices=range(25), subset=TEST)))
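AdaDelta makes a good default optimizer because it keeps running averages of squared gradients and squared updates, so there is no learning rate to hand-tune. The sketch below implements the standard AdaDelta update rule in numpy on a toy 1-D problem - it is the textbook algorithm, not OpenDeep's internals, and the function names are illustrative.

```python
import numpy as np

def adadelta_minimize(grad_fn, x0, rho=0.95, eps=1e-6, n_steps=2000):
    """Minimize a function with the standard AdaDelta update rule,
    given a function that returns its gradient."""
    x = np.asarray(x0, dtype=float)
    eg2 = np.zeros_like(x)   # running average of squared gradients
    edx2 = np.zeros_like(x)  # running average of squared updates
    for _ in range(n_steps):
        g = grad_fn(x)
        eg2 = rho * eg2 + (1 - rho) * g**2
        # scale the step by the ratio of RMS(update) to RMS(gradient)
        dx = -np.sqrt(edx2 + eps) / np.sqrt(eg2 + eps) * g
        edx2 = rho * edx2 + (1 - rho) * dx**2
        x = x + dx
    return x

# usage: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3)
x_min = adadelta_minimize(lambda x: 2 * (x - 3.0), x0=[0.0])
print(x_min)   # approaches the optimum at 3
```

Note how the step size adapts automatically: updates start tiny (driven by eps), grow while the gradient points consistently in one direction, and shrink again near the optimum.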