Project 2: Neural nets (multilayer perceptron networks)

The goal of this project is to implement the feed-forward algorithm for applying a multilayer perceptron to a data point and the backpropagation algorithm for training a multilayer perceptron. To simplify this problem, we will assume that this MLP does binary classification.
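For concreteness, the feed-forward step for a network with one hidden layer and a single output unit is just two weighted sums with a nonlinearity after each. The sketch below is one possible formulation, not required code; the names feed_forward, W1, and W2, the use of numpy, sigmoid activations, and bias weights folded into the weight matrices are all illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feed_forward(W1, W2, x):
    # Sketch only: W1 is M x (d+1), W2 is a vector of length M+1,
    # each with a bias weight in position 0.
    x = np.append(1.0, x)                 # prepend 1 for the bias weight
    hidden = sigmoid(W1 @ x)              # activations of the M hidden units
    hidden = np.append(1.0, hidden)       # bias input for the output unit
    output = sigmoid(W2 @ hidden)         # a single value in (0, 1)
    return 1 if output >= 0.5 else 0      # threshold for binary classification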

Write two functions in Python 3 with the following signatures:

classify(mlp, inputs)

train(M, data, targets)

where:

The classify function applies a trained multilayer perceptron (the first argument) to the given inputs and returns the resulting classification(s).

The train function returns something that represents a multilayer perceptron with M hidden units in one hidden layer and a single output unit that produces 0 or 1 (or -1 or 1, or True or False). The number of input units is determined by the dimensionality of the training data. How the MLP is represented is up to you, but your train function must return something that can be passed as the first argument to your classify function.
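As one concrete illustration (not a prescribed design), the sketch below represents the MLP as a dictionary of two weight matrices and trains it by backpropagation with stochastic gradient descent, sigmoid units, and squared error. The helper _sigmoid, the dict keys W1 and W2, the learning rate eta, the epoch count, and the assumption of 0/1 targets are all illustrative choices, not requirements.

import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def classify(mlp, inputs):
    # Apply the MLP to each row of inputs; return an array of 0/1 labels.
    X = np.atleast_2d(inputs)
    labels = []
    for x in X:
        x = np.append(1.0, x)                        # bias input
        h = np.append(1.0, _sigmoid(mlp['W1'] @ x))  # hidden activations + bias
        y = _sigmoid(mlp['W2'] @ h)                  # single output in (0, 1)
        labels.append(1 if y >= 0.5 else 0)
    return np.array(labels)

def train(M, data, targets, epochs=500, eta=0.1):
    # Train an MLP with M hidden units by backpropagation
    # (sketch; assumes 0/1 targets).
    X = np.atleast_2d(data)
    t = np.asarray(targets, dtype=float)
    rng = np.random.default_rng(0)
    d = X.shape[1]
    W1 = rng.normal(scale=0.1, size=(M, d + 1))      # hidden-layer weights (+ bias)
    W2 = rng.normal(scale=0.1, size=M + 1)           # output-unit weights (+ bias)
    for _ in range(epochs):
        for x, target in zip(X, t):
            x = np.append(1.0, x)
            h = np.append(1.0, _sigmoid(W1 @ x))
            y = _sigmoid(W2 @ h)
            # Backpropagate the squared error through the sigmoid units.
            delta_out = (y - target) * y * (1.0 - y)
            delta_hid = delta_out * W2[1:] * h[1:] * (1.0 - h[1:])
            W2 -= eta * delta_out * h
            W1 -= eta * np.outer(delta_hid, x)
    return {'W1': W1, 'W2': W2}

Note that standardizing the input features before training tends to help a sigmoid network like this one converge.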

Test your implementation on at least two data sets and report on the performance.

Submit your code in two files, mlp.py and test_mlp.py, so that the following code will work:

import mlp

clsfyr = mlp.train(30, X_train, y_train)

results = mlp.classify(clsfyr, X_test)

Moreover, the following should work from the command line:

python3 test_mlp.py

This should display information about the performance of your classifier.
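One possible shape for test_mlp.py is sketched below; the use of scikit-learn's breast-cancer data, train_test_split, and StandardScaler is purely illustrative, and any binary-labeled data sets (you need at least two) would do.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

import mlp

# Illustrative data set and split; substitute whatever data sets you tested on.
data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.3, random_state=0)

# Standardizing the features helps the sigmoid units avoid saturation.
scaler = StandardScaler().fit(X_train)
X_train = scaler.transform(X_train)
X_test = scaler.transform(X_test)

clsfyr = mlp.train(30, X_train, y_train)
results = mlp.classify(clsfyr, X_test)

print("Accuracy on held-out breast-cancer data: %.3f" % np.mean(results == y_test))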

Finally, include a file README that (briefly) describes how you tested your classifier and with what results, plus anything else you think it would be good for me to know in order to give your submission the fairest grading.

Specific guidelines:

Update, March 25. It seems like those students who have put time into this project so far are having a lot of trouble getting backpropagation to work. So that we can make something of this project, here's what I am going to do and suggest:

To turn in:

Copy mlp.py, test_mlp.py, README, and any other files your code needs (such as data sets for testing) to

/cslab/class/cs394/[your userid]/mlp

Due Fri, March 29


Thomas VanDrunen
Last modified: Tue Feb 5 16:13:52 CST 2019