**Convolutional autoencoder (MATLAB/Octave):** two-dimensional convolutional autoencoder (neural network) for data compression.

[tutorial] [source]
**Description:** least-squares objective function, backpropagation gradient, and network model for a convolutional autoencoder with an arbitrary number of hidden layers and hidden units. Solutions can be found using gradient-based algorithms such as stochastic gradient descent, conjugate gradient, and L-BFGS.
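As a rough sketch of the objective and gradient described above (in Python/NumPy rather than the package's MATLAB/Octave, and with a tied-weights linear layer standing in for the convolutional layers, so the function and variable names here are illustrative, not the package's API):

```python
import numpy as np

def loss_and_grad(W, X):
    """Least-squares objective and gradient for a tied-weights linear
    autoencoder (reconstruction X_hat = W.T @ W @ X)."""
    n = X.shape[1]
    H = W @ X                      # hidden-layer activations (encode)
    R = W.T @ H - X                # reconstruction residual (decode minus input)
    loss = 0.5 * np.sum(R ** 2) / n
    grad = (H @ R.T + W @ R @ X.T) / n   # d(loss)/dW by the chain rule
    return loss, grad

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 50))       # 50 samples of an 8-dimensional input
W = 0.1 * rng.standard_normal((3, 8))  # compress 8 dimensions down to 3
loss0, _ = loss_and_grad(W, X)
for _ in range(300):                   # plain gradient descent
    _, g = loss_and_grad(W, X)
    W -= 0.02 * g
loss_final, _ = loss_and_grad(W, X)
```

The same objective and gradient can be handed to any of the solvers mentioned above (SGD, conjugate gradient, L-BFGS); only the update rule changes.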

**Deep reinforcement learning of tic-tac-toe (Python):** builds and optimizes a simple residual network to teach a computer to play tic-tac-toe.

[tutorial] [source]
**Description:** construct a basic feed-forward residual neural network using the Theano package. The script allows an arbitrary number of hidden units and hidden layers to be defined and automatically computes the gradient for backpropagation. Optimization can then be performed on either CPUs or GPUs.
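A minimal sketch of the residual feed-forward idea (in plain NumPy rather than Theano, with an assumed board encoding of +1 for X, -1 for O, 0 for empty; not the script's actual architecture):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def residual_forward(x, layers):
    """Forward pass through a stack of residual blocks: h <- h + relu(W h + b).
    All layers keep the input width so the skip connections line up."""
    h = x
    for W, b in layers:
        h = h + relu(W @ h + b)
    return h

# toy tic-tac-toe board: 9 cells, +1 for X, -1 for O, 0 for empty
board = np.array([1., 0., -1., 0., 1., 0., 0., 0., -1.])

# three residual blocks; zero weights make each block the identity map,
# which illustrates why residual networks are easy to optimize from scratch
layers = [(np.zeros((9, 9)), np.zeros(9)) for _ in range(3)]
out = residual_forward(board, layers)
```

In the actual script, Theano builds this graph symbolically and derives the backpropagation gradient automatically, which is what allows the depth and width to be arbitrary.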

**Maximum noise entropy (C):** first- and second-order maximum noise entropy modeling of receptive fields.

[tutorial] [source]
**Description:** computes the weights of a linear or quadratic classifier to reconstruct multicomponent receptive fields of sensory neurons, with regularization options including LASSO, ridge, elastic net, and early stopping. The theoretical background of this model may be found in Fitzgerald, Rowekamp, Sincich, & Sharpee, 2011. The code uses BLAS and OpenMP.
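To make the model concrete, a hedged Python sketch of the second-order form (the C package's actual conventions may differ, e.g. in the sign of the exponent; the ridge penalty here stands in for the fuller set of regularizers listed above):

```python
import numpy as np

def mne_spike_prob(s, a, h, J):
    """Second-order maximum noise entropy model:
    P(spike | s) = 1 / (1 + exp(a + h.s + s.J.s)).
    The first-order model is the special case J = 0."""
    z = a + h @ s + s @ J @ s
    return 1.0 / (1.0 + np.exp(z))

def neg_log_likelihood(y, S, a, h, J, ridge=0.0):
    """Bernoulli negative log-likelihood over stimuli S (one per row),
    with an optional ridge penalty on the quadratic weights."""
    p = np.array([mne_spike_prob(s, a, h, J) for s in S])
    eps = 1e-12
    nll = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return nll + ridge * np.sum(J ** 2)

rng = np.random.default_rng(0)
d = 4
S = rng.standard_normal((20, d))              # 20 stimuli, 4 dimensions
y = (rng.random(20) < 0.5).astype(float)      # toy spike / no-spike labels
a, h, J = 0.0, np.zeros(d), np.zeros((d, d))  # all-zero weights => p = 0.5
nll = neg_log_likelihood(y, S, a, h, J, ridge=0.01)
```

Fitting means minimizing this objective over `a`, `h`, and `J`; the eigenvectors of the fitted `J` give the multiple receptive-field components.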

**Low-rank maximum noise entropy (Python):** low-rank second-order maximum noise entropy modeling of receptive fields.

[tutorial] [source]
**Description:** reconstruct the linear and quadratic feature spaces of a classifier when the feature space is poorly sampled. The *mner* package reduces the number of weights that need to be optimized, relative to full-rank second-order maximum noise entropy, through an explicit bilinear factorization of the quadratic weights and nuclear-norm regularization, and is related to the model described in Kaardal, Theunissen, & Sharpee, 2017. The software is written modularly for easy customization.
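The weight savings from the bilinear factorization can be sketched as follows (an illustration of the idea, not the *mner* package's API; the factor names `U` and `V` are assumptions):

```python
import numpy as np

def lowrank_quadratic(s, U, V):
    """Rank-r quadratic term s.T (U V.T) s, computed without ever
    forming the full d x d matrix J = U V.T."""
    return s @ (U @ (V.T @ s))

d, r = 100, 3
rng = np.random.default_rng(1)
U = rng.standard_normal((d, r))
V = rng.standard_normal((d, r))
s = rng.standard_normal(d)

full = s @ (U @ V.T) @ s        # same value via the explicit d x d matrix
low = lowrank_quadratic(s, U, V)

n_full = d * d                  # weights in the full-rank quadratic term
n_lowrank = 2 * d * r           # weights in the bilinear factorization
```

For a 100-dimensional stimulus at rank 3, that is 600 quadratic weights instead of 10,000, which is what makes the model tractable when the feature space is poorly sampled.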

**Functional bases (MATLAB/Octave):** compute functional inputs of neurons and other classifiers using Boolean operations.

[tutorial] [source]
**Description:** identify functional bases that describe the functional inputs of neurons by taking linear combinations of the receptive field components. The software assumes that the input activations are modeled by logistic functions and that the overall neural response computes Boolean operations on these input activations, as described in Kaardal, Fitzgerald, Berry, & Sharpee. This code is limited to logical OR and logical AND operations but can be easily extended to other operations.
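A hedged Python sketch of the model structure (probabilistic OR/AND on logistic input activations; the function names and the exact form of the combination rule are illustrative assumptions, not the package's implementation):

```python
import numpy as np

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

def neural_response(s, B, theta, op="OR"):
    """Input activations are logistic functions of basis projections B @ s;
    the overall response combines them with a soft Boolean operation."""
    p = logistic(B @ s - theta)          # one activation per functional input
    if op == "AND":
        return np.prod(p)                # all inputs must be active
    return 1.0 - np.prod(1.0 - p)        # OR: at least one input active

rng = np.random.default_rng(2)
B = rng.standard_normal((2, 5))          # two functional basis vectors (toy)
theta = np.zeros(2)                      # activation thresholds
s = rng.standard_normal(5)               # a stimulus
r_or = neural_response(s, B, theta, "OR")
r_and = neural_response(s, B, theta, "AND")
```

Extending to other Boolean operations only requires swapping the combination rule in `neural_response`, which is the sense in which the code is easily extended.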