To further understand the implementation of the hyperparameter re-estimation technique in Bayesian hierarchical models, we added two more prior assumptions over the weights in BayesPI, namely the Laplace prior and the Cauchy prior, using the evidence approximation method. In addition, we divided the hyperparameters (the regularization constants of the model) into multiple distinct classes based on either the structure of the neural networks or the property of...
Again, I'm uploading my homework...
This is a GUI that lets you load images and train a Hopfield network on them. You can run the network on other images (or add noise to the same image) and see how well it...
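The item above describes Hopfield-network training and recall. A minimal sketch of the idea in Python (an illustrative assumption, not the GUI's actual code): store bipolar patterns with the Hebbian outer-product rule, then recall from a noisy probe by repeated sign updates.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product storage of +1/-1 patterns (rows)."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n     # Hebbian weight matrix
    np.fill_diagonal(W, 0)            # no self-connections
    return W

def recall(W, probe, steps=10):
    """Iterate synchronous sign updates from a (possibly noisy) probe."""
    s = probe.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1                 # break ties deterministically
    return s

rng = np.random.default_rng(0)
pattern = np.sign(rng.standard_normal(64))   # one random 64-"pixel" image
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[:5] *= -1                               # flip 5 pixels as noise
restored = recall(W, noisy)
```

With a single stored pattern and a few flipped bits, one update step already restores the original.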
Simple tutorial on pattern recognition using back-propagation neural networks. The program has 3 classes with 3 images per class.
This model shows the design of a sun-seeker control system using a neural network model, built with the Neural Network Toolbox and Simulink in MATLAB.
errperf(T,P,M) uses T and P, which are target and prediction vectors respectively, and returns the value for M, which is one of several error-related performance metrics.
T and P can be row or column vectors of the same size. M can be...
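A hypothetical Python analogue of the errperf interface described above; the metric codes here ('mae', 'mse', 'rmse', 'mare') are illustrative assumptions, not the function's actual list.

```python
import numpy as np

def errperf(T, P, M):
    """Return one error metric between target T and prediction P.

    T and P may be row or column vectors of the same size; they are
    flattened before comparison, mirroring the description above.
    """
    T = np.ravel(T).astype(float)
    P = np.ravel(P).astype(float)
    e = T - P
    if M == 'mae':
        return np.mean(np.abs(e))           # mean absolute error
    if M == 'mse':
        return np.mean(e**2)                # mean squared error
    if M == 'rmse':
        return np.sqrt(np.mean(e**2))       # root mean squared error
    if M == 'mare':
        return np.mean(np.abs(e / T))       # mean absolute relative error
    raise ValueError(f"unknown metric {M!r}")

T = np.array([1.0, 2.0, 3.0])
P = np.array([1.5, 2.0, 2.5])
```

For example, `errperf(T, P, 'mae')` averages the absolute residuals 0.5, 0, 0.5.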
This is a nonlinear system of differential and algebraic equations that describes the dynamics of a continuous bioreactor. The bioreactor simulates the production of ethanol through yeast fermentation. The model is described in:
Z. K....
-Compatible with pre-2010 versions of MATLAB and the Neural Network Toolbox -Trains one perceptron for the spring and one for the damper. -Runs a simulation with a forcing function and noise. -If you don't have the toolbox, you can still use...
Annt Trainer: graphically teach a neural net to avoid obstacles.
The Kalman filter can be interpreted as a feedback approach to minimizing the least-square error. It can be applied to solve nonlinear least-squares optimization problems. This function provides a way of using the unscented Kalman filter to solve...
This demo shows some examples of image pre-processing before the recognition stage. The first example is just some image-processing commands that are commonly used for preprocessing, while the second example shows how to use the simple...
It is a single-layer, single-neuron network for classifying linearly separable data. It implements Rosenblatt's perceptron, the first neural network learning algorithm.
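A sketch of Rosenblatt's perceptron rule as described in the item above (illustrative Python, not the uploaded MATLAB code): on each misclassified sample, nudge the weights toward the correct side of the separating hyperplane.

```python
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=100):
    """Perceptron learning for labels y in {-1, +1}; bias folded into w."""
    X = np.hstack([X, np.ones((len(X), 1))])   # append constant bias input
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:             # misclassified or on boundary
                w += lr * yi * xi              # Rosenblatt update
                errors += 1
        if errors == 0:                        # converged: all points correct
            break
    return w

# Linearly separable toy data: logical AND with bipolar targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])
w = train_perceptron(X, y)
pred = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
```

By the perceptron convergence theorem, the loop terminates on separable data and the final weights classify every sample strictly correctly.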
RubyFann: bindings to use FANN (Fast Artificial Neural Network) from within a Ruby/Rails environment. Requires: Ruby 1.8.6 or greater; GNU make tools or equivalent for the native code in ext. To install: sudo gem install ruby-fann
In this article, the procedure for using the MATLAB Neural Network Toolbox to identify and control a hypothetical nonlinear plant is explained in enough detail to help anybody interested in the issue know what to do and how to do a...
This code implements the basic back-propagation of error learning algorithm. The network has tanh hidden neurons and a linear output neuron, and is applied to predicting y = sin(2*pi*x1)*sin(2*pi*x2). We didn't use any features of the Neural Network Toolbox.
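A from-scratch backprop sketch of the setup the item above describes (a Python assumption, not the uploaded code): one tanh hidden layer, a linear output, squared-error loss, batch gradient descent on y = sin(2*pi*x1)*sin(2*pi*x2). The layer width, learning rate, and iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (200, 2))
y = np.sin(2 * np.pi * X[:, 0]) * np.sin(2 * np.pi * X[:, 1])

H = 20                                     # hidden units (assumed)
W1 = rng.standard_normal((2, H)) * 0.5
b1 = np.zeros(H)
W2 = rng.standard_normal(H) * 0.5
b2 = 0.0
lr = 0.02                                  # learning rate (assumed)

def forward(X):
    h = np.tanh(X @ W1 + b1)               # tanh hidden layer
    return h, h @ W2 + b2                  # linear output neuron

losses = []
for _ in range(1000):
    h, out = forward(X)
    err = out - y
    losses.append(np.mean(err**2))
    # Backprop: output-layer gradients, then push error through tanh.
    gW2 = h.T @ err / len(X)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h**2)    # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

The training loss should fall steadily as the hidden layer shapes itself to the product-of-sines surface.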
A Ruby extension that provides a 2-Layer Back Propagation Neural Network, which can be used to categorize datasets of arbitrary size. The network can be easily (re-)stored to/from the hard disk.
Simple MATLAB code for the neural network Hebb learning rule. It is good for NN beginner students. It can be applied to simple tasks, e.g. logic "and", "or", "not" and simple image classification.
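The Hebb rule the item above mentions can be sketched on logic AND with bipolar inputs and targets (illustrative Python; the uploaded code is MATLAB). The update is simply w += x * t for every training pair.

```python
import numpy as np

# Bipolar (+1/-1) encoding of logical AND: output +1 only for (1, 1).
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
t = np.array([-1, -1, -1, 1], dtype=float)

w = np.zeros(2)
b = 0.0
for xi, ti in zip(X, t):
    w += xi * ti          # Hebbian weight update: correlate input and target
    b += ti               # bias treated as a weight on a constant +1 input
out = np.sign(X @ w + b)
```

One pass over the four patterns yields w = (2, 2), b = -2, which reproduces AND exactly.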
This code, realized in collaboration with Robert Thijs Kozma, implements a simple and powerful spiking model proposed by Eugene Izhikevich in 2003 (http://www.izhikevich.org/publications/spikes.pdf). This mathematical model is the most recent of a...
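The Izhikevich (2003) model referenced above is compact enough to state directly: v' = 0.04v² + 5v + 140 - u + I and u' = a(bv - u), with the reset v ← c, u ← u + d whenever v ≥ 30 mV. A minimal Euler simulation sketch (the step size and input current are assumptions; parameters are the paper's "regular spiking" values):

```python
# Izhikevich "regular spiking" parameters from the 2003 paper.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v = -65.0                     # membrane potential (mV)
u = b * v                     # recovery variable
dt, I = 0.25, 10.0            # ms time step and constant input (assumed)

spikes = 0
for _ in range(int(1000 / dt)):            # simulate 1000 ms
    v += dt * (0.04 * v**2 + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                          # spike detected:
        v, u = c, u + d                    # reset membrane, bump recovery
        spikes += 1
```

With a sustained input of I = 10 the neuron fires a regular spike train over the one-second window.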
The extended Kalman filter can not only estimate the states of nonlinear dynamic systems from noisy measurements, but can also be used to estimate the parameters of a nonlinear system. A direct application of parameter estimation is to train artificial...
This little package contains a Parzen neural network classifier that can classify data between N classes in D dimensions. The classifier is really fast and simple to learn. Good classification performance can be obtained for a certain class of...
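A Parzen-window classifier sketch in the spirit of the item above (assumed Gaussian kernel and bandwidth; not the packaged code): estimate each class density with a kernel sum over that class's training samples, then assign each test point to the class with the highest estimated density.

```python
import numpy as np

def parzen_classify(Xtrain, ytrain, Xtest, h=0.5):
    """Classify Xtest by per-class Gaussian Parzen density estimates."""
    classes = np.unique(ytrain)
    scores = []
    for k in classes:
        Xk = Xtrain[ytrain == k]
        # Squared distances from each test point to every class-k sample.
        d2 = ((Xtest[:, None, :] - Xk[None, :, :])**2).sum(-1)
        scores.append(np.exp(-d2 / (2 * h**2)).mean(axis=1))
    return classes[np.argmax(np.stack(scores), axis=0)]

rng = np.random.default_rng(2)
A = rng.normal(0.0, 0.3, (30, 2))       # class 0: cluster near the origin
B = rng.normal(3.0, 0.3, (30, 2))       # class 1: cluster near (3, 3)
Xtr = np.vstack([A, B])
ytr = np.repeat([0, 1], 30)
pred = parzen_classify(Xtr, ytr, np.array([[0.1, -0.1], [2.9, 3.2]]))
```

There is no training step at all: the samples themselves are the model, which is why such classifiers are fast to build.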
The Kalman filter is actually a feedback approach to minimizing the estimation error in terms of the sum of squares. This approach can be applied to general nonlinear optimization. This function shows a way of using the extended Kalman filter to solve some...
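The EKF-as-least-squares idea in the item above can be sketched on a toy parameter-estimation problem (an illustrative Python assumption, not the uploaded function): treat the unknown parameter as a static state and process the residuals one measurement at a time. Here we estimate theta in the model y = exp(theta * x).

```python
import numpy as np

def ekf_fit(xs, ys, theta0=0.0, P=10.0, R=0.01):
    """Recursive EKF estimate of theta in y = exp(theta * x).

    theta0, P (initial variance) and R (measurement noise) are assumed
    tuning values; the 'state' theta has no dynamics, so there is no
    predict step.
    """
    theta = theta0
    for x, y in zip(xs, ys):
        Hj = x * np.exp(theta * x)        # Jacobian d f / d theta
        S = Hj * P * Hj + R               # innovation variance
        K = P * Hj / S                    # Kalman gain
        theta += K * (y - np.exp(theta * x))   # correct with the residual
        P = (1 - K * Hj) * P              # shrink parameter uncertainty
    return theta

xs = np.linspace(0.1, 1.0, 50)
ys = np.exp(0.7 * xs)                     # noiseless data, true theta = 0.7
theta_hat = ekf_fit(xs, ys)
```

Each measurement plays the role of one residual in a sum-of-squares objective, so a single pass over the data already brings the estimate close to the true value.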