Hopfield Network

A discrete Hopfield network is a collection of binary neurons, each fully connected to every neuron but itself. The Hopfield network exists as a subnetwork in Simbrain because it updates asynchronously, unlike the synchronous updating of a standard network. Hopfield networks must be updated asynchronously in order to be guaranteed to converge on a stable state.

The synapses in a Hopfield network are trained using the Hebb rule. Note that if you open these synapses you will see that they are clamped; the synapse values are changed by the Hopfield subnetwork itself. To train a Hopfield network, set the neurons to some desired pattern and select "Train" from the subnetwork tab's context menu. You may want to begin by clearing the network.
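For readers who want to see the rule itself, the following is a minimal sketch in plain NumPy (not Simbrain code) of Hebbian training for a Hopfield weight matrix on bipolar (+1/-1) patterns; the function and variable names are illustrative only.

    import numpy as np

    def hebb_train(patterns):
        # Build a symmetric Hopfield weight matrix from bipolar (+1/-1) patterns.
        n = patterns.shape[1]
        w = np.zeros((n, n))
        for x in patterns:
            w += np.outer(x, x)      # Hebb rule: add x_i * x_j to w_ij for each pattern
        np.fill_diagonal(w, 0)       # no self connections
        return w

    # Example: store two 4-unit patterns.
    patterns = np.array([[1, -1,  1, -1],
                         [1,  1, -1, -1]])
    weights = hebb_train(patterns)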

To create a continuous Hopfield network, see below.

Initialization

Hopfield networks are initialized with a specified number of units, which are laid out in a grid by default. They are fully interconnected with no self-connections.

Parameters

Number of Neurons: Enter the desired number of neurons in this field.

Update Order: This can be set to random or sequential. If set to random, the neurons are updated in random order; this is the standard assumption of Hopfield networks. If sequential is used, neurons are updated in the same sequence each time, making it possible to reproduce chains of behavior.

By Priority: When sequential updating is used, update the neurons in order of their priority.

Shuffle Order: Shuffles the sequential update order; the order is randomized once and then reused on every update.

In more detail, there are three possible update cases (a code sketch of all three follows the list):

1) Synchronous: all nodes are updated at once. The result does not depend on the order in which nodes are updated, but this method sometimes produces oscillations.

2) Sequential: nodes are updated one at a time, so order matters. This is more stable than synchronous updating. The order can either follow the priority [link] of the nodes, or be a random order that is fixed once at initialization and reused every time.

3) Random: nodes are updated in random order. This method was important in the history of Hopfield networks and is related to Boltzmann machines.
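As an informal illustration of the three cases, the following sketch (again plain NumPy, not Simbrain's implementation) applies one network update in each mode, assuming a weight matrix such as the one built by the hypothetical hebb_train function above.

    import numpy as np

    def update_synchronous(w, state):
        # All nodes updated at once; order-independent but may oscillate.
        return np.where(w @ state >= 0, 1, -1)

    def update_sequential(w, state, order):
        # Nodes updated one at a time in a fixed order (e.g. by priority,
        # or an order shuffled once at initialization and then reused).
        s = state.copy()
        for i in order:
            s[i] = 1 if w[i] @ s >= 0 else -1
        return s

    def update_random(w, state, rng=None):
        # Nodes updated one at a time in a fresh random order on each call.
        rng = rng if rng is not None else np.random.default_rng()
        return update_sequential(w, state, rng.permutation(len(state)))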

Right Click Menu

Add Current Pattern To Input Data

Adds the current pattern of neuron activations to the training (input) data.

Randomize Synapses Symmetrically

Randomizes the synaptic weights symmetrically, so that the weight from neuron a to neuron b equals the weight from b to a.
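One way to picture this operation (a hypothetical sketch, not Simbrain's implementation): draw random weights, average the matrix with its transpose to make it symmetric, and clear the diagonal.

    import numpy as np

    def randomize_symmetric(n, rng=None):
        # Random symmetric weight matrix with no self connections.
        rng = rng if rng is not None else np.random.default_rng()
        a = rng.uniform(-1, 1, size=(n, n))
        w = (a + a.T) / 2            # enforce w[i, j] == w[j, i]
        np.fill_diagonal(w, 0)       # no self connections
        return w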

Set Weights To Zero

Sets all synaptic strengths to zero.

Edit/Train Hopfield

Opens a dialog for editing and training the Hopfield network.

Train

Trains the Hopfield network using the Hebb rule so that it learns the current pattern of activity across its nodes.
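To see what training accomplishes, here is a hypothetical usage example built on the earlier sketches: store one pattern with the Hebb rule, then recover it from a corrupted probe by repeatedly updating.

    import numpy as np

    pattern = np.array([[1, -1, 1, -1, 1, -1]])
    w = hebb_train(pattern)                    # "train" on the current pattern

    probe = np.array([1, -1, -1, -1, 1, 1])    # stored pattern with two bits flipped
    for _ in range(5):
        probe = update_random(w, probe)
    print(probe)                               # should settle back to the stored pattern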

Continuous Hopfield Networks

To create a continuous Hopfield network, use a set of Additive neurons in a standard network. Connect them appropriately and train them using Hebbian synapses: clamp all the neurons, iterate to train the synapses, and then clamp all the weights. On clamping, see the toolbar documentation.
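For reference, the following is a minimal sketch of one common formulation of continuous Hopfield dynamics, using a tanh output function and Euler integration; it illustrates the general idea and is not necessarily the exact equation used by Simbrain's Additive neuron.

    import numpy as np

    def continuous_step(w, x, dt=0.1, leak=1.0, inputs=None):
        # One Euler step of dx/dt = -leak * x + w @ tanh(x) + external input.
        if inputs is None:
            inputs = np.zeros_like(x)
        return x + dt * (-leak * x + w @ np.tanh(x) + inputs)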

Also see

See here. Random update is discussed there; it is related to Boltzmann machines.