ART (Adaptive Resonance Theory)

May 26 • General


The Adaptive Resonance Theory (ART) network is an unsupervised learning network, developed by Stephen Grossberg and Gail Carpenter in 1987. Adaptive resonance was developed to solve the problem of instability that occurs in feed-forward systems.

Fundamental Architecture:

To build an Adaptive Resonance Theory (ART) network, three groups of neurons are used. These include:

1. Input processing neurons (F1 layer) - The F1 layer consists of two portions: an input portion and an interface portion. The input portion performs some processing on the inputs it receives. The interface portion combines the signals from the input portion of F1 and from the F2 layer, and compares the similarity of the input signal with the weight vector of the cluster unit that has been selected as the unit for learning.

2. Clustering units (F2 layer).

3. Control mechanism (controls the degree of similarity of patterns placed on the same cluster).


ART is of two types: ART1, which is designed for clustering binary vectors, and ART2, which is designed to accept continuous-valued vectors.

Adaptive Resonance Theory1:

The ART1 network is designed for binary input vectors. The ART1 net consists of two fields of units, the input units (F1) and the output units (F2), along with a reset control unit that controls the degree of similarity of the patterns placed on the same cluster unit. There exist two sets of weighted interconnection paths between the F1 and F2 layers. Input patterns can be presented to an ART network in any order. In practice, the ART1 network can be implemented by analog circuits governed by differential equations. The ART1 network runs autonomously: it does not require any external control signals, and it can run stably with an unlimited stream of input patterns.

The ART1 network is trained using the fast learning method. It performs well with perfect binary input patterns, but it is sensitive to noise in the input data, so noisy data should be handled carefully.

ART1 Architecture:

It includes two types of units:

1. Computational units

2. Supplemental units

The computational units consist of the following:

i). Input units (F1 unit, with both input and interface portions).

ii). Cluster units (F2 unit, the output units).

iii). Reset control unit (controls the degree of similarity of patterns placed on the same cluster).
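The reset control unit's similarity test can be written as a one-line predicate. This is a sketch; the function name is an assumption, and the match ratio ||x||/||s|| against the vigilance parameter ρ comes from the ART1 training algorithm given later in this article:

```python
def vigilance_passed(x_norm, s_norm, rho):
    # Reset control: a pattern may stay on the selected cluster only
    # when the match ratio ||x|| / ||s|| reaches the vigilance rho.
    # (Illustrative helper; assumes s_norm > 0.)
    return x_norm / s_norm >= rho


# Example: with vigilance 0.7, a 3-of-4 match is accepted,
# a 2-of-4 match triggers a reset.
print(vigilance_passed(3, 4, 0.7))  # True
print(vigilance_passed(2, 4, 0.7))  # False
```

A higher ρ forces finer clusters (more resets); a lower ρ merges more patterns onto the same cluster unit.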


Training Algorithm for ART1 network:

Step 0: Initialize the parameters:

α > 1 and 0 < ρ <= 1

Initialize the weights:

0 < bij(0) < α/(α - 1 + n) and tij(0) = 1

Step 1: Perform Steps 2 to 13 while the stopping condition is false.

Step 2: Perform Steps 3 to 12 for each training input.

Step 3: Set the activations of all F2 units to zero. Set the activations of the F1(a) units to the input vector s.

Step 4: Calculate the norm of s:

||s|| = ∑ si

Step 5: Send the input signal from the F1(a) layer to the F1(b) layer:

xi = si

Step 6: For each F2 node that is not inhibited, the following rule should hold: if yj ≠ -1, then

yj = ∑ bij·xi

Step 7: Perform Steps 8 to 11 while reset is true.

Step 8: Find J such that yJ >= yj for all nodes j. If yJ = -1, then all the nodes are inhibited and this pattern cannot be clustered.

Step 9: Recalculate the activations x of F1(b):

xi = si·tJi

Step 10: Calculate the norm of vector x:

||x|| = ∑ xi

Step 11: Test for the reset condition.

If ||x||/||s|| < ρ, then inhibit node J (set yJ = -1) and go to Step 7 again.

Else, if ||x||/||s|| >= ρ, proceed to the next step.

Step 12: Update the weights for node J:

biJ(new) = α·xi / (α - 1 + ||x||)

tJi(new) = xi


Step 13: Test for the stopping condition. Any of the following may serve as a stopping condition:

i). no change in weights

ii). no reset of units

iii). maximum number of epochs reached.
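The steps above can be sketched in a few lines of Python. This is a minimal illustration of ART1 fast learning, not a reference implementation: the class name, the choice α = 2, and the initial bottom-up value 1/(1 + n) (which satisfies 0 < bij(0) < α/(α - 1 + n)) are assumptions made for the sketch, and NumPy is assumed available.

```python
import numpy as np

class ART1:
    """Minimal ART1 sketch with fast learning (binary inputs).

    Names follow the algorithm steps above. The defaults
    alpha=2 and b(0)=1/(1+n) are illustrative choices, not
    values prescribed by the algorithm.
    """

    def __init__(self, n_inputs, n_clusters, rho=0.7, alpha=2.0):
        self.rho = rho      # vigilance parameter, 0 < rho <= 1
        self.alpha = alpha  # alpha > 1
        # Bottom-up weights: 0 < b_ij(0) < alpha / (alpha - 1 + n)
        self.b = np.full((n_clusters, n_inputs), 1.0 / (1.0 + n_inputs))
        # Top-down weights: t_ij(0) = 1
        self.t = np.ones((n_clusters, n_inputs))

    def present(self, s):
        """Present one binary pattern; return the cluster index or None."""
        s = np.asarray(s, dtype=float)
        norm_s = s.sum()                    # Step 4: ||s|| = sum(s_i)
        inhibited = np.zeros(len(self.b), dtype=bool)
        while not inhibited.all():
            y = self.b @ s                  # Step 6: y_j = sum_i b_ij * x_i
            y[inhibited] = -1.0
            J = int(np.argmax(y))           # Step 8: winning F2 unit
            x = s * self.t[J]               # Step 9: x_i = s_i * t_Ji
            norm_x = x.sum()                # Step 10
            if norm_x / norm_s >= self.rho: # Step 11: vigilance test passed
                # Step 12: fast-learning weight update for node J
                self.b[J] = self.alpha * x / (self.alpha - 1.0 + norm_x)
                self.t[J] = x
                return J
            inhibited[J] = True             # reset: inhibit J, try next node
        return None                         # pattern cannot be clustered
```

For example, presenting [1, 1, 0, 0] twice places it on the same cluster, while a disjoint pattern such as [0, 0, 1, 1] fails the vigilance test against that cluster's prototype and recruits a fresh node.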

Adaptive Resonance Theory 2:

ART2 is designed for continuous-valued input vectors. The ART2 network is more complex than ART1 because much more processing is needed in the F1 layer. The ART2 network was designed to self-organize recognition categories for analog as well as binary input sequences. The continuous-valued inputs presented to the ART2 network may take two forms: the first is a "noisy binary" signal, and the second is "truly continuous" data.

The major difference between the ART1 and ART2 networks lies in the input layer. The input layer of the ART2 network requires a three-layer feedback system: a bottom layer where the input patterns are read in, a top layer where inputs coming from the output layer are read in, and a middle layer where the top and bottom patterns are combined to form a matched pattern, which is then fed back to the top and bottom input layers.

ART2 Architecture:

In the ART2 architecture, the F1 layer consists of six types of units, W, X, U, V, P and Q, and there are n units of each type. A supplemental unit N between the W and X units receives signals from all W units, computes the norm of the vector w, and sends this signal to each of the X units. Similarly, there exist supplemental units between U and V, and between P and Q, performing the same operation as between W and X. The connections between the Pi units of the F1 layer and the Yj units of the F2 layer carry the weighted interconnections, which multiply the signals transmitted over those paths.

The operations performed in the F2 layer are the same for both ART1 and ART2.


Training Algorithm for ART2 network:

Step 0: Initialize the parameters a, b, c, d, e, α, ρ, and θ. Also specify the number of training epochs (nep) and the number of learning iterations (nit).

Step 1: Perform Steps 2 to 12 nep times.

Step 2: Perform Steps 3 to 11 for each input vector s.

Step 3: Update the F1 unit activations:

ui = 0; wi = si; pi = 0; qi = 0;

xi = si / (e + ||s||); vi = f(xi)

Update the F1 unit activations again:

ui = vi / (e + ||v||); wi = si + a·ui;

pi = ui; xi = wi / (e + ||w||);

qi = pi / (e + ||p||); vi = f(xi) + b·f(qi)

In ART2 networks, norms are calculated as the square root of the sum of the squares of the respective values (the Euclidean norm).
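For contrast, the norm used in Step 4 of the ART1 algorithm above is the plain sum of the (binary) components, while ART2 uses the Euclidean norm, as the note above states. A short illustration (the variable names are arbitrary):

```python
import numpy as np

s_binary = np.array([1, 1, 0, 1])     # a binary ART1 input
v = np.array([0.5, -0.5, 1.0])        # a continuous ART2 vector

art1_norm = s_binary.sum()            # ART1: ||s|| = sum(s_i) -> 3
art2_norm = np.sqrt((v ** 2).sum())   # ART2: Euclidean norm, sqrt(1.5)

print(art1_norm)  # 3
print(art2_norm)  # 1.2247...
```

For binary vectors the sum simply counts the active components, which is why ART1 can get away with the cheaper norm.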

Step 4: Calculate the signals to the F2 units:

yj = ∑ bij·pi
Step 5: Perform steps 6 and 7 when reset is true.

Step 6: Find the F2 unit YJ with the largest signal (J is defined such that yJ >= yj, for j = 1 to m).

Step 7: Check for reset:

ui = vi / (e + ||v||); pi = ui + d·tJi; ri = (ui + c·pi) / (e + ||u|| + c·||p||)

If ||r|| < (ρ - e), then set yJ = -1 (inhibit J). Reset is true; perform Step 5.

If ||r|| >= (ρ - e), then wi = si + a·ui; xi = wi / (e + ||w||); qi = pi / (e + ||p||); vi = f(xi) + b·f(qi)

Reset is false. Proceed to Step 8.

Step 8: Perform Steps 9 to 11 for the specified number of learning iterations (nit).

Step 9: Update the weights for the winning unit J:

tJi = α·d·ui + [1 + α·d·(d - 1)]·tJi

biJ = α·d·ui + [1 + α·d·(d - 1)]·biJ

Step 10: Update the F1 activations:

ui = vi / (e + ||v||); wi = si + a·ui;

pi = ui + d·tJi; xi = wi / (e + ||w||);

qi = pi / (e + ||p||); vi = f(xi) + b·f(qi)

Step 11: Check the stopping condition for weight updates.

Step 12: Check the stopping condition for the number of epochs.
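The F1-layer settling of Step 3 can be sketched in Python. The activation function f is not specified in the steps above; a common ART2 choice, assumed here, is the piecewise-linear noise-suppression function f(x) = x for x >= θ and 0 otherwise. The function names and the two-sweep structure follow Step 3; everything else (parameter values, the helper's signature) is an illustrative assumption, and the F2 competition and weight updates are omitted.

```python
import numpy as np

def f(x, theta):
    # Noise-suppression activation: an assumed, commonly used choice;
    # components below the threshold theta are squashed to zero.
    return np.where(x >= theta, x, 0.0)

def f1_update(s, a, b, e, theta):
    """Sketch of the two F1-activation sweeps in Step 3.

    u, w, p, q, x, v follow the article's notation; norms are
    Euclidean, as the ART2 algorithm requires.
    """
    s = np.asarray(s, dtype=float)
    # First sweep: u, p, q start at zero.
    u = np.zeros_like(s)
    p = np.zeros_like(s)
    q = np.zeros_like(s)
    w = s.copy()
    x = s / (e + np.linalg.norm(s))
    v = f(x, theta)
    # Second sweep: feed the suppressed activity back through the layer.
    u = v / (e + np.linalg.norm(v))
    w = s + a * u
    p = u
    x = w / (e + np.linalg.norm(w))
    q = p / (e + np.linalg.norm(p))
    v = f(x, theta) + b * f(q, theta)
    return u, w, p, q, x, v
```

With a large gain a, the normalized x ends up with near-unit length, and input components below θ stay suppressed in v, which is the noise-cleaning role of the F1 layer described above.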
