...XGUI
X Graphical User Interface

...1.
Backpercolation 1 was developed by JURIK RESEARCH & CONSULTING, PO 2379, Aptos, CA, USA. Any and all SALES of products (commercial, industrial, or otherwise) that utilize the Backpercolation 1 process or its derivatives require a license from JURIK RESEARCH & CONSULTING. Write for details.

...units
In the following, the more common term "units" is used instead of "cells".

...unit.
The term transfer function often denotes the combination of activation and output function. To make matters worse, the term activation function is sometimes also used to refer to this combination.

...number
This number can change after saving but remains unambiguous. See also chapter gif.

...range
Mathematically correct would be the open interval (0, 1), but the values 0 and 1 are reached due to arithmetic inaccuracy.
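The saturation described above is easy to demonstrate: for large arguments the logistic function evaluates to exactly 1.0 or 0.0 in double precision, even though mathematically it never reaches either bound. A minimal sketch (the function name and the numerically safe formulation are mine, not SNNS code):

```python
import math

def logistic(x):
    """Logistic activation 1/(1+e^-x), written so that large
    negative arguments underflow to 0.0 instead of overflowing."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

print(logistic(40.0))    # 1.0 -- upper bound reached in floating point
print(logistic(-800.0))  # 0.0 -- lower bound reached in floating point
```

At x = 40, exp(-40) is already smaller than the machine epsilon of the 1.0 it is added to, so the quotient rounds to exactly 1.0.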

...layers
Changing it to 16 layers can be done very easily in the source code of the interface.

...font
On some systems the fonts 7x14 or 7x14bold are preferable.

...SSE
Sum Squared Error

...MSE
Mean Squared Error
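The two error measures abbreviated above can be sketched as follows. The summation over all patterns and output units is standard; the exact normalization used for MSE (here: SSE divided by the number of patterns) is an assumption, and the function names are mine:

```python
def sse(targets, outputs):
    """Sum Squared Error: total squared deviation, summed over
    all patterns and all output units."""
    return sum((t - o) ** 2
               for tvec, ovec in zip(targets, outputs)
               for t, o in zip(tvec, ovec))

def mse(targets, outputs):
    """Mean Squared Error: SSE divided by the number of patterns
    (the normalization is an assumption, not taken from SNNS)."""
    return sse(targets, outputs) / len(targets)

# Example: two patterns with two output units each
t = [[1.0, 0.0], [0.0, 1.0]]
o = [[0.9, 0.1], [0.2, 0.8]]
print(sse(t, o))  # approximately 0.10
print(mse(t, o))  # approximately 0.05
```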

...anymore
If a frozen display has to be redrawn, e.g. because an overlapping window was moved, it gets updated. If the network has changed since the freeze, its contents will also have changed!

...closed
The performance cost incurred by the graph window should be minimal.

...possible
SNNSv3.3 reads all pattern file formats, but writes only the new, flexible format. This way SNNS itself can be used as a conversion utility.

...values
C is the value read from line 0005.

...part
The F1 layer consists of three internal layers. See chapter gif.

...weight
Every mean vector of a class is represented by a class unit. The elements of these vectors are stored in the weights between the class unit and the input units.

...units
This case may be transformed into a network with an additional hidden unit for each input unit and a single connection with unity weight from each input unit to its corresponding hidden unit.
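The transformation described above amounts to inserting an identity weight matrix between the input units and the new hidden units. A sketch of that construction (the function name is mine, not part of SNNS):

```python
def unity_expansion(n_inputs):
    """Weight matrix for one additional hidden unit per input
    unit, each connected to its input unit by a single link of
    unity weight -- i.e. the identity matrix."""
    return [[1.0 if i == j else 0.0 for j in range(n_inputs)]
            for i in range(n_inputs)]

# Each row connects one input unit to its corresponding hidden unit.
print(unity_expansion(3))  # [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```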

...step
If only an upper bound n for the number of processing steps is known, the input patterns may consist of windows containing the current input pattern together with a sequence of the previous n-1 input patterns. The network then develops a focus on the sequence element in the input window corresponding to the best number of processing steps.
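The windowing scheme above can be sketched as follows; the function name and the policy of skipping the first n-1 time steps (instead of padding) are my assumptions:

```python
def windowed_patterns(patterns, n):
    """Build input windows of length n: each window holds the
    current input pattern followed by the previous n-1 patterns,
    concatenated into one input vector.  The first n-1 time
    steps are skipped (padding would be an alternative)."""
    windows = []
    for t in range(n - 1, len(patterns)):
        window = []
        for k in range(n):          # current pattern first, then history
            window.extend(patterns[t - k])
        windows.append(window)
    return windows

seq = [[1], [2], [3], [4]]
print(windowed_patterns(seq, 2))  # [[2, 1], [3, 2], [4, 3]]
```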

...layer.
The candidate units are realized as special units in SNNS.

...correctly
This is only important for the chosen realization of the ART1 learning algorithm in SNNS.

...mapped
Different ART_a classes may be mapped onto the same category.

...vector
c will be used as the index of the winning unit in the competitive layer throughout this text.

...neighborhood
Neighborhood is defined as the set of units within a certain radius of the winner. So N_1 would be the eight direct neighbors in the 2D grid; N_2 would be N_1 plus the 16 next closest; etc.

...detail
For any comments or questions concerning the implementation of an autoassociative memory, please contact Jamie DeCoster at jamie@psych.purdue.edu.

...generalization
Generalization: the ability of a neural net to recognize unseen patterns (the test set) after training.

...10pm
This construction is necessary since 'at' can read only from stdin.

...IO-type
The term T-type was changed to IO-type after completion of the kernel.

...main-event-dispatch-loop
SNNS-XGUI interprets this as an application.

...variable
Otherwise, every widget would need its own local variables.

Niels Mache
Wed May 17 11:23:58 MET DST 1995