All examples for NN
#====================================================
*call nn net
nodes input 3 -0.5 0.5 in-layer
nodes output 2 -0.5 0.5 out-layer
links active 0 0 in-layer out-layer
# This call creates an ANN named "net".
# The input layer is called "in-layer" and the output layer
# "out-layer". Response functions are sigmoidal (the default), and the
# starting biases are random values uniformly distributed between
# -0.5 and 0.5.
# The "links" line connects these layers with active weights
# (adjustable during training). The initial values of the weights are all 0.
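The network defined above can be sketched in plain NumPy to show what the script sets up. This is an illustrative sketch, not the tool's own implementation: the names (`weights`, `biases`, `forward`) are assumptions, and the seed is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# 3 input nodes -> 2 output nodes, sigmoidal response functions.
n_in, n_out = 3, 2

# Active (trainable) weights start at 0, per the "links" line.
weights = np.zeros((n_in, n_out))

# Starting biases drawn uniformly from [-0.5, 0.5], per the node lines.
biases = rng.uniform(-0.5, 0.5, size=n_out)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x):
    # One layer pair: linear combination plus bias, then sigmoid.
    return sigmoid(x @ weights + biases)

y = forward(np.array([1.0, 0.5, -0.25]))
```

With all weights still at 0, the output depends only on the biases, so `y` equals `sigmoid(biases)` regardless of the input.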