A set of neural network routines was developed for this thesis and for other related work. As a result, the routines have a generic architecture that suits them to a wide range of tasks. With them it is possible to:
- use an arbitrary number of neurons
- define connections, inputs and outputs for a neural network architecture
- use an arbitrary number of networks concurrently
- use various activation functions
- apply backpropagation for learning
- store and retrieve message strings associated with the defined neural network
- store and retrieve the neural network from an ASCII file
- perform basic operations such as setting network inputs, getting network outputs, setting expected output values, and getting and setting connection weights.
The neural networks are stored by the subroutines as linked lists. There are lists for neurons, connections, inputs and outputs. Since the neurons are kept in lists, they are referenced by number, from 0 up to the highest neuron defined.
To create a network, the connections between neurons are defined. A connection to a neuron inherently implies that a neuron is part of the network. Thus, when a connection is defined, the neuron list is updated, and two connections are added to the connection list. One connection points forward from one neuron to another, and the other connection points backward from the second neuron to the first (for training). These connections are assigned the same random weight between -0.2 and 0.2.
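For illustration, a minimal network declaration might look as follows (a sketch using the subroutine names described below; the neuron numbers are arbitrary):

net_init(SIGMOID);              /* empty the lists, choose activation */
net_arc_define(0, 2);           /* each call creates a forward and a */
net_arc_define(1, 2);           /* backward connection with one shared */
net_arc_define(2, 3);           /* random weight between -0.2 and 0.2 */
def_input(0);                   /* neurons 0 and 1 feed the network */
def_input(1);
def_output(3);                  /* neuron 3 reports the result */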
Each neuron holds a pointer to the first connection in a linked list of its input connections from predecessor neurons. The neuron also holds a pointer to the first connection in the linked list of connections carrying its output to successor neurons. Thus, the inputs and outputs of every neuron are readily accessible.
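A neuron's input connections can thus be visited with a simple list walk (a sketch consistent with the structures in the listing below; the neuron number n is hypothetical):

int point;
point = node[n].point;          /* first input connection of neuron n */
while(point != LAST){
    /* syn[point].neuron is the predecessor neuron and */
    /* syn[point].weight is the connection weight */
    point = syn[point].point;   /* follow the list to the next input */
}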
It is necessary to define which neurons serve as inputs and which serve as outputs. This is done by declaring a neuron to be one or the other; the declared neuron is then added to a list of input or output neurons. Since there are multiple networks, there is a set of input and output lists for each network, even though the networks may share common neurons. The network currently selected determines which lists the inputs and outputs are defined in. When a network is selected, all solutions and training happen according to its particular lists of inputs and outputs.
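For example, two concurrent networks sharing the same neuron numbering might be set up as follows (a sketch; the neuron numbers are hypothetical):

net_number(0);                  /* select network 0 */
def_input(0);                   /* fill its input/output lists */
def_output(5);
net_number(1);                  /* select network 1 */
def_input(0);                   /* networks may share neurons */
def_output(9);
net_number(0);                  /* solutions and training now */
                                /* use network 0's lists */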
When a network is stored, all values are written in ASCII. This makes the files completely portable between computing platforms, although quite bulky. Within these files it is possible to store some data of the user's choice. This data is described with message strings. There are a number of slots into which these message strings may be written, and from which they may be read. Used appropriately, a message string may even store a numerical value that has been converted to a string.
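For instance, a numerical value might travel with the network file as follows (a sketch; message slot 0, the value, and the file name "example.net" are arbitrary choices):

char buffer[LEN_DATA];
double value;
sprintf(buffer, "%f", 0.75);    /* convert the value to a string */
net_put_string(0, buffer);      /* keep it in message slot 0 */
net_write("example.net");       /* string is saved with the net */
net_read("example.net");        /* later, perhaps elsewhere */
net_get_string(0, buffer);      /* recover the string */
value = atof(buffer);           /* and convert back to a number */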
The neural network activation function is defined as a global variable. Thus, when initializing the network, this value may be set to choose the activation function for every neuron. Available activation functions include the sigmoid, g(x) = 1/(1 + e^-x); the "new sigmoid", g(x) = 1/(1 + e^-x) - 0.5; the hyperbolic tangent; and a linear function.
Solution of the network requires that a set of inputs be applied and the network be evaluated for those values. The results may then be read from the output neurons. When training, the network is first solved for a set of inputs, the expected outputs are declared, and then the backpropagation algorithm is applied. The weight changes may occur immediately, or later.
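A single solution and training step therefore follows the pattern below (a sketch; neurons 0 and 3, and the learning and smoothing rates, are arbitrary stand-ins):

double y;
net_input(0, 0.9);              /* apply an input value */
net_solve();                    /* forward propagate */
y = net_output(3);              /* read the network's answer */
net_expect(3, 0.1);             /* declare the desired answer */
net_back_prop(0.5, 0.9);        /* compute the weight changes */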
There are two modes for training the neural network. Weights may be updated by a small change for every training vector (point training), or by a single change for an entire set of training vectors (set training). If the network is to be trained for each vector, then the backpropagation subroutine need only be aware that point training is in use. If the neural networks are to use set training, then they must be instructed to collect weight changes, and then to apply the average of these once the entire set of training vectors has been tried.
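Under set training, a pass over the whole training set might therefore look as follows (a sketch; the training table in[]/out[], its size n, and the rates are hypothetical):

double in[4], out[4];           /* hypothetical training table */
int k, n = 4;
updater = SET_TRAIN;            /* collect changes, apply later */
for(k = 0; k < n; k++){         /* one pass over the whole set */
    net_input(0, in[k]);
    net_solve();
    net_expect(3, out[k]);
    net_back_prop(0.5, 0.9);    /* changes are accumulated */
}
net_update();                   /* apply the averaged changes */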
As a point of interest, it should be noted that the subroutines actually produce two lists of neuron precedence before forward or backward propagation occurs. One list keeps track of the firing order of the neurons during forward propagation. The second tracks the firing order for backward propagation of errors during training. These lists are built so as to ensure input completeness before a neuron is processed. This speeds processing by eliminating the need to reevaluate the network connections on every forward or backward pass.
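The idea is a repeated sweep that fires any waiting neuron whose predecessors are all complete (a sketch of the principle only; all_inputs_used() is a hypothetical helper, and net_prep() in the listing below is the real implementation):

int i, changed;
do{
    changed = 0;
    for(i = 0; i <= point_node; i++){
        if((marker[i] == WAITING) && all_inputs_used(i)){
            marker[i] = USED;               /* neuron may now fire */
            work[num][work_pnt[num]++] = i; /* record firing order */
            changed = 1;
        }
    }
} while(changed);   /* stop when a sweep adds nothing new */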
In the interest of speed there are very few error checks in the subroutines; care should therefore be taken in all aspects of network declaration and application.
net_demo(): A program which demonstrates the use of the neural network subroutines by applying them to the Exclusive Or problem. The network is loaded, or defined, then trained, and tested. This routine uses another subroutine for training.
net_quick(): A subroutine used by net_demo() for the sake of brevity when training.
net_init(): This routine will clear all of the lists and pointers, so as to present an empty storage space. This routine expects the network activation function to be indicated at this time.
net_arc_define(): will define the connection between two neurons and assign a random weight to the connection.
def_input(): defines a network input neuron.
def_output(): defines a network output neuron.
net_write(): will write a currently defined neural network to an ASCII file, for retrieval later.
net_read(): will read a previously saved neural network file. If the file is not available then an error code will be returned.
net_put_string(): a tool for allowing the user to append their own messages to the neural network file, to be saved with the ASCII file.
net_get_string(): a tool for recovering user-defined strings that were loaded from a saved neural network file.
net_number(): Allows the user to change the neural network input/output lists being used. The default is zero, so this need not be called unless concurrent networks are required.
net_input(): defines a numerical value for a network input.
net_solve(): will do a forward propagation pass of all inputs to the neural network, and produce a set of outputs.
net_output(): returns a numerical output, resulting from a forward pass of input values.
net_expect(): allows definition of an expected output value for the neural network.
net_back_prop(): will use supplied learning and smoothing rates to determine weight changes. If point training is being used, the weight changes will be made immediately, otherwise net_update() will have to be called after the entire set has been applied, and changes are to be made.
net_update(): will make weight changes resulting from a number of applications of the backpropagation algorithm under set training. If point training is being used this algorithm will have no effect.
net_set_weight(): This will allow the user to alter the weight value between two arbitrary neurons in the neural network.
net_weight(): This function will return the connection weight between any two neurons.
G_INVERSE(): The inverse of the activation function.
G_DIF(): The derivative of the activation function.
net_prep(): will create lists for forward, and backward propagation. The lists will determine the neuron firing orders, based upon neuron input/output completeness.
updater: a global variable that determines whether point or set training is used. It should be set to SET_TRAIN for set training, and to POINT_TRAIN for point training.
* This package of routines will serve as a neural network simulator. At
* present the routines use feedforward neural networks, with the
* Backpropagation learning algorithm [Rumelhart et al., 1986]. There are
* a number of features available which may be explored by looking at the
* subroutine descriptions.
* These routines use a fairly sophisticated data structure (based on lists)
* to improve processing speed for the network, while maintaining flexibility.
#include <stdio.h>  /* file and console I/O */
#include <stdlib.h> /* rand(), atof(), atoi() */
#include <string.h> /* strcpy() */
#include <math.h>   /* exp(), log(), fabs() */
#define EMPTY -1 /* to indicate unused */
#define USED 1 /* to indicate used */
#define WAITING 2 /* to indicate unprocessed */
#define POINT_TRAIN 3 /* for neural network point training */
#define SET_TRAIN 4 /* for neural network set training */
#define LAST -1 /* Condition Codes for internal flags */
#define NO_ERROR 0 /* indicates state ok */
#define ERROR 1 /* indicates a big problem */
#define SIGMOID 1 /* Activation Function Types */
#define NEW_SIGMOID 2 /* (value assumed; referenced by G() etc. below) */
#define NEURONS 300 /* Neuron count is not dynamic */
#define NUM_DATA 40 /* Maximum number of message strings */
#define LEN_DATA 50 /* maximum length of data strings */
#define NUMBER_NETS 7 /* maximum number of concurrent networks */
#define UPDATER updater /* updater determines the training method */
* Definitions of functions contained within.
struct nodes{                           /* Information about one Neuron */
    int point, forward_point;           /* first input / output connection */
    double sum, out, expect, delta;     /* input sum, output, expected, error */
};
struct connections {                    /* Info about inter-neuron connections */
    int point, neuron;                  /* next connection; neuron pointed at */
    double weight, last_weight, sum;    /* weight, last change, set-training sum */
};
struct nodes node[NEURONS];             /* define storage list for neurons */
struct connections syn[NEURONS*NEURONS/2]; /* define connection list */
char net_strings[NUM_DATA][LEN_DATA]; /* define message list */
int activation_type, /* stores activation type */
sum_cnt[NUMBER_NETS], /* stores set training data */
num, /* The number of the current network */
num_nets, /* number of parallel networks allowed */
point_arc, /* pointer to last spot in connection list */
point_node, /* pointer to last spot in neuron list */
updater = SET_TRAIN, /* Network training type */
pnt_in[NUMBER_NETS], /* Pointer to start of input list for nets */
pnt_out[NUMBER_NETS], /* pointer to start of output list for nets */
in_node[NUMBER_NETS][NEURONS], /* lists of input nodes for nets */
out_node[NUMBER_NETS][NEURONS], /* lists of output nodes for nets */
work_pnt[NUMBER_NETS], /* Pnt to first spot in fwd prop order list */
work[NUMBER_NETS][NEURONS], /* Forward Propagation order list */
back_pnt[NUMBER_NETS], /* Pnt to first spot in back prop order list */
back[NUMBER_NETS][NEURONS], /* Backward propagation order list */
marker[NEURONS]; /* work array for connection ordering */
* This is a set of neural network subroutines, and an example
* program that will use them. The test is based upon simulating the
* Exclusive Or gate problem described in [Minsky and Papert, 1969]
* and [Rumelhart et al., 1986]. This is a classic example of the
* use of a hidden layer to deal with a problem which is linearly
* inseparable.
static char aa[20]; /* Define a Work Variable */
static int i, j; /* define some work variables */
static double aaa, rrr, want; /* smoothing rate, learning rate, error sum */
updater = POINT_TRAIN; /* use point training for network */
net_init(SIGMOID); /* init net with sigmoid activation*/
printf("\nSmoothing Rate (eg. 0.8) :");
aaa = atof(aa); /* get a smooting rate */
printf("\nLearning Rate (eg. 0.93) :");
rrr = atof(aa); /* get a learning rate */
printf("\nNumber of Iterations (eg. 500) :");
i = atoi(aa); /* get number of training iterations */
if(net_read("eor.net") == ERROR){ /* Load network if on disk */
net_arc_define(0, 3); /* if not build a new network */
net_arc_define(1, 3); /* first define all inter-neuron */
net_arc_define(2, 3); /* connections. Neuron 2 is a bias */
net_arc_define(3, 6); /* input. Neurons 0 and 1 are the */
net_arc_define(0, 4); /* EOR inputs, and neuron 6 is the */
net_arc_define(1, 4); /* EOR output. Neurons 3, 4, 5, 7 */
net_arc_define(2, 4); /* are the hidden layer */
def_input(0); /* indicate input neurons */
def_output(6); /* indicate output neuron */
net_input(2, 0.5); /* input some constant bias value */
for(j = 0; j < i; j++){ /* do 'i' training iterations */
net_quick(0.9, 0.9, 0.1, rrr, aaa);
net_quick(0.9, 0.1, 0.9, rrr, aaa);
net_quick(0.1, 0.1, 0.1, rrr, aaa);
net_quick(0.1, 0.9, 0.9, rrr, aaa);
net_update(); /* will update if set training used */
want = 0.0; /* reset error sum */
net_input(0, 0.1); /* Test for case 1 */
net_input(1, 0.1);
net_solve();
want += fabs(net_output(6) - 0.1); /* accumulate error for case 1 */
printf("output for 0, 0, %f \n", net_output(6));
net_input(0, 0.9); /* Test for case 2 */
net_input(1, 0.1);
net_solve();
want += fabs(net_output(6) - 0.9); /* accumulate error for case 2 */
printf("output for 1, 0, %f \n", net_output(6));
net_input(0, 0.1); /* Test for case 3 */
net_input(1, 0.9);
net_solve();
want += fabs(net_output(6) - 0.9); /* accumulate error for case 3 */
printf("output for 0, 1, %f \n", net_output(6));
net_input(0, 0.9); /* Test for case 4 */
net_input(1, 0.9);
net_solve();
want += fabs(net_output(6) - 0.1); /* accumulate error for case 4 */
printf("output for 1, 1, %f \n", net_output(6));
printf("%d iterations, %f Learn, error = %f \n", j, rrr, want);
net_write("eor.net"); /* Save network on disk */
void net_quick(x, y, z, rrr, aaa)
* TRAINING SUBROUTINE FOR EOR EXAMPLE
* This routine will set up the training inputs and output, and update
* the network for one pattern.
* VARIABLES: x, y: network inputs
* z: expected output for inputs
net_input(0, x); /* input new values */
net_input(1, y);
net_expect(6, z); /* indicate expected values */
net_solve(); /* update the network outputs for inputs */
net_back_prop(rrr, aaa);/* update connections with backprop */
* This routine will initialize the activation function type, and
* wipe out all defined node connections, and neurons.
* VARIABLES: type: the type of activation function used by the net
* SIGMOID : the sigmoid function
* NEW_SIGMOID : the new sigmoid function
activation_type = type; /* store activation type */
point_arc = LAST; /* empty connection list */
point_node = LAST; /* empty neuron list */
for(i = 0; i < NUMBER_NETS; i++){
    pnt_in[i] = LAST; /* clear network input list */
    pnt_out[i] = LAST; /* clear network output list */
}
for(i = 0; i < NEURONS; i++){
    node[i].point = LAST; /* clear input connection pointers */
    node[i].forward_point = LAST; /* clear output conn. pointers */
}
num = 0; /* set network to number 0 */
num_nets = NUMBER_NETS; /* Set maximum number of networks */
void net_arc_define(neuron_1, neuron_2)
* This routine will define a forward and back pointing connection
* VARIABLES: neuron_1: the source neuron
* neuron_2: the destination neuron
i = rand(); i = rand(); /* get a random integer then convert to */
                        /* real weight for connection */
weight = 0.2*((double)(32767 & rand())/16368.0 - 1.0);
/* add backward connection between neurons */
point_arc++; /* increase pointer for connection list */
syn[point_arc].point = node[neuron_2].point;
node[neuron_2].point = point_arc; /* insert connection in list */
syn[point_arc].neuron = neuron_1; /* point back to neuron 1 */
syn[point_arc].sum = 0.0; /* reset training sum */
syn[point_arc].weight = weight; /* connection weight */
syn[point_arc].last_weight = 0; /* weight before last update */
/* add forward connection between neurons */
point_arc++; /* next free spot in connection list */
syn[point_arc].point = node[neuron_1].forward_point;
node[neuron_1].forward_point = point_arc;
syn[point_arc].neuron = neuron_2;
syn[point_arc].sum = 0.0; /* reset training sum */
syn[point_arc].weight = weight;
syn[point_arc].last_weight = 0;
if(point_node < neuron_1) point_node = neuron_1; /* bump up pointer to*/
if(point_node < neuron_2) point_node = neuron_2; /* last neuron used */
net_prep(); /* rebuild the forward and backward */
            /* propagation order lists */
* The contents of the net may be read from a file on the disk.
* VARIABLES: file_name: the name of the file to load the net from
* RETURNS: ERROR: if the file could not be opened
* NO_ERROR: if the file was opened
error = ERROR; /* set error flag */
if((file_ptr = fopen(file_name, "r")) != NULL){ /* if file exists */
error = NO_ERROR; /* clear error flag */
fscanf(file_ptr, "%d,%d,%d,%d\n", &point_node, &point_arc,
&num_nets, &activation_type); /* get sizes of lists */
for(i = 0; i <= point_node; i++){ /* load neuron list */
fscanf(file_ptr, "%d,%d,%lf,%lf,%lf,%lf\n",
&node[i].point, &node[i].forward_point,
&node[i].sum, &node[i].out, &node[i].expect,
for(i = 0; i <= point_arc; i++){ /* load connection list */
fscanf(file_ptr, "%d,%d,%lf,%lf\n", &syn[i].point,
&syn[i].neuron, &syn[i].weight,
for(i = 0; i < num_nets; i++){ /* get input/output pointers */
fscanf(file_ptr, "%d,%d\n", &pnt_in[i], &pnt_out[i]);
for(j = 0; j < num_nets; j++){ /* get list of input nodes */
for(i = 0; i <= pnt_in[j]; i++){
fscanf(file_ptr, "%d\n", &in_node[j][i]);
for(j = 0; j < num_nets; j++){ /* get list of output nodes */
for(i = 0; i <= pnt_out[j]; i++){
fscanf(file_ptr, "%d\n", &out_node[j][i]);
fscanf(file_ptr, "%d\n", &pts); /* get message strings */
for(j = 0; ((j < NUM_DATA) && (j < pts)); j++){
fgets(net_strings[j], LEN_DATA, file_ptr);
for(i = 0; i < num_nets; i++){ /* prepare processing order lists */
net_number(i); /* for all of the networks */
net_number(0); /* set network number to 0 */
* The neural net may be written to disk.
* VARIABLES: file_name: the name of the destination file
file_ptr = fopen(file_name, "w"); /* Open file */
fprintf(file_ptr, "%d,%d,%d,%d\n", point_node, point_arc,
num_nets, activation_type); /* write list lengths */
for(i = 0; i <= point_node; i++){ /* write neuron list */
fprintf(file_ptr, "%d,%d,%15.12lf,%15.12lf,%15.12lf,%15.12lf\n",
node[i].point, node[i].forward_point,
node[i].sum, node[i].out, node[i].expect,
for(i = 0; i <= point_arc; i++){ /* write connection list */
fprintf(file_ptr, "%d,%d,%15.12lf,%15.12lf\n", syn[i].point,
for(i = 0; i < num_nets; i++){ /* write in/out point list */
fprintf(file_ptr, "%d,%d\n", pnt_in[i], pnt_out[i]);
for(j = 0; j < num_nets; j++){ /* write input lists */
for(i = 0; i <= pnt_in[j]; i++){
fprintf(file_ptr, "%d\n", in_node[j][i]);
for(j = 0; j < num_nets; j++){ /* write output lists */
for(i = 0; i <= pnt_out[j]; i++){
fprintf(file_ptr, "%d\n", out_node[j][i]);
fprintf(file_ptr, "%d\n", NUM_DATA); /* write message strings */
for(j = 0; j < NUM_DATA; j++){
fprintf(file_ptr, "%s\n", net_strings[j]);
* This will allow an input neuron to be defined for the current network.
* VARIABLE: neuron: the neuron to be defined as an input.
pnt_in[num]++; /* increase number of inputs */
in_node[num][pnt_in[num]] = neuron; /* add input to list */
net_prep(); /* reorder processing list */
* The specified neuron is defined as an output neuron for the current network.
* VARIABLE: neuron: the neuron defined as an output
pnt_out[num]++; /* increase output count */
out_node[num][pnt_out[num]] = neuron; /* add output to list */
net_prep(); /* reorder processing list */
* This will allow an input to a neuron to be defined.
* VARIABLES: neuron: the neuron number
* value: the value input to the neuron
node[neuron].out = value; /* put value on output of neuron */
* RETURN OUTPUT VALUE FOR NETWORK
* The output from the specified neuron is returned from the last network solution.
* VARIABLE: neuron: the output neuron of interest
* RETURNS: the output value from the specified neuron
void net_expect(neuron, value)
* DEFINE EXPECTED OUTPUT FROM NETWORK
* The output expected from the specified node is defined here.
* VARIABLES: neuron: the neuron of interest
* value: the value expected from the specified neuron
void net_set_weight(from, to, weight)
point = node[to].point; /* get pointer to list of inputs */
while(point != LAST){ /* search until out of inputs */
    if(syn[point].neuron == from){
        syn[point].weight = weight; /* set the new weight */
    }
    point = syn[point].point; /* next input connection */
}
* RETRIEVE VALUE OF CONNECTION WEIGHT
point = node[to].point; /* get pointer to list of inputs */
while(point != LAST){ /* search until out of inputs */
    if(syn[point].neuron == from){
        return(syn[point].weight); /* found: return the weight */
    }
    point = syn[point].point; /* next input connection */
}
* CREATE LISTS OF NEURON PROCESSING ORDER
* In the interest of speed, and flexibility for future software additions
* it is advantageous to have lists of processing order for the forward
* propagation of inputs, and the backward propagation of errors. This
* subroutine will create a list of neurons which indicates the processing
* order for neurons to ensure that all of their predecessors have been
* activated already. Another list will be created for backward propagation
* to ensure that all the neurons after the current have been satisfied.
* These lists are both generated before training and network solution so
* that they will increase processing speed.
static int i, j, pointer, wait_count, change_count, flag;
* This section will construct the pointer table for forward prop
for(i = 0; i <= point_node; i++){
for(i = 0; i <= point_node; i++){
if((node[i].point != LAST) || (node[i].forward_point != LAST)){
for(i = 0; i < num_nets; i++){
for(j = 0; j <= pnt_in[i]; j++)
marker[in_node[i][j]] = EMPTY;
for(i = 0; i <= pnt_in[num]; i++){
marker[in_node[num][i]] = USED;
work[num][work_pnt[num]] = in_node[num][i];
while((change_count > 0) && (wait_count > 0)){
for(i = 0; i <= point_node; i++){
while((marker[i] == WAITING) && (flag == 0)
if(marker[syn[pointer].neuron] != USED){
if((flag == 0) && (marker[i] == WAITING)){
if(marker[i] == WAITING) wait_count++;
if((change_count == 0) && (wait_count > 0)){
printf("The Forward Network is incomplete ! \n");
printf("The Forward Network is Alright ! \n");
* This section will do backpropagation preparation.
for(i = 0; i <= point_node; i++){
for(i = 0; i <= point_node; i++){
if((node[i].point != LAST) || (node[i].forward_point != LAST)){
for(i = 0; i < num_nets; i++){
for(j = 0; j <= pnt_out[i]; j++)
marker[out_node[i][j]] = EMPTY;
for(i = 0; i <= pnt_out[num]; i++){
marker[out_node[num][i]] = USED;
back[num][back_pnt[num]] = out_node[num][i];
while((change_count > 0) && (wait_count > 0) && (back_pnt[num] > 0)){
for(i = 0; i <= point_node; i++){
pointer = node[i].forward_point;
while((marker[i] == WAITING) && (flag == 0)
if(marker[syn[pointer].neuron] != USED){
if((flag == 0) && (marker[i] == WAITING)){
if(marker[i] == WAITING) wait_count++;
if((change_count == 0) && (wait_count > 0)){
printf("The backpropagation Network is Incomplete ! \n");
printf("The Backpropagation Network is Okey Dokey ! \n");
* SOLVE THE NETWORK FOR A SET OF INPUTS
* This little routine is the heart of the neural network simulator. The
* network values will be forward propagated through the network to the
* output neurons.
for(i = pnt_in[num]+1; i < work_pnt[num]; i++){ /* use processing list*/
point = node[work[num][i]].point; /* get pointer to list of */
node[work[num][i]].sum = 0.0; /* set input sum to zero */
while(point != LAST){ /* add until out of inputs */
    node[work[num][i]].sum += syn[point].weight
        * node[syn[point].neuron].out;
    point = syn[point].point; /* next input connection */
}
/* apply activation to inputs */
node[work[num][i]].out = G(node[work[num][i]].sum);
void net_back_prop(rate, smooth)
* UPDATE NETWORK CONNECTION WEIGHTS
* The backpropagation is done here based upon the results
* of the last network solution, and the net expect values.
* VARIABLES: rate: the learning rate to be used
* smooth: the smoothing function value
for(i = 0; i <= pnt_out[num]; i++){ /* calculate output deltas */
node[out_node[num][i]].delta = G_DIF(node[out_node[num][i]].sum) *
(node[out_node[num][i]].expect - node[out_node[num][i]].out);
/* backpropagate errors with order */
for(i = pnt_out[num] + 1; i < back_pnt[num]; i++){
    point = node[back[num][i]].forward_point;
    node[back[num][i]].delta = 0.0;
    while(point != LAST){ /* sum over successor neurons */
        node[back[num][i]].delta += syn[point-1].weight*
            node[syn[point].neuron].delta;
        point = syn[point].point;
    }
    node[back[num][i]].delta *= G_DIF(node[back[num][i]].sum);
}
/* update connection weights with */
for(i = 0; i < back_pnt[num]; i++){
    point = node[back[num][i]].point;
    while(point != LAST){ /* all input connections */
        delta = (rate * node[back[num][i]].delta
            * node[syn[point].neuron].out)
            + (smooth * syn[point].last_weight);
        if(updater == POINT_TRAIN){ /* update now */
            syn[point].weight += delta;
            syn[point].last_weight = delta;
        }
        if(updater == SET_TRAIN){ /* save updates til later */
            syn[point].sum += delta;
        }
        point = syn[point].point;
    }
}
if(updater == SET_TRAIN) sum_cnt[num]++; /* count training tries */
* MAKE WEIGHT CHANGES FOR SET TRAINING
* This routine will update the network based upon the learned weights,
* if the training is being done with set training. Set training refers to
* finding all of the suggested weight changes for the set, and then
* applying the average of all. Point training refers to updating the
* weight values after every estimation of the required change.
if(updater == SET_TRAIN){ /* only for set training */
    for(i = pnt_in[num]+1; i < work_pnt[num]; i++){
        point = node[work[num][i]].point;
        while(point != LAST){ /* all input connections */
            delta = syn[point].sum / sum_cnt[num];
            syn[point].weight += delta; /* apply average change */
            syn[point].last_weight = delta;
            syn[point].sum = 0.0; /* reset for the next set */
            point = syn[point].point;
        }
    }
    sum_cnt[num] = 0; /* reset the training try count */
}
* This is the activation function which takes the input and
* produces the neuron output.
* VARIABLES: input: the sum from within the neuron
* RETURNS: The value of the activation function for the sum
if(activation_type == SIGMOID)
output = 1.0/(1.0+exp(-input));
if(activation_type == NEW_SIGMOID)
output = 1.0/(1.0+exp(-input)) - .5;
* GIVE SUM WHICH PRODUCES ACTIVATION OUTPUT
* This will give the input to the activation function, based
* upon the output of the activation function.
* VARIABLE: output: the output of the activation function
* RETURNS: the input the activation function required for the output
if(activation_type == SIGMOID)
input = -log(1.0/output - 1.0);
if(activation_type == NEW_SIGMOID)
input = -log(1.0/(output+.5) - 1.0);
* DERIVATIVE OF ACTIVATION FUNCTION
* This takes the input and returns the first derivative of the
* VARIABLE: input: the input to the activation function
* RETURNS: the derivative of the activation function with the input
if(activation_type == SIGMOID){
output = outter * (1.0 - outter);
if(activation_type == NEW_SIGMOID){
output = outter * (1.0 - outter);
void net_put_string(n, string)
* This feature was added so that variables and message strings could be
* stored with a network in a disk file.
* VARIABLES: n: number of message string
* string: a string to be stored in location n
strcpy(net_strings[n], string);
void net_get_string(n, string)
* This allows stored message strings to be returned
* VARIABLES: n: number of message location
* string: the message string to be returned
strcpy(string, net_strings[n]);
* In an effort to allow more than one network to be used by the same
* program, it was necessary to create the ability to choose networks.
* Thus all the networks are defined together with the same neuron
* numbers, but when dealing with inputs and outputs, and processing,
* the network number must be set to indicate the network to be used.
* VARIABLE: number: the current network number