19. Appendix C: Neural Network Subroutines

19.1 C.1 Introduction:

A set of neural network routines was developed for this thesis and other related work. As a result, these routines have a generic architecture that lends itself to a large number of tasks.

At present the routines may:

use an arbitrary number of neurons

define connections, inputs and outputs for a neural network architecture

use an arbitrary number of networks concurrently

use various activation functions

apply backpropagation for learning

store and retrieve message strings associated with the defined neural network

store and retrieve the neural network from an ASCII file

and perform basic operations such as setting network inputs, getting network outputs, setting expected output values, and getting and setting connection weights.

19.2 C.2 Subroutine Architecture:

The neural networks are stored by the subroutines as linked lists. There are lists for neurons, connections, inputs and outputs. Since the neurons are defined in lists, they must be referenced by numbers, ranging from 0 to the number defined.

To create a network, the connections between neurons are defined. A connection to a neuron inherently implies that the neuron is part of the network. Thus, when a connection is defined, the neuron list is updated and two connections are added to the connection list: one pointing forward from the first neuron to the second, and one pointing backward from the second neuron to the first (for training). Both connections are assigned the same random weight between -0.2 and 0.2.
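As a minimal sketch, using the routines listed later in this appendix, a small network with inputs 0 and 1, a bias neuron 2, one hidden neuron 3, and an output neuron 4 could be defined as follows (the neuron numbering is only an illustration):

net_init(SIGMOID); /* clear all lists, select the sigmoid activation */
net_arc_define(0, 3); /* each call adds a forward and a backward */
net_arc_define(1, 3); /* connection, both with the same random */
net_arc_define(2, 3); /* weight between -0.2 and 0.2 */
net_arc_define(3, 4);
net_arc_define(2, 4); /* the bias also feeds the output neuron */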

Each neuron has a pointer to the first connection in a linked list of connections acting as inputs from predecessor neurons. The neuron also has a pointer to the first connection in a linked list of connections pointing to the neurons which receive its output. Thus, both the inputs to and the outputs from each neuron are readily accessible.

It is necessary to define which neurons serve as inputs, and which serve as outputs. This is done by declaring that a neuron is one or the other; the declared neuron is added to a list of input, or output, neurons. Since there are multiple networks, there is a separate set of input and output lists for each network, even though the networks may share common neurons. The currently selected network determines which lists the inputs and outputs are defined in, and all solution and training is carried out according to that network's particular lists of inputs and outputs.
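As a sketch (continuing the small network above, with network 0, the default, selected):

def_input(0); /* neurons 0 and 1 are inputs of the current network */
def_input(1);
def_input(2); /* neuron 2 is the bias input */
def_output(4); /* neuron 4 is the output of the current network */

A second, concurrent network would be declared in the same way after selecting it with net_number(1).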

When a network is stored, all values are written in ASCII. This makes the files completely portable between computing platforms, although quite bulky. Within these files it is possible to store some data of the user's choice. This data is described with message strings. There are a number of slots into which these message strings may be written, and from which they may be read. With appropriate use, a message string may also store a numerical value for a system (converted to a string).
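A numerical value may be stored by converting it to a string before the network is saved, and converting it back after the network is read, as in the sketch below (the slot number, file name and value are only illustrations):

char buffer[LEN_DATA];
double rate;

sprintf(buffer, "%f", 0.93); /* convert the number to a string */
net_put_string(0, buffer); /* store it in message slot 0 */
net_write("eor.net"); /* message slots are saved with the network */

net_read("eor.net"); /* later, reload the network */
net_get_string(0, buffer); /* recover the string from slot 0 */
rate = atof(buffer); /* convert back to a number */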

The neural network activation function is defined as a global variable. Thus, when initializing the network, this value may be set to choose the activation function for every neuron. Available activation functions include sigmoid, new sigmoid, tanh, and linear.

Solution of the network requires that a set of inputs be applied, and that the network be evaluated for those values. The results may then be read from the output neurons. When training, the network is first solved for a set of inputs, the expected outputs are declared, and the backpropagation algorithm is then applied. The weight changes may occur then, or later.
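In outline, one training step for the Exclusive Or network of the demonstration program might look like the sketch below (the learning rate of 0.93 and smoothing rate of 0.8 are simply the example values suggested by the demonstration):

net_input(0, 0.9); /* apply one input vector */
net_input(1, 0.1);
net_expect(6, 0.9); /* state the desired output */
net_solve(); /* forward propagation pass */
net_back_prop(0.93, 0.8); /* backpropagate the error */
printf("output = %f\n", net_output(6)); /* read the result */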

There are two modes for training the neural network. Weights may be updated on the basis of small changes for every training vector (point training), or by a single change for an entire set of training vectors (set training). If the network is to be trained for each vector, then the backpropagation subroutine only needs to know that point training is in use. If set training is used, the networks must be instructed to collect weight changes, and then instructed to apply the average of these once the entire set of training vectors has been tried.
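A sketch of set training, using the training helper from the Exclusive Or demonstration later in this appendix, is shown below (for point training, updater is set to POINT_TRAIN and the net_update() call simply has no effect):

int j;

updater = SET_TRAIN; /* collect weight changes over the set */
for(j = 0; j < 500; j++){
	net_quick(0.9, 0.9, 0.1, 0.93, 0.8); /* one pass per training vector */
	net_quick(0.9, 0.1, 0.9, 0.93, 0.8);
	net_quick(0.1, 0.1, 0.1, 0.93, 0.8);
	net_quick(0.1, 0.9, 0.9, 0.93, 0.8);
	net_update(); /* apply the averaged changes for the set */
}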

As a point of interest, it should be noted that the subroutines actually produce two lists of neuron precedence before forward or backward propagation occurs. One list keeps track of the firing order for the neurons during forward propagation. The second list tracks the firing order for the backward propagation of errors during training. These lists are built so as to ensure input completeness before a neuron is processed. This speeds processing by eliminating the need to re-evaluate the network connections for every forward or backward pass.

In the interest of speed, there are very few error checks in the subroutines. As a result, care should be taken in all aspects of network declaration and application.

19.3 C.3 Example Subroutines:

net_demo(): A program which demonstrates the use of the neural network subroutines through the application to the Exclusive Or Problem. The network is loaded, or defined, then trained, and tested. This routine uses another subroutine for training.

net_quick(): A subroutine used by net_demo() for the sake of brevity when training.

19.4 C.4 Neural Network Initialization:

net_init(): This routine will clear all of the lists and pointers, so as to present an empty storage space. This routine expects the network activation function to be indicated at this time.

19.5 C.5 Network Definition and Application Tools:

net_arc_define(): will define the connection between two neurons and assign a random weight to the connection.

def_input(): defines a network input neuron.

def_output(): defines a network output neuron.

net_write(): will write a currently defined neural network to an ASCII file, for retrieval later.

net_read(): will read a previously saved neural network file. If the file is not available then an error code will be returned.

net_put_string(): a tool for allowing the user to append their own messages to the neural network file, to be saved with the ASCII file.

net_get_string(): a tool for recovering user defined strings, which have been recovered from a saved neural network file.

net_number(): Allows the user to change the neural network input/output lists being used. The default is zero and thus, this need not be called unless concurrent networks are required.

19.6 C.6 Neural Network Calculation Subroutines:

net_input(): defines a numerical value for a network input.

net_solve(): will do a forward propagation pass of all inputs to the neural network, and produce a set of outputs.

net_output(): returns a numerical output, resulting from a forward pass of input values.

net_expect(): allows definition of an expected output value for the neural network.

net_back_prop(): will use supplied learning and smoothing rates to determine weight changes. If point training is being used, the weight changes will be made immediately, otherwise net_update() will have to be called after the entire set has been applied, and changes are to be made.

net_update(): will make weight changes resulting from a number of applications of the backpropagation algorithm under set training. If point training is being used this algorithm will have no effect.

net_set_weight(): This will allow the user to alter the weight value between two arbitrary neurons in the neural network.

net_weight(): This function will return the connection weight between any two neurons.

19.7 C.7 Neural Network Background Utilities:

G(): The activation function.

G_INVERSE(): The inverse of the activation function.

G_DIF(): The derivative of the activation function.

net_prep(): will create lists for forward, and backward propagation. The lists will determine the neuron firing orders, based upon neuron input/output completeness.

19.8 C.8 User Defined Global Variable:

updater: will determine whether point or set training is used. It should be set equal to SET_TRAIN for set training, and to POINT_TRAIN for point training.

 

/*

* NEURAL NETWORK SIMULATOR

*

* This package of routines will serve as a neural network simulator. At

* present the routines use feedforward neural networks, with the

* Backpropagation learning algorithm [Rummelhart et.al., 1986]. There are

* a number of features available which may be explored by looking at the

* routines themselves.

*

* These routines use a fairly sophisticated data structure (using lists)

* to help in processing speed for the network, while maintaining flexibility.

*

* written by Hugh Jack

*

* August 8th, 1990.

*/

 

#include <stdio.h> /* function prototypes stored here */

#include <math.h>

 

#define EMPTY -1 /* to indicate unused */

#define USED 1 /* to indicate used */

#define WAITING 2 /* to indicate unprocessed */

#define POINT_TRAIN 3 /* for neural network point training */

#define SET_TRAIN 4 /* for neural network set training */

 

#define LAST -1 /* Condition Codes for internal flags */

#define NO_ERROR 0 /* indicates state ok */

#define ERROR 1 /* indicates a big problem */

 

#define SIGMOID 1 /* Activation Function Types */

#define SUM 2

#define NEW_SIGMOID 3

#define TANH 4

 

#define NEURONS 300 /* Neuron count is not dynamic */

#define NUM_DATA 40 /* Maximum number of message strings */

#define LEN_DATA 50 /* maximum length of data strings */

#define NUMBER_NETS 7 /* maximum number of concurrent networks */

#define UPDATER updater /* updater determines the training method */

 

/*

* Definitions of functions contained within.

*/

 

void net_init(),

net_demo(),

net_quick(),

net_arc_define(),

net_input(),

def_input(),

net_set_weight(),

net_get_string(),

net_put_string(),

net_number(),

def_output(),

net_write(),

net_solve(),

net_prep(),

net_expect(),

net_update(),

net_back_prop();

 

double net_output(),

net_weight(),

G(),

G_INVERSE(),

G_DIF();

 

int net_read();

 

struct nodes{ /* Information about one Neuron */

int point;

int forward_point;

double sum;

double out;

double expect;

double delta;};

 

struct connections { /* Info about inter-neuron connections */

int point;

int neuron;

double sum;

double weight;

double last_weight;};

 

struct nodes node[NEURONS]; /* define storage list for neurons */

 

struct connections syn[NEURONS*NEURONS/2]; /* define connection list */

 

char net_strings[NUM_DATA][LEN_DATA]; /* define message list */

 

int activation_type, /* stores activation type */

sum_cnt[NUMBER_NETS], /* stores set training data */

num, /* The number of the current network */

num_nets, /* number of parallel networks allowed */

point_arc, /* pointer to last spot in connection list */

point_node, /* pointer to last spot in neuron list */

updater = SET_TRAIN, /* Network training type */

pnt_in[NUMBER_NETS], /* Pointer to start of input list for nets */

pnt_out[NUMBER_NETS], /* pointer to start of output list for nets */

in_node[NUMBER_NETS][NEURONS], /* lists of input nodes for nets */

out_node[NUMBER_NETS][NEURONS], /* lists of output nodes for nets */

work_pnt[NUMBER_NETS], /* Pnt to first spot in fwd prop order list */

work[NUMBER_NETS][NEURONS], /* Forward Propagation order list */

back_pnt[NUMBER_NETS], /* Pnt to first spot in back prop order list */

back[NUMBER_NETS][NEURONS], /* Backward propagation order list */

marker[NEURONS]; /* work array for connection ordering */

 

 

 

/*

*main()

*{

* net_demo();

*}

*/

 

 

void net_demo()

/*

* EOR Net test

*

* This is a set of neural network subroutines, and an example

* program that will use them. The test is based upon simulating the

* Exclusive Or gate problem described in [Minsky and Papert, 1969]

* and [Rummelhart et.al., 1986]. This is a classic example of the

* use of a hidden layer to deal with a problem which is linearly

* inseparable.

*

* August 8th, 1990.

*/

{

static char aa[20]; /* Define a Work Variable */

static int i, j; /* define some work variables */

static double want, rrr, aaa;

 

updater = POINT_TRAIN; /* use point training for network */

net_init(SIGMOID); /* init net with sigmoid activation*/

printf("\nSmoothing Rate (eg. 0.8) :");

gets(aa);

aaa = atof(aa); /* get a smoothing rate */

printf("\nLearning Rate (eg. 0.93) :");

gets(aa);

rrr = atof(aa); /* get a learning rate */

printf("\nNumber of Iterations (eg. 500) :");

gets(aa);

i = atoi(aa); /* get number of training iterations */

 

if(net_read("eor.net") == ERROR){ /* Load network if on disk */

net_arc_define(0, 3); /* if not build a new network */

net_arc_define(1, 3); /* first define all inter-neuron */

net_arc_define(2, 3); /* connections. Neuron 2 is a bias */

net_arc_define(3, 6); /* input. Neurons 0 and 1 are the */

 

net_arc_define(0, 4); /* EOR inputs, and neuron 6 is the */

net_arc_define(1, 4); /* EOR output. Neurons 3, 4, 5, 7 */

net_arc_define(2, 4); /* are the hidden layer */

net_arc_define(4, 6);

 

net_arc_define(0, 5);

net_arc_define(1, 5);

net_arc_define(2, 5);

net_arc_define(5, 6);

 

net_arc_define(0, 7);

net_arc_define(1, 7);

net_arc_define(2, 7);

net_arc_define(7, 6);

 

net_arc_define(2, 6);

 

def_input(0); /* indicate input neurons */

def_input(1);

def_input(2);

def_output(6); /* indicate output neuron */

}

net_input(2, 0.5); /* input some constant bias value */

 

for(j = 0; j < i; j++){ /* do 'i' training iterations */

net_quick(0.9, 0.9, 0.1, rrr, aaa);

net_quick(0.9, 0.1, 0.9, rrr, aaa);

net_quick(0.1, 0.1, 0.1, rrr, aaa);

net_quick(0.1, 0.9, 0.9, rrr, aaa);

net_update(); /* will update if set training used */

}

 

want = 0.0; /* reset error sum */

 

net_input(0, 0.1); /* Test for case 1 */

net_input(1, 0.1);

net_solve();

want += fabs(net_output(6)-.1); /* accumulate error for case 1 */

printf("output for 0, 0, %f \n", net_output(6));

 

net_input(0, 0.9); /* Test for case 2 */

net_input(1, 0.1);

net_solve();

want += fabs(net_output(6)-0.9); /* accumulate error for case 2 */

printf("output for 1, 0, %f \n", net_output(6));

 

net_input(0, 0.1); /* Test for case 3 */

net_input(1, 0.9);

net_solve();

want += fabs(net_output(6)-0.9); /* accumulate error for case 3 */

printf("output for 0, 1, %f \n", net_output(6));

 

net_input(0, 0.9); /* Test for case 4 */

net_input(1, 0.9);

net_solve();

want += fabs(net_output(6)-.1); /* accumulate error for case 4 */

printf("output for 1, 1, %f \n", net_output(6));

 

printf("%d iteraions, %f Learn, error = %f \n", j, rrr, want);

net_write("eor.net"); /* Save network on disk */

}

 

 

 

 

void net_quick(x, y, z, rrr, aaa)

double x, y, z, rrr, aaa;

/*

* TRAINING SUBROUTINE FOR EOR EXAMPLE

*

* This routine will set up the training inputs and output, and update

* the network for one pattern.

*

* VARIABLES: x, y: network inputs

* z: expected output for inputs

* rrr: learning rate

* aaa: smoothing rate

*

* August 8th, 1990.

*/

{

net_input(0, x); /* input new values */

net_input(1, y);

net_expect(6, z); /* indicate expected values */

net_solve(); /* update the network outputs for inputs */

net_back_prop(rrr, aaa);/* update connections with backprop */

}

 

 

 

 

 

void net_init(type)

int type;

/*

* This routine will initialize the activation function type, and

* wipe out all defined node connections, and neurons.

*

* VARIABLES: type: the type of activation function used by the net

* SIGMOID : the sigmoid function

* SUM : a linear function

* NEW_SIGMOID : the new sigmoid function

* TANH : the tanh function

*

* October 10th, 1989.

*/

{

static int i;

 

activation_type = type; /* store activation type */

 

point_arc = LAST; /* empty connection list */

point_node = LAST; /* empty neuron list */

for(i = 0; i < NUMBER_NETS; i++){

pnt_in[i] = LAST; /* clear network input list */

pnt_out[i] = LAST; /* clear network output list */

}

for(i = 0; i < NEURONS; i++){

node[i].point = LAST; /* clear forward connection pointers */

node[i].forward_point = LAST; /* clear back conn. pointers */

}

num = 0; /* set network to number 0 */

num_nets = NUMBER_NETS; /* Set maximum number of networks */

}

 

 

 

 

void net_arc_define(neuron_1, neuron_2)

int neuron_1, neuron_2;

/*

* This routine will define a forward and back pointing connection

* between two neurons.

*

* VARIABLES: neuron_1: the source neuron

* neuron_2: the destination neuron

*

* October 10th, 1989.

*/

{

static double weight;

static int i;

 

i = rand(); i = rand(); /* get a random integer then convert to */

/* real weight for connection */

weight = 0.2*((double)(32767 & rand())/16368.0 - 1.0);

 

/* add backward connection between neurons */

point_arc++; /* increase pointer for connection list */

syn[point_arc].point = node[neuron_2].point;

node[neuron_2].point = point_arc; /* insert connection in list */

syn[point_arc].neuron = neuron_1; /* make it point to neuron 1 */

syn[point_arc].sum = 0.0; /* reset training sum */

syn[point_arc].weight = weight; /* connection weight */

syn[point_arc].last_weight = 0; /* weight before last update */

 

/* add forward connection between neurons */

point_arc++;

syn[point_arc].point = node[neuron_1].forward_point;

node[neuron_1].forward_point = point_arc;

syn[point_arc].neuron = neuron_2;

syn[point_arc].sum = 0.0;

syn[point_arc].weight = weight;

syn[point_arc].last_weight = 0;

 

if(point_node < neuron_1) point_node = neuron_1; /* bump up pointer to*/

if(point_node < neuron_2) point_node = neuron_2; /* last neuron used */

net_prep(); /* make a list of the order in which neurons */

/* are processed for forward and back propagation */

}

 

 

 

 

 

int net_read(file_name)

char *file_name;

/*

* LOAD NET FROM ASCII FILE

*

* The contents of the net may be read from a file on the disk.

*

* VARIABLES: file_name: the name of the file to load the net from

*

* RETURNS: ERROR: if the file could not be opened

* NO_ERROR: if the file was opened

*

* August 8th, 1990.

*/

{

FILE *file_ptr;

static int i, j, error, pts;

error = ERROR; /* set error flag */

if((file_ptr = fopen(file_name, "r")) != NULL){ /* if file exists */

error = NO_ERROR; /* clear error flag */

fscanf(file_ptr, "%d,%d,%d,%d\n", &point_node, &point_arc,

&num_nets, &activation_type); /* get sizes of lists */

 

for(i = 0; i <= point_node; i++){ /* load neuron list */

fscanf(file_ptr, "%d,%d,%lf,%lf,%lf,%lf\n",

&node[i].point, &node[i].forward_point,

&node[i].sum, &node[i].out, &node[i].expect,

&node[i].delta);

}

for(i = 0; i <= point_arc; i++){ /* load connection list */

fscanf(file_ptr, "%d,%d,%lf,%lf\n", &syn[i].point,

&syn[i].neuron, &syn[i].weight,

&syn[i].last_weight);

}

 

for(i = 0; i < num_nets; i++){ /* get input/output pointers */

fscanf(file_ptr, "%d,%d\n", &pnt_in[i], &pnt_out[i]);

}

for(j = 0; j < num_nets; j++){ /* get list of input nodes */

for(i = 0; i <= pnt_in[j]; i++){

fscanf(file_ptr, "%d\n", &in_node[j][i]);

}

}

for(j = 0; j < num_nets; j++){ /* get list of output nodes */

for(i = 0; i <= pnt_out[j]; i++){

fscanf(file_ptr, "%d\n", &out_node[j][i]);

}

}

fscanf(file_ptr, "%d\n", &pts); /* get message strings */

for(j = 0; ((j < NUM_DATA) && (j < pts)); j++){

fgets(net_strings[j], LEN_DATA, file_ptr);

}

 

fclose(file_ptr);

}

for(i = 0; i < num_nets; i++){ /* prepare processing order lists */

net_number(i); /* for all of the networks */

net_prep();

}

net_number(0); /* set network number to 0 */

return(error);

}

 

 

 

 

void net_write(file_name)

char *file_name;

/*

* SAVE NET TO ASCII FILE

*

* The neural net may be written to disk.

*

* VARIABLES: file_name: the name of the destination file

*

* August 8th, 1990.

*/

{

FILE *file_ptr;

static int i, j;

file_ptr = fopen(file_name, "w"); /* Open file */

fprintf(file_ptr, "%d,%d,%d,%d\n", point_node, point_arc,

num_nets, activation_type); /* write list lengths */

 

for(i = 0; i <= point_node; i++){ /* write neuron list */

fprintf(file_ptr, "%d,%d,%15.12lf,%15.12lf,%15.12lf,%15.12lf\n",

node[i].point, node[i].forward_point,

node[i].sum, node[i].out, node[i].expect,

node[i].delta);

}

for(i = 0; i <= point_arc; i++){ /* write connection list */

fprintf(file_ptr, "%d,%d,%15.12lf,%15.12lf\n", syn[i].point,

syn[i].neuron, syn[i].weight,

syn[i].last_weight);

}

for(i = 0; i < num_nets; i++){ /* write in/out point list */

fprintf(file_ptr, "%d,%d\n", pnt_in[i], pnt_out[i]);

}

for(j = 0; j < num_nets; j++){ /* write input lists */

for(i = 0; i <= pnt_in[j]; i++){

fprintf(file_ptr, "%d\n", in_node[j][i]);

}

}

for(j = 0; j < num_nets; j++){ /* write output lists */

for(i = 0; i <= pnt_out[j]; i++){

fprintf(file_ptr, "%d\n", out_node[j][i]);

}

}

 

fprintf(file_ptr, "%d\n", NUM_DATA); /* write message strings */

for(j = 0; j < NUM_DATA; j++){

fprintf(file_ptr, "%s\n", net_strings[j]);

}

 

fclose(file_ptr);

}

 

 

 

 

void def_input(neuron)

int neuron;

/*

* DEFINE INPUT NEURON

*

* This will allow an input neuron to be defined for the current network.

*

* VARIABLE: neuron: the neuron to be defined as an input.

*

* July 21st, 1989.

*/

{

pnt_in[num]++; /* increase number of inputs */

in_node[num][pnt_in[num]] = neuron; /* add input to list */

net_prep(); /* reorder processing list */

}

 

 

 

 

 

void def_output(neuron)

int neuron;

/*

* DEFINE NETWORK OUTPUT

*

* The specified neuron is defined as an output neuron for the current

* network number.

*

* VARIABLE: neuron: the neuron defined as an output

*

* July 21st, 1989.

*/

{

pnt_out[num]++; /* increase output count */

out_node[num][pnt_out[num]] = neuron; /* add output to list */

net_prep(); /* reorder processing list */

}

 

 

 

 

 

void net_input(neuron, value)

int neuron;

double value;

/*

* SET NETWORK INPUT VALUE

*

* This will allow an input to a neuron to be defined.

*

* VARIABLES: neuron: the neuron number

* value: the value input to the neuron

*

* July 21st, 1989.

*/

{

node[neuron].out = value; /* put value on output of neuron */

node[neuron].sum = value;

}

 

 

 

 

 

double net_output(neuron)

int neuron;

/*

* RETURN OUTPUT VALUE FOR NETWORK

*

* The output from the specified neuron is returned from

* this function.

*

* VARIABLE: neuron: the output neuron of interest

*

* RETURNS: the output value from the specified neuron

*

* July 21st, 1989.

*/

{

return((node[neuron].out));

}

 

 

 

 

void net_expect(neuron, value)

int neuron;

double value;

/*

* DEFINE EXPECTED OUTPUT FROM NETWORK

*

* The output expected from the specified node is defined here.

*

* VARIABLES: neuron: the neuron of interest

* value: the value expected from the specified neuron

*

* July 21st, 1989.

*/

{

node[neuron].expect = value;

}

 

 

 

 

 

void net_set_weight(from, to, weight)

int from, to;

double weight;

/*

* SET CONNECTION WEIGHT

*

*/

{

static int i, point;

 

point = node[to].point; /* get pointer to list of input connections */

while(point != LAST){/* search until out of inputs */

if(syn[point].neuron == from){

syn[point].weight = weight;

}

point = syn[point].point;

}

}

 

 

 

 

 

double net_weight(from, to)

int from, to;

/*

* RETRIEVE VALUE OF CONNECTION WEIGHT

*

*/

{

static int i, point;

static double w_value;

w_value = 0.0;

 

point = node[to].point; /* get pointer to list of input connections */

while(point != LAST){/* search until out of inputs */

if(syn[point].neuron == from){

w_value = syn[point].weight;

}

point = syn[point].point;

}

return(w_value);

}

 

 

 

 

 

void net_prep()

/*

* CREATE LISTS OF NEURON PROCESSING ORDER

*

* In the interest of speed, and flexibility for future software additions

* it is advantageous to have lists of processing order for the forward

* propagation of inputs, and the backward propagation of errors. This

* subroutine will create a list of neurons which indicates the processing

* order for neurons to ensure that all of their predecessors have been

* activated already. Another list will be created for backward propagation

* to ensure that all the neurons after the current have been satisfied.

* These lists are both generated before training and network solution so

* that they will increase processing speed.

*

* August 8th, 1990.

*/

{

static int i, j, pointer, wait_count, change_count, flag;

 

/*

* This section will construct the pointer table for forward prop

*/

work_pnt[num] = 0;

wait_count = 0;

for(i = 0; i <= point_node; i++){

marker[i] = EMPTY;

}

for(i = 0; i <= point_node; i++){

if((node[i].point != LAST) || (node[i].forward_point != LAST)){

marker[i] = WAITING;

wait_count++;

}

}

for(i = 0; i < num_nets; i++){

for(j = 0; j <= pnt_in[i]; j++)

marker[in_node[i][j]] = EMPTY;

}

for(i = 0; i <= pnt_in[num]; i++){

marker[in_node[num][i]] = USED;

work[num][work_pnt[num]] = in_node[num][i];

work_pnt[num]++;

}

change_count = 1;

while((change_count > 0) && (wait_count > 0)){

wait_count = 0;

change_count = 0;

for(i = 0; i <= point_node; i++){

pointer = node[i].point;

flag = 0;

while((marker[i] == WAITING) && (flag == 0)

&& (pointer != LAST)){

if(marker[syn[pointer].neuron] != USED){

flag++;

}

pointer = syn[pointer].point;

}

if((flag == 0) && (marker[i] == WAITING)){

change_count++;

marker[i] = USED;

work[num][work_pnt[num]] = i;

work_pnt[num]++;

}

if(marker[i] == WAITING) wait_count++;

}

}

if((change_count == 0) && (wait_count > 0)){

printf("The Forward Network is incomplete ! \n");

} else {

printf("The Forward Network is Alright ! \n");

}

/*

* This section will do back propagation preparation.

*/

back_pnt[num] = 0;

wait_count = 0;

for(i = 0; i <= point_node; i++){

marker[i] = EMPTY;

}

for(i = 0; i <= point_node; i++){

if((node[i].point != LAST) || (node[i].forward_point != LAST)){

marker[i] = WAITING;

wait_count++;

}

}

for(i = 0; i < num_nets; i++){

for(j = 0; j <= pnt_out[i]; j++)

marker[out_node[i][j]] = EMPTY;

}

for(i = 0; i <= pnt_out[num]; i++){

marker[out_node[num][i]] = USED;

back[num][back_pnt[num]] = out_node[num][i];

back_pnt[num]++;

}

change_count = 1;

while((change_count > 0) && (wait_count > 0) && (back_pnt[num] > 0)){

wait_count = 0;

change_count = 0;

for(i = 0; i <= point_node; i++){

pointer = node[i].forward_point;

flag = 0;

while((marker[i] == WAITING) && (flag == 0)

&& (pointer != LAST)){

if(marker[syn[pointer].neuron] != USED){

flag++;

}

pointer = syn[pointer].point;

}

if((flag == 0) && (marker[i] == WAITING)){

change_count++;

marker[i] = USED;

back[num][back_pnt[num]] = i;

back_pnt[num]++;

}

if(marker[i] == WAITING) wait_count++;

}

}

if((change_count == 0) && (wait_count > 0)){

printf("The backpropagation Network is Incomplete ! \n");

} else {

printf("The Backpropagation Network is Okey Dokey ! \n");

}

}

 

 

 

 

 

void net_solve()

/*

* SOLVE THE NETWORK FOR A SET OF INPUTS

*

* This little routine is the heart of the neural network simulator. The

* network values will be forward propagated through the network to the

* outputs.

*

* July 21st, 1989.

*/

{

static int i, point;

 

for(i = pnt_in[num]+1; i < work_pnt[num]; i++){ /* use processing list*/

point = node[work[num][i]].point; /* get pointer to list of */

/* previous neurons */

node[work[num][i]].sum = 0.0; /* set input sum to zero */

while(point != LAST){ /* add until out of inputs */

node[work[num][i]].sum += syn[point].weight

* node[syn[point].neuron].out;

point = syn[point].point;

}

/* apply activation to inputs */

node[work[num][i]].out = G(node[work[num][i]].sum);

}

}

 

 

 

 

 

void net_back_prop(rate, smooth)

double rate, smooth;

/*

* UPDATE NETWORK CONNECTION WEIGHTS

*

* The back propagation is done here based upon the results

* of the last network solution, and the net expect values.

*

* VARIABLES: rate: the learning rate to be used

* smooth: the smoothing function value

*

* July 21st, 1989.

*/

{

static int i, point;

static double delta;

 

 

for(i = 0; i <= pnt_out[num]; i++){ /* calculate output deltas */

node[out_node[num][i]].delta = G_DIF(node[out_node[num][i]].sum) *

(node[out_node[num][i]].expect - node[out_node[num][i]].out);

}

 

/* backpropagate errors with order */

/* processing list */

for(i = pnt_out[num] + 1; i < back_pnt[num]; i++){

point = node[back[num][i]].forward_point;

node[back[num][i]].delta = 0.0;

while(point != LAST){

node[back[num][i]].delta += syn[point-1].weight*

node[syn[point].neuron].delta;

point = syn[point].point;

}

node[back[num][i]].delta *= G_DIF(node[back[num][i]].sum);

}

 

/* update connection weights with */

/* modified delta rule */

for(i = 0; i < back_pnt[num]; i++){

point = node[back[num][i]].point;

while(point != LAST){

delta = (rate * node[back[num][i]].delta

* node[syn[point].neuron].out)

+ (smooth * syn[point].last_weight);

if(updater == POINT_TRAIN){ /* update now */

syn[point].last_weight = delta;

syn[point].weight += delta;

}

if(updater == SET_TRAIN){ /* save updates til later */

syn[point].sum += delta;

}

point = syn[point].point;

}

}

if(updater == SET_TRAIN) sum_cnt[num]++; /* count training tries */

}

 

 

 

 

 

 

void net_update()

/*

* MAKE WEIGHT CHANGES FOR SET TRAINING

*

* This routine will update the network based upon the learned weights,

* if the training is being done with set training. Set training refers to

* finding all of the suggested weight changes for the set, and then

* applying the average of all. Point training refers to updating the

* weight values after every estimation of the required change.

*

* Oct 9th, 1989.

*/

{

static int i, point;

static double delta;

 

if(updater == SET_TRAIN){ /* only for set training */

for(i = pnt_in[num]+1; i < work_pnt[num]; i++){

point = node[work[num][i]].point;

while(point != LAST){

delta = syn[point].sum / sum_cnt[num];

syn[point].sum = 0.0;

syn[point].last_weight = delta;

syn[point].weight += delta;

point = syn[point].point;

}

}

sum_cnt[num] = 0;

}

}

 

 

 

 

 

 

double G(input)

double input;

/*

* ACTIVATION FUNCTIONS

*

* This is the activation function which takes the input and

* returns the output.

*

* VARIABLES: input: the sum from within the neuron

*

* RETURNS: The value of the activation function for the sum

*

* July 21st, 1989.

*/

{

static double output;

 

if(activation_type == SUM)

output = input;

if(activation_type == SIGMOID)

output = 1.0/(1.0+exp(-input));

if(activation_type == NEW_SIGMOID)

output = 1.0/(1.0+exp(-input)) - .5;

if(activation_type == TANH)

output = tanh(input);

return(output);

}

 

 

 

 

 

double G_INVERSE(output)

double output;

/*

* GIVE SUM WHICH PRODUCES ACTIVATION OUTPUT

*

* This will give the input to the activation function, based

* upon the output of the activation function.

*

* VARIABLE: output: the output of the activation function

*

* RETURNS: the input the activation function required for the output

*

* July 21st, 1989.

*/

{

static double input;

 

if(activation_type == SUM)

input = output;

if(activation_type == SIGMOID)

input = -log(1.0/output - 1.0);

if(activation_type == NEW_SIGMOID)

input = -log(1.0/(output+.5) - 1.0);

if(activation_type == TANH)

input = atanh(output);

return(input);

}

 

 

 

 

 

 

double G_DIF(input)

double input;

/*

* DERIVATIVE OF ACTIVATION FUNCTION

*

* This takes the input and returns the first derivative of the

* activation function.

*

* VARIABLE: input: the input to the activation function

*

* RETURNS: the derivative of the activation function with the input

*

* July 21st, 1989.

*/

{

static double output, outter;

if(activation_type == SUM)

output = 1.0;

if(activation_type == SIGMOID){

outter = G(input);

output = outter * (1.0 - outter);

}

if(activation_type == NEW_SIGMOID){

outter = G(input)+.5;

output = outter * (1.0 - outter);

}

if(activation_type == TANH){

outter = cosh(input);

output = 1.0/outter/outter;

}

return(output);

}

 

 

 

 

 

void net_put_string(n, string)

int n;

char *string;

/*

* STORE MESSAGE STRING

*

* This feature was added so that variables and message strings could be

* stored with a network in a disk file.

*

* VARIABLES: n: number of message string

* string: a string to be stored in location n

*

* August 8th, 1990.

*/

{

strcpy(net_strings[n], string);

}

 

 

 

 

 

void net_get_string(n, string)

int n;

char *string;

/*

* RETURN STORED MESSAGE

*

* This allows message strings stored to be returned

*

* VARIABLES: n: number of message location

* string: the message string to be returned

*

* August 8th, 1990.

*/

{

strcpy(string, net_strings[n]);

}

 

 

 

 

 

void net_number(number)

int number;

/*

* SET NET NUMBER

*

* In an effort to allow more than one network to be used by the same

* program, it was necessary to create the ability to choose networks.

* Thus all the networks are all defined together with the same neuron

* numbers, but when dealing with inputs and outputs, and processing,

* the network number must be set to indicate the network to be used.

*

* VARIABLE: number: the current network number

*

* August 8th, 1990.

*/

{

if(number >= NUMBER_NETS){ /* valid network numbers are 0 to NUMBER_NETS-1 */

printf("Too many nets defined !!! \007 \n");

} else num = number;

}