Re: [Neurostat-develop] API proposal

From: Joseph
Subject: Re: [Neurostat-develop] API proposal
Date: Fri, 29 Mar 2002 13:40:18 +0100

Ok, maybe we can adopt the second method (Joseph's way), not because it
is my way but because I think it is the more classical approach, better
documented in books dealing with neural networks.

The API for back-propagation then becomes:

 int weight_derivatives(MLP *arch, Activation *act, double *jacobian,
double *w_jacobian)
 int inputs_derivatives(MLP *arch, Activation *act, double *jacobian,
double *in_jacobian)

Next, I propose that the MLP be an aggregate of a simpler structure:
the layer.
This structure is basically an MLP restricted to one layer, so the
previous MLP API mainly has to be replicated for the layers,
which yields the following API:

int layer_propagate(Layer *lay, double *weights, double *input,
double *output, activation *act, activation *dact)

int layer_back_propagate(Layer *lay, double *weights, double
*local_deriv_out, double *local_deriv_in, activation *dact)
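To make the proposal concrete, here is a minimal sketch of how layer_propagate could look. The Layer fields (n_in, n_out), the flat row-major weight layout with a trailing bias per unit, and the scalar activation typedef are all assumptions of mine, not part of the proposal; the dact argument is omitted here for brevity (it would store the activation derivatives needed by the backward pass).

```c
#include <stddef.h>

/* Hypothetical Layer structure -- the field names n_in/n_out are
   assumptions for this sketch, not part of the proposal. */
typedef struct {
    int n_in;   /* number of inputs to the layer */
    int n_out;  /* number of units in the layer  */
} Layer;

/* Assumed activation type: a plain scalar function. */
typedef double (*activation)(double);

/* Sketch of layer_propagate: output = act(W * input + b).
   weights is assumed to hold n_out rows of n_in weights
   followed by one bias each. */
int layer_propagate(Layer *lay, double *weights, double *input,
                    double *output, activation act)
{
    for (int j = 0; j < lay->n_out; j++) {
        const double *w = weights + (size_t)j * (lay->n_in + 1);
        double pre = w[lay->n_in];            /* bias term */
        for (int i = 0; i < lay->n_in; i++)
            pre += w[i] * input[i];           /* weighted sum */
        output[j] = act(pre);                 /* activation */
    }
    return 0;
}

/* Identity activation, for testing only. */
static double ident(double x) { return x; }
```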

where:

- lay is a pointer to the layer structure
- weights is a pointer to the parameters of the layer
- local_deriv_out is the derivative with respect to the pre-output of the layer
- local_deriv_in is the derivative with respect to the pre-input of the layer
- dact is a pointer to the derivative of the activation of the layer

Note that "local_deriv_out" and "local_deriv_in" are pointers to
addresses belonging to the Jacobian of the error, and "weights" is a
pointer to an address belonging to the parameter vector of the MLP.
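A sketch of the corresponding backward step, under the same assumed layout as the forward sketch. Here dact is taken to be the array of activation derivatives saved during the forward pass, and local_deriv_out the error derivative with respect to the layer output; both conventions are assumptions, since the proposal does not pin them down.

```c
#include <stddef.h>

/* Same hypothetical Layer structure as in the forward sketch. */
typedef struct {
    int n_in;
    int n_out;
} Layer;

/* Sketch of layer_back_propagate:
   local_deriv_in = W^T * (local_deriv_out .* dact).
   In the full MLP, local_deriv_out and local_deriv_in would simply
   point into the error Jacobian, and weights into the parameter
   vector, as remarked above. */
int layer_back_propagate(Layer *lay, double *weights,
                         double *local_deriv_out,
                         double *local_deriv_in, double *dact)
{
    for (int i = 0; i < lay->n_in; i++)
        local_deriv_in[i] = 0.0;
    for (int j = 0; j < lay->n_out; j++) {
        double d = local_deriv_out[j] * dact[j];  /* pre-activation derivative */
        const double *w = weights + (size_t)j * (lay->n_in + 1);
        for (int i = 0; i < lay->n_in; i++)
            local_deriv_in[i] += d * w[i];        /* bias does not propagate back */
    }
    return 0;
}
```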

The calculation of the derivative with respect to the network weights
is done by:

int weight_layer_derivative(Layer *lay, Activation *act, double
*local_deriv_out, double *w_deriv)

"w_deriv" is an adresse belonging to the w_jacobian vector.

The derivatives with respect to the layer inputs are not needed,
except by the inputs_derivatives function.

I know I am getting into details early, but Dimitri needs this API to
begin coding the layers and then the MLP.

Comments are naturally welcome.

