A mathematical model for feed-forward neural networks: theoretical description and parallel applications

Abstract: We present a general model for differentiable feed-forward neural networks. Its mathematical description covers the standard multi-layer perceptron as well as its common variants. These standard structures assume a strong relationship between the network links and the neuron weights; our generalization removes this assumption. Since the model is especially well suited to gradient-based learning algorithms, we present a direct and a backward algorithm for differentiating the network output, and estimate the theoretical computation times of both. We then describe a direct application of this model: a parallelization method that uses the expression of our general backward differentiation to overlap communication times.
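The abstract alone does not give the authors' formulation, but the following minimal Python sketch illustrates the kind of structure it describes: a feed-forward network as a directed acyclic graph of differentiable nodes, where each node carries its own weight vector independently of the incoming links, a forward evaluation in topological order, and a single backward (reverse-mode) sweep that yields the gradient of the output with respect to every weight. All names (Node, forward_pass, backward_pass) and the choice of a tanh activation are illustrative assumptions, not taken from the report.

```python
"""Hypothetical sketch of a generalized feed-forward network; not the report's formulation."""
import math
from typing import Dict, List, Tuple


class Node:
    """A neuron whose weights belong to the node itself, not to its links."""

    def __init__(self, inputs: List[str], weights: List[float]):
        self.inputs = inputs      # names of predecessor nodes or network inputs
        self.weights = weights    # one weight per incoming value (a convention of this sketch)

    def forward(self, x: List[float]) -> float:
        s = sum(w * xi for w, xi in zip(self.weights, x))
        return math.tanh(s)       # any differentiable map would do

    def backward(self, x: List[float], grad_out: float) -> Tuple[List[float], List[float]]:
        # Returns (gradient w.r.t. inputs, gradient w.r.t. this node's weights).
        s = sum(w * xi for w, xi in zip(self.weights, x))
        ds = grad_out * (1.0 - math.tanh(s) ** 2)
        grad_inputs = [ds * w for w in self.weights]
        grad_weights = [ds * xi for xi in x]
        return grad_inputs, grad_weights


def forward_pass(nodes: Dict[str, Node], order: List[str],
                 inputs: Dict[str, float]) -> Dict[str, float]:
    """Evaluate nodes in topological order; returns every node's output."""
    values = dict(inputs)
    for name in order:
        node = nodes[name]
        values[name] = node.forward([values[p] for p in node.inputs])
    return values


def backward_pass(nodes: Dict[str, Node], order: List[str],
                  values: Dict[str, float], output: str) -> Dict[str, List[float]]:
    """Reverse-mode differentiation: one backward sweep gives d(output)/d(weights)
    for every node, which is what gradient-based learning needs."""
    grad = {name: 0.0 for name in values}
    grad[output] = 1.0
    grad_weights: Dict[str, List[float]] = {}
    for name in reversed(order):
        node = nodes[name]
        x = [values[p] for p in node.inputs]
        g_in, g_w = node.backward(x, grad[name])
        grad_weights[name] = g_w
        for p, g in zip(node.inputs, g_in):
            grad[p] += g
    return grad_weights


if __name__ == "__main__":
    # Two inputs, one hidden node, one output node; the output also reads x2 directly,
    # which a strictly layered perceptron would not allow.
    nodes = {
        "h": Node(inputs=["x1", "x2"], weights=[0.5, -0.3]),
        "y": Node(inputs=["h", "x2"], weights=[1.2, 0.7]),
    }
    order = ["h", "y"]
    values = forward_pass(nodes, order, {"x1": 1.0, "x2": 2.0})
    grads = backward_pass(nodes, order, values, output="y")
    print("output:", values["y"])
    print("weight gradients:", grads)
```

In a parallel setting of the kind the abstract mentions, the per-node gradients computed in the backward sweep can be sent as soon as they are available, so communication can be overlapped with the remaining computation; the sketch above only shows the sequential differentiation itself.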
Document type: Research report

Cited literature: 11 references

https://hal-lara.archives-ouvertes.fr/hal-02101945
Contributor: Colette Orange
Submitted on: Wednesday, April 17, 2019 - 9:10:27 AM
Last modification on: Friday, May 17, 2019 - 1:39:21 AM

File

RR1995-23.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02101945, version 1

Citation

Cedric Gegout, Bernard Girau, Fabrice Rossi. A mathematical model for feed-forward neural networks: theoretical description and parallel applications. [Research Report] LIP RR-1995-23, Laboratoire de l'informatique du parallélisme, 1995, 2+12 p. ⟨hal-02101945⟩
