Abstract
We study two classes of sigmoids: the simple sigmoids, defined to be odd, asymptotically bounded, completely monotone functions in one variable, and the hyperbolic sigmoids, a proper subset of the simple sigmoids and a natural generalization of the hyperbolic tangent. We obtain a complete characterization of the inverses of hyperbolic sigmoids in terms of Euler's incomplete beta functions, and describe composition rules that illustrate how such functions may be synthesized from others. These results are applied to two problems. First, we show that, with respect to simple sigmoids, the continuous Cohen-Grossberg-Hopfield model can be reduced to the (associated) Legendre differential equations. Second, we show that the effect of using simple sigmoids as node transfer functions in a one-hidden-layer feedforward network with one summing output may be interpreted as representing the output function as a Fourier series sine transform evaluated at the hidden layer node inputs, thus extending and complementing earlier results in this area.
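As a concrete illustration of the inverse characterization in the prototype case (a sketch only, not the paper's general formula): the hyperbolic tangent, the model hyperbolic sigmoid, has an inverse expressible through the incomplete beta function $B(z; a, b) = \int_0^z t^{a-1}(1-t)^{b-1}\,dt$ via the substitution $t^2 = u$.

```latex
% Inverting tanh through the incomplete beta function B(z; a, b):
\tanh^{-1}(x) \;=\; \int_0^x \frac{dt}{1-t^2}
  \;=\; \frac{1}{2}\int_0^{x^2} u^{-1/2}(1-u)^{-1}\,du
  \;=\; \frac{1}{2}\,B\!\left(x^{2};\;\tfrac{1}{2},\;0\right),
  \qquad 0 \le x < 1.
```

The general characterization in the paper extends this pattern to the full class of hyperbolic sigmoids.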
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 819-835 |
| Number of pages | 17 |
| Journal | Neural Networks |
| Volume | 9 |
| Issue number | 5 |
| DOIs | |
| State | Published - Jul 1996 |
Keywords
- Additive model
- Cohen-Grossberg-Hopfield model
- Fourier transform
- Hypergeometric series
- Legendre equation
- Sigmoid functions
ASJC Scopus subject areas
- Cognitive Neuroscience
- Artificial Intelligence