Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. Its weight parameters require a large amount of memory. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks. Since their architectures are much more complicated than that of a CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is derived from the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations show that the noise tolerance of a CHNN with the bicomplex projection rule equals or exceeds that of quaternion- and bicomplex-valued Hopfield neural networks. By computer simulations, we also find that the projection rule for hyperbolic-valued Hopfield neural networks in synchronous mode maintains high noise tolerance.
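The multistate activation function referred to here is commonly realized as phase quantization: a complex net input is mapped to the nearest of K unit phasors exp(2πik/K). A minimal sketch, using one common rounding convention (not necessarily the paper's exact definition):

```python
import cmath
import math

def csign(u, K):
    """Multistate activation for a CHNN neuron: quantize the phase of the
    complex net input u to the nearest of K states exp(2*pi*i*k/K)."""
    theta = cmath.phase(u) % (2 * math.pi)          # phase in [0, 2*pi)
    k = int(math.floor(theta / (2 * math.pi / K) + 0.5)) % K  # nearest state
    return cmath.exp(2j * math.pi * k / K)
```

With K = 4 the states are 1, i, -1, -i, so each neuron stores one of four values; larger K gives finer multistate resolution at the cost of noise margin.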
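The decomposition of bicomplex numbers cited in the abstract can be illustrated concretely. Writing a bicomplex number as a pair of complex numbers (a, b) representing a + jb with j² = -1, the idempotent decomposition maps it to the pair (a - ib, a + ib), in which bicomplex multiplication becomes two independent complex multiplications. A minimal sketch (function names are illustrative, not from the paper):

```python
def bc_mul(x, y):
    """Direct bicomplex multiplication: (a + j*b)(c + j*d) with j^2 = -1,
    where a, b, c, d are ordinary complex numbers."""
    a, b = x
    c, d = y
    return (a * c - b * d, a * d + b * c)

def decompose(x):
    """Idempotent decomposition of a + j*b into the component pair
    (a - i*b, a + i*b); multiplication is componentwise in this form."""
    a, b = x
    return (a - 1j * b, a + 1j * b)

def recompose(p):
    """Inverse of decompose: recover (a, b) from (a - i*b, a + i*b)."""
    m, q = p
    return ((m + q) / 2, (q - m) / 2j)
```

Because the two components evolve independently, a bicomplex-valued Hopfield network splits into two CHNNs; this is the structure the bicomplex projection rule exploits to cut the number of weight parameters.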
Related Results
Noise Robust Projection Rule for Klein Hopfield Neural Networks
Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural net...
Like Me or Like Us
Research has shown abundant evidence for social projection, that is, the tendency to expect similarity between oneself and others (Krueger, 1998a, 1998b). This effect is stronge...
Intersemiotic projection and academic comics: towards a social semiotic framework of multimodal paratactic and hypotactic projection
Intersemiotic projection is one of the most common configurations in the knowledge construction process of academic comics. Although previous studies addres...
On Exponential Convergence Conditions of an Extended Projection Neural Network
Recently the extended projection neural network was proposed to solve constrained monotone variational inequality problems and a class of constrained nonmonotonic variational ineq...
Impulse noise: Comparison of dose calculated by 5-dB rule and 3-dB rule
In the past several years, there have been many proposals concerning incorporation of impulse noise into total worker exposure. This is a problem of particular concern in the Unite...
Synaptic Dynamics in Analog VLSI
Synapses are crucial elements for computation and information transfer in both real and artificial neural systems. Recent experimental findings and theoretical models of pulse-base...
Redundancy-Aware Pruning of Convolutional Neural Networks
Pruning is an effective way to slim and speed up convolutional neural networks. Generally previous work directly pruned neural networks in the original feature space without consid...
On Convergence Conditions of an Extended Projection Neural Network
The output trajectory convergence of an extended projection neural network was developed under the positive definiteness condition of the Jacobian matrix of nonlinear mapping. This...