Abstract

The Hopfield neural network (HNN) is an artificial model derived from brain structure and an important model with proven performance in neurocomputing. In this article, we solve a dynamical model of 3D HNNs via Atangana–Baleanu (AB) fractional derivatives. To find the numerical solution of the considered dynamical model, the well-known Predictor-Corrector (PC) method is used. A number of cases are simulated using two different values of the activation gradient of the neurons as well as six different initial conditions. The results are established for several fractional-order values of the given derivative operator. The objective of this research is to investigate the dynamics of the proposed HNN model at various fractional orders. The nonlocal characteristic of the AB derivative captures memory in the system, which is the main motivation behind this research.

1. Introduction

Neural networks (NNs) are a part of machine learning and lie at the centre of deep learning techniques. Their structure and dynamics are inspired by the human brain, and they mimic the way real neurons signal one another. In artificial intelligence (AI), deep learning, and machine learning, NNs imitate the function of the human brain, helping computer algorithms to locate patterns and solve general problems. As a result of their widespread use in a variety of sectors, NNs have attracted a great deal of attention [1–3]. Neural networks use training data to learn and improve their performance over time. Once these learning techniques have been tuned for accuracy, they become powerful tools in computer science and AI, allowing us to classify and cluster data at high speed. Tasks in speech or image recognition can take minutes rather than the hours required for manual identification by human experts. Various types of NNs exist, each of which is used for a specific purpose.

Hopfield introduced the Hopfield neural network (HNN) in 1984 [4]. Since then, a deeper understanding of the Hopfield neural network's dynamical behaviour has been crucial in the study of engineering and information-processing applications such as pattern identification [5], signal processing, and associative memory [6]. Moreover, several studies have been published on the dynamical characteristics of a range of complex-valued neural network models. The HNN, as previously mentioned, is an artificial model derived from brain dynamics and an important model in neurocomputing [7]. Such a neural model is capable of storing information in a fashion similar to the human brain. Njitacke et al. [8] discussed space magnetization, hysteretic dynamics, and offset boosting in a third-order memristive system. In [9], the authors analyzed the complex structure of a 3D autonomous system without linear terms having a line of equilibria. A study on the control of multistability with selection of a chaotic attractor, along with an application to image encryption, is given in [10]. In [11], a dynamical analysis of a simple autonomous jerk system with multiple attractors is proposed.

Nowadays, fractional-order operators are highly useful for solving a variety of real-world problems [12–14]. The main feature of fractional derivatives is their nonlocality, which helps to capture memory effects in systems. These operators generalize the integer-order operators. To date, fractional operators have been used in various scientific and engineering fields through different kinds of mathematical modeling. Recently, fractional derivatives have been applied to disease dynamics [15, 16], mechanics [17], psychology [18], engineering [19], advanced modeling via fractal-fractional operators [20], etc. Although modeling dynamic systems using fractional calculus was initially met with scepticism, the advantages of fractional operators for describing memory effects are now widely recognized [21, 22]. In [23], the explicit dependence of stability on a variable time delay was presented, and delay-dependent stability switches of linear systems of fractional type were examined. This theory has recently been incorporated into NNs, resulting in fractional-order neural networks (FONNs). Such a modification can strengthen the ability of neurons to process information. Owing to the persistence of researchers, various applications of FONNs have been discovered, including network approximation [24], state estimation [25], system identification [26], robotic manipulators [27], and formation control [28]. Fractional-order elements offer two clear benefits for neurons. On the one hand, fractional calculus, compared to ordinary calculus, gives a far better depiction of memory and hereditary features [29]. On the other hand, fractional-order parameters can improve system performance by adding one degree of freedom [30]. By incorporating memory into NNs, a clear improvement is obtained, and FONNs have already produced some striking results [31, 32].

In this article, we perform some novel mathematical simulations on the dynamical model of 3D HNNs investigated in ref. [33], given as system (1), where the terms are the state variables and the parameters stand for the gradients of the activation functions of the neurons. Firstly, we give some preliminaries related to fractional calculus in Section 2. Then, to solve the dynamical model (1), we generalise the model to the Atangana–Baleanu (AB) fractional derivative with the Mittag–Leffler kernel in Section 3. To investigate the numerical solution of the fractional-order model, we apply the Predictor-Corrector (PC) method. In Section 4, a number of graphs are plotted to check the correctness of the derived solution. Lastly, we give the supporting conclusion.

2. Preliminaries

Several important notions are recalled here.

Definition 1. (see [34]). For a function $f \in H^{1}(a,b)$, where $b > a$ and $\sigma \in [0,1]$, the $\sigma$-AB derivative (in the Caputo sense) is
$$ {}^{ABC}_{\;\;\,a}D^{\sigma}_{t}f(t) = \frac{B(\sigma)}{1-\sigma}\int_{a}^{t} f'(s)\, E_{\sigma}\!\left[-\frac{\sigma}{1-\sigma}(t-s)^{\sigma}\right]\mathrm{d}s, $$
where $E_{\sigma}$ denotes the one-parameter Mittag–Leffler function and $B(\sigma)$, with $B(0) = B(1) = 1$, is the normalization function.
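The Mittag–Leffler function $E_{\sigma}$ appearing in the kernel above has the series representation $E_{\sigma}(z) = \sum_{k \ge 0} z^{k}/\Gamma(\sigma k + 1)$. As a minimal illustration (not part of the original article), it can be evaluated by truncating this series, which is adequate for the moderate arguments arising in the kernel:

```python
import math

def mittag_leffler(sigma, z, terms=120):
    """One-parameter Mittag-Leffler function E_sigma(z), evaluated by
    truncating the defining power series
        E_sigma(z) = sum_{k>=0} z**k / Gamma(sigma*k + 1).
    Direct summation is adequate for moderate |z|."""
    return sum(z**k / math.gamma(sigma * k + 1) for k in range(terms))
```

For $\sigma = 1$ the series reduces to the exponential, $E_{1}(z) = e^{z}$, which provides a quick sanity check.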

Definition 2. (see [34]). The AB fractional integral is given by
$$ {}^{AB}_{\;\;\,a}I^{\sigma}_{t}f(t) = \frac{1-\sigma}{B(\sigma)}f(t) + \frac{\sigma}{B(\sigma)\Gamma(\sigma)}\int_{a}^{t} f(s)(t-s)^{\sigma-1}\,\mathrm{d}s. $$

Lemma 1 (see [34]). For $0 < \sigma < 1$, the solution of the system
$$ {}^{ABC}_{\;\;\,0}D^{\sigma}_{t}x(t) = f(t), \qquad x(0) = x_{0}, $$
is derived by
$$ x(t) = x_{0} + \frac{1-\sigma}{B(\sigma)}f(t) + \frac{\sigma}{B(\sigma)\Gamma(\sigma)}\int_{0}^{t}(t-s)^{\sigma-1}f(s)\,\mathrm{d}s. $$

Lemma 2 (see [35]). Let $n$ be a nonnegative integer and $0 < \sigma \le 1$. Then there exist two constants $C_{\sigma,1}$ and $C_{\sigma,2}$, depending only on $\sigma$, such that
$$ (n+1)^{\sigma} - n^{\sigma} \le C_{\sigma,1}(n+1)^{\sigma-1} $$
and
$$ (n+2)^{\sigma+1} - 2(n+1)^{\sigma+1} + n^{\sigma+1} \le C_{\sigma,2}(n+1)^{\sigma-1}. $$

Lemma 3 (see [35]). Let us suppose that $h = T/N$, where $N$ is a positive integer, and let $t_{j} = jh$ for $j = 0, 1, \ldots, N$. If the error terms satisfy the recursive inequality given in [35], then the errors obey a bound whose constant is independent of both $h$ and $N$.

3. The Structure of the Model

Here we generalise the aforementioned integer-order model (1) to the fractional-order sense by using a nonsingular fractional derivative, the Atangana–Baleanu fractional derivative. The nonlocal characteristic of the AB derivative captures memory in the system, which is the main motivation behind this generalization. The fractional form of the given system (1) in the AB-operator sense, denoted system (7), is obtained by replacing the integer-order time derivative with ${}^{ABC}_{\;\;\,0}D^{\sigma}_{t}$, the AB fractional derivative of order $\sigma$.
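Since the coefficients of model (1) are given in ref. [33] and are not reproduced above, the following sketch only illustrates the general shape of a 3D Hopfield right-hand side, $\dot{x}_{i} = -x_{i} + \sum_{j} w_{ij}\tanh(c_{j}x_{j})$. The connection matrix `W` and the activation gradients `c` below are hypothetical placeholders, not the values of [33]:

```python
import numpy as np

# HYPOTHETICAL placeholder coefficients -- the actual connection matrix
# and activation gradients of model (1) are those given in ref. [33].
W = np.array([[ 2.0, -1.2,  0.0],
              [ 2.0,  1.7,  1.2],
              [-4.8,  0.0,  1.1]])
c = np.array([1.0, 0.5, 1.0])   # activation gradients of the three neurons

def hnn_rhs(t, x):
    """Right-hand side of a generic 3D Hopfield network:
    x_i' = -x_i + sum_j W[i, j] * tanh(c[j] * x[j])."""
    return -x + W @ np.tanh(c * x)
```

The origin is always an equilibrium of this form, since $\tanh(0) = 0$.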

3.1. Derivation of the Numerical Solution

In the current literature, many computational methods are available for solving different types of fractional-order systems. Some very recent work on numerical methods for fractional derivatives can be found in refs. [36, 37]. Here we implement the Predictor-Corrector method for solving the given dynamical model (7). The complete methodology of the method is described in ref. [38]. Firstly, we consider the initial value problem (IVP)
$$ {}^{ABC}_{\;\;\,0}D^{\sigma}_{t}x(t) = f(t, x(t)), \qquad x(0) = x_{0}, \qquad t \in [0, T]. $$

From ref. [38], the equivalent Volterra integral equation is written as
$$ x(t) = x_{0} + \frac{1-\sigma}{B(\sigma)}f(t, x(t)) + \frac{\sigma}{B(\sigma)\Gamma(\sigma)}\int_{0}^{t}(t-s)^{\sigma-1}f(s, x(s))\,\mathrm{d}s. $$

According to the derivation of the method proposed in [38] for fractional order $\sigma$, considering a uniform grid $t_{n} = nh$ with step size $h = T/N$, for $n = 0, 1, \ldots, N$, the corrector term for the IVP (8) is derived by
$$ x_{n+1} = x_{0} + \frac{1-\sigma}{B(\sigma)}f(t_{n+1}, x^{P}_{n+1}) + \frac{\sigma h^{\sigma}}{B(\sigma)\Gamma(\sigma+2)}\left[f(t_{n+1}, x^{P}_{n+1}) + \sum_{j=0}^{n} a_{j,n+1}\, f(t_{j}, x_{j})\right], $$
where
$$ a_{j,n+1} = \begin{cases} n^{\sigma+1} - (n-\sigma)(n+1)^{\sigma}, & j = 0, \\ (n-j+2)^{\sigma+1} + (n-j)^{\sigma+1} - 2(n-j+1)^{\sigma+1}, & 1 \le j \le n, \end{cases} $$
and $x^{P}_{n+1}$ denotes the predictor. The predictor term is given by
$$ x^{P}_{n+1} = x_{0} + \frac{1-\sigma}{B(\sigma)}f(t_{n}, x_{n}) + \frac{\sigma}{B(\sigma)\Gamma(\sigma+1)}\sum_{j=0}^{n} b_{j,n+1}\, f(t_{j}, x_{j}), $$
where
$$ b_{j,n+1} = h^{\sigma}\left[(n+1-j)^{\sigma} - (n-j)^{\sigma}\right]. $$

We can see that our proposed model (7) is simply a generalized form of the considered IVP (8). Hence, the corrector and predictor formulae for the proposed model (7) are obtained by applying the above scheme componentwise to each of the three state equations.
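The corrector and predictor iteration above can be sketched numerically as follows. This is a minimal sketch, not the authors' implementation (their simulations used Mathematica); the function and variable names are our own, and the normalization function is taken as $B(\sigma) \equiv 1$ for simplicity:

```python
import numpy as np
from math import gamma

def abc_pc_solve(f, x0, sigma, T, N, B=lambda s: 1.0):
    """Adams-type Predictor-Corrector iteration for the ABC fractional IVP
        D^sigma x(t) = f(t, x),  x(0) = x0,  0 < sigma <= 1,
    on the uniform grid t_n = n*h, h = T/N."""
    x0 = np.asarray(x0, dtype=float)
    h = T / N
    t = np.linspace(0.0, T, N + 1)
    x = np.zeros((N + 1, x0.size)); x[0] = x0
    F = np.zeros_like(x); F[0] = f(t[0], x[0])
    c1 = (1.0 - sigma) / B(sigma)                          # nonlocal f-term weight
    c2 = sigma * h**sigma / (B(sigma) * gamma(sigma + 2))  # corrector weight
    c3 = sigma * h**sigma / (B(sigma) * gamma(sigma + 1))  # predictor weight
    for n in range(N):
        j = np.arange(n + 1)
        # predictor: product-rectangle rule, b_{j,n+1} = (n+1-j)^s - (n-j)^s
        b = (n + 1.0 - j)**sigma - (n - j).astype(float)**sigma
        xp = x0 + c1 * F[n] + c3 * (b[:, None] * F[:n + 1]).sum(axis=0)
        fp = f(t[n + 1], xp)
        # corrector: product-trapezoidal rule with the standard Adams weights
        a = np.empty(n + 1)
        a[0] = n**(sigma + 1) - (n - sigma) * (n + 1)**sigma
        a[1:] = ((n - j[1:] + 2.0)**(sigma + 1) + (n - j[1:]).astype(float)**(sigma + 1)
                 - 2.0 * (n - j[1:] + 1.0)**(sigma + 1))
        x[n + 1] = x0 + c1 * fp + c2 * (fp + (a[:, None] * F[:n + 1]).sum(axis=0))
        F[n + 1] = f(t[n + 1], x[n + 1])
    return t, x
```

For $\sigma = 1$ and $B \equiv 1$ the nonlocal weight vanishes and the iteration reduces to the classical Adams-Bashforth-Moulton predictor-corrector, which gives a convenient consistency check against known integer-order solutions.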

3.1.1. Stability of the Scheme

Theorem 1. The derived scheme (14) and (15) is conditionally stable.

Proof. Consider perturbations of the predictor and the corrector approximations, respectively. Then the perturbation relations follow from (14) and (15). According to the Lipschitz condition, we get the bound (17). Also, from Eq. (3.18) in [35], we can write the estimate (18). Substituting (18) into (17) yields a recursive inequality whose constant depends on $\sigma$ (from Lemma 2), while the step size $h$ is supposed to be very small. From Lemma 3, it follows that the perturbations remain bounded, which is the desired result.

4. Graphical Observations

Now, to check the role of the proposed Atangana–Baleanu fractional derivative, we plot a number of graphs with the help of the above-mentioned numerical method. The values of the model parameters are fixed. For the first considered value of the activation gradient of the second neuron, we plot the coexistence of four distinct stable states in the group of Figures 1–4. In the frame of Figure 1, the initial values are fixed, and Figures 1(a) and 1(b) represent phase portraits of the state variables at the chosen fractional orders.

Figures 1(c) and 1(e) show further phase portraits, and Figures 1(d) and 1(f) justify the corresponding variations at the given values of the order. Similarly, we consider other cases with different initial values: Figures 2, 3, and 4 each use their own fixed set of initial conditions. Here we can see that the proposed results differ slightly from the previously given results of ref. [33]. When the fractional order is changed, the nature of the assumed multiple attractors also changes. One of the main differences between the proposed fractional-order analysis and the earlier results of [33] is that no perfectly periodic attractors exist at any of the fractional-order values, while the chaotic attractors are obtained in a much clearer form. All simulations are performed using Mathematica software.

Along the same line, for the second considered value of the activation gradient, we observe the coexistence of six different stable states in the group of Figures 5–10. In the frame of Figure 5, the initial values are again fixed, and Figures 5(a) and 5(b) represent phase portraits of the state variables at the chosen fractional orders.

Figures 5(c) and 5(e) show further phase portraits, and Figures 5(d) and 5(f) justify the corresponding variations at the given values of the order. Figures 6–10 each use their own fixed set of initial conditions. Here again we notice that the proposed results differ from the previously given results of ref. [33]. As before, the main difference between the proposed fractional-order analysis and the earlier results of [33] is that no periodic attractors exist at any of the fractional-order values, while the chaotic attractors are obtained in a much clearer form.

5. Conclusions

In this paper, we simulated a dynamical model of 3D HNNs in terms of the Atangana–Baleanu fractional derivative. The numerical solution of the suggested dynamical model has been derived via the Predictor-Corrector method. A number of cases of initial values are considered for a better understanding of the role of initial conditions. Using two different values of the activation gradient of the second neuron, the behaviour of the proposed model is investigated at four different fractional orders. From the given graphical simulations, we conclude that at fractional-order values there is no clear existence of any periodic attractors, but the chaotic attractors are achieved in a much better form. In the future, the proposed dynamical model can be further studied using other fractional operators.

Data Availability

Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.

Conflicts of Interest

The authors declare that they have no competing interests.

Authors’ Contributions

The authors declare that the study was realized in collaboration with equal responsibility. All authors read and approved the final manuscript.

Acknowledgments

The first and fourth authors would like to thank Azarbaijan Shahid Madani University.