Emergence of Subcritical Bifurcations in a System of Randomly Coupled Supercritical Andronov-Hopf Oscillators: A Potential Mechanism for Neural Network Type Switching
Keith Hayton∗ and Dimitrios Moirogiannis†
Center for Studies in Physics and Biology, The Rockefeller University
(Dated: August 23, 2019)
Abstract
Experimental evidence suggests that the computational state of cortical systems changes according to behavioral and stimulus context. However, it is still unknown what mechanisms underlie this adaptive processing in cortical circuitry. In this paper, we present a model of randomly coupled supercritical Andronov-Hopf oscillators which can act as an adaptive processor by exhibiting drastically different dynamics depending on the value of a single network parameter. Despite being composed only of supercritical subunits, the full system can exhibit either supercritical or subcritical Andronov-Hopf bifurcations. This model might provide a novel mechanism for switching between globally asymptotically stable and nonhyperbolic neural network types in pattern recognition theory.

∗ [email protected]; author contributed equally
† [email protected]; author contributed equally

INTRODUCTION
A growing body of evidence suggests that cortical areas are not fixed in function but instead act as adaptive processors [Kirst 2017], with intrinsic cortical circuits that are strongly modulated by feedback from higher cortical areas [Gilbert 2007, Gilbert 2013] and incoming sensory information [Kapadia 1999, Sceniak 1999, Nauhaus 2009, Yan 2012, Hayton 2018a]. This is in contrast to the classical feedforward view of the cortex as a hierarchical series of cortical areas which encode an increasingly more complex description of sensory input [Gilbert 2007, Gilbert 2013]. How the brain is able to dynamically reconfigure its computational state is largely unknown and a matter of active research [Kirst 2017].

In this paper, we present a toy model of cortical circuitry consisting of a network of damped, identical oscillators coupled through a
Wigner real symmetric matrix [Tao 2008, Anderson 2010, Tao 2012, O'Rourke 2016]. Each oscillator in the network is poised at a supercritical
Andronov-Hopf bifurcation [Poincaré 1893, Hopf 1942, Andronov 2013]. To study the dynamics [Iooss 1999, Wiggins 2003, Kuznetsov 2013] of the system, we follow the statistical center manifold reduction approach introduced in [Moirogiannis 2019]. We show that by changing a single network parameter the full system can undergo either a supercritical or subcritical Andronov-Hopf bifurcation. The transformation between supercritical and subcritical occurs despite the system being exclusively composed of supercritical oscillators. This model is an example of how a network of fixed computational subunits can act as an adaptive processor by drastically switching its dynamic regime without affecting the subunit dynamics. Additionally, we explore how this network model might also act as a neuroscience-inspired adaptive pattern recognition processor by switching between neural network types, specifically globally asymptotically stable and nonhyperbolic.

ANALYSIS
We consider the following system of $N$ coupled, damped, identical oscillators:

$$\dot{x}_i = y_i + a x_i^2 + b x_i^3 + \sum_j M_{ij} x_j \qquad (1)$$
$$\dot{y}_i = -x_i$$

where $i \in \{1, \dots, N\}$ and $a, b \in \mathbb{R}$ are system parameters. The individual subunits

$$\dot{x}_i = y_i + a x_i^2 + b x_i^3 \qquad (2)$$
$$\dot{y}_i = -x_i$$

are linearly coupled through the variables $x_i$, with the connectivity matrix $M \in \mathbb{R}^{N \times N}$ given by a random matrix with all eigenvalues negative and contained within a compact set in the complex plane, except for a single leading eigenvalue $\lambda$, which will act as a bifurcation parameter. The construction of such a matrix is outlined in [Moirogiannis 2019].

In general, a system

$$\dot{x} = y + f(x, y) \qquad (3)$$
$$\dot{y} = -x + g(x, y)$$

is poised at an Andronov-Hopf bifurcation if $f$ and $g$, which capture all nonlinearities of order $\geq 2$, ensure that the first Lyapunov coefficient
$$l(0) = \tfrac{1}{16}\left(f_{xxx} + f_{xyy} + g_{xxy} + g_{yyy}\right) + \tfrac{1}{16}\left(f_{xy}(f_{xx} + f_{yy}) - g_{xy}(g_{xx} + g_{yy}) - f_{xx} g_{xx} + f_{yy} g_{yy}\right)$$
is nonzero [Kuznetsov 2013]. The bifurcation is subcritical (supercritical) iff $l(0) > 0$ ($l(0) < 0$).

For the individual subunits described in (2), the first Lyapunov coefficient is given by $l(0) = \tfrac{1}{16}(f_{xxx} + f_{xyy}) + \tfrac{1}{16} f_{xy}(f_{xx} + f_{yy}) = \tfrac{3}{8} b$, where $b$ is the coefficient of the cubic term in (2). Therefore, the type of bifurcation that a single subunit is poised at is determined exclusively by $b$, with the bifurcation being subcritical (supercritical) iff $b > 0$ ($b < 0$); the coefficient $a$ of $x_i^2$ in (2) does not play a role.

We will now proceed to calculate the Lyapunov coefficient for the full system. Using the statistical center manifold reduction technique outlined in [Moirogiannis 2019], it has been shown that at the bifurcation point $\lambda = 0$, the system of equations in (1) can be reduced to a set of restricted equations on the center manifold. Up to third order, these reduced equations are given by:

$$\dot{x} = y + \frac{a}{\sqrt{N}} \Gamma_{2\delta_1,\bullet}\, x^2 + \left(\frac{b}{\sqrt{N}} \Gamma_{3\delta_1,\bullet} + \frac{2a^2}{\sqrt{N}} \sum_{k=2}^{N} \Gamma_{2\delta_1+\delta_k,\bullet}\, \alpha_k\right) x^3 + \frac{a^2}{\sqrt{N}} \left(\sum_{k=2}^{N} \Gamma_{2\delta_1+\delta_k,\bullet}\, \alpha_k\right) x^2 y + \frac{a^2}{\sqrt{N}} \left(\sum_{k=2}^{N} \Gamma_{2\delta_1+\delta_k,\bullet}\, \alpha_k\right) x y^2 + O(4) \qquad (4)$$
$$\dot{y} = -x,$$

where the $\Gamma$'s are random variables which depend on the structure of the eigenvectors and are defined in [Moirogiannis 2019]. In the limit $N \to \infty$, the $\Gamma$'s can be evaluated using moments of the statistical distribution of the eigenvectors [O'Rourke 2016, Moirogiannis 2019].

By analyzing the reduced equations, it can be proved that the system undergoes an Andronov-Hopf bifurcation when $\lambda = 0$. Observe that the reduced equations are of the form (3), and thus the Lyapunov coefficient can be easily calculated:
$$l(0) = \frac{b}{\sqrt{N}}\, \Gamma_{3\delta_1,\bullet} + \frac{a^2}{\sqrt{N}} \sum_{k=2}^{N} \Gamma_{2\delta_1+\delta_k,\bullet}\, \Gamma^{k}_{2\delta_1,\bullet}\, \frac{-\lambda_k}{\lambda_k^2 + 9} \neq 0.$$
As $l(0)$ only vanishes along a single parabolic curve in the $(a, b)$ parameter plane, the system typically undergoes an Andronov-Hopf bifurcation as $\lambda$ crosses the imaginary axis. The bifurcation is subcritical (supercritical) iff $l(0) > 0$ ($l(0) < 0$).

Moreover, assuming the connection matrix $M$ is normal, it can be easily shown that the term $\frac{1}{\sqrt{N}} \sum_{k=2}^{N} \Gamma_{2\delta_1+\delta_k,\bullet}\, \Gamma^{k}_{2\delta_1,\bullet}\, \frac{-\lambda_k}{\lambda_k^2 + 9}$ is typically positive. First, since $M$ is normal, the $\Gamma$'s can be greatly simplified: $\Gamma_{2\delta_1+\delta_k,\bullet} = \Gamma^{k}_{2\delta_1,\bullet} = \sum_\phi V_{\phi,k} V_{\phi,1}^2$, where $(V_{1,k}, \dots, V_{N,k})^t$ is the $k$-th vector in the orthonormal base of eigenvectors (with eigenvalue $\lambda_k$) [Moirogiannis 2019]. It then follows that
$$\frac{1}{\sqrt{N}} \sum_{k=2}^{N} \Gamma_{2\delta_1+\delta_k,\bullet}\, \Gamma^{k}_{2\delta_1,\bullet}\, \frac{-\lambda_k}{\lambda_k^2 + 9} = \frac{1}{\sqrt{N}} \sum_{k=2}^{N} \left(\sum_\phi V_{\phi,k} V_{\phi,1}^2\right)^2 \frac{-\lambda_k}{\lambda_k^2 + 9} > 0$$
almost surely, since every $\lambda_k$ with $k \geq 2$ is negative. The term multiplying $b$ in the Lyapunov coefficient is also positive, $\Gamma_{3\delta_1,\bullet} > 0$, since $\Gamma_{3\delta_1,\bullet} = \sum_\phi V_{\phi,1}\, V_{\phi,1}^3 = \sum_\phi V_{\phi,1}^4 > 0$.

We now examine how the bifurcation of the full system and of the individual subunits varies with the parameters $a$ and $b$. If $b > 0$, the full system along with each subunit ($l(0)_{\text{subunit}} = \tfrac{3}{8} b$) is subcritical regardless of the value of $a$.
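The claim that the sign of $b$ alone sets a subunit's bifurcation type can be checked numerically. The following is a minimal sketch (our own illustration, not code from the paper; the integrator, step size, initial radius, and parameter values are arbitrary choices) that integrates a single subunit (2) at the bifurcation point and compares the orbit amplitude to its starting value:

```python
import numpy as np

def simulate_subunit(a, b, r0=0.1, T=50.0, dt=0.01):
    """RK4-integrate one subunit dx/dt = y + a*x^2 + b*x^3, dy/dt = -x,
    starting on the x-axis at radius r0; return the final radius."""
    def f(s):
        x, y = s
        return np.array([y + a * x**2 + b * x**3, -x])
    s = np.array([r0, 0.0])
    for _ in range(int(T / dt)):
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return float(np.hypot(s[0], s[1]))

# At criticality the sign of b decides the subunit type; a plays no role.
print(simulate_subunit(a=0.0, b=-1.0))  # decays below 0.1 (supercritical)
print(simulate_subunit(a=0.0, b=+1.0))  # grows above 0.1 (subcritical)
print(simulate_subunit(a=0.5, b=-1.0))  # still decays: a does not matter
```

This agrees with an averaging argument: over one cycle the radial dynamics obey $\dot{r} \propto b\, r^3$, so the amplitude decays for $b < 0$ and grows for $b > 0$.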
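The criticality switch of the full system can also be seen directly from the sign of the reduced Lyapunov coefficient. The sketch below is our own illustration, not the construction of [Moirogiannis 2019]: the bulk spectrum $[-2, -1]$, seed, and network size are arbitrary. It builds a normal connectivity matrix with a single leading eigenvalue $\lambda = 0$ and the rest negative, then evaluates the two competing terms of the coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50

# Normal M with prescribed spectrum: leading eigenvalue 0, bulk in [-2, -1]
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))  # random orthonormal eigenbasis
lam = np.concatenate(([0.0], -1.0 - rng.random(N - 1)))
M = Q @ np.diag(lam) @ Q.T

V = Q              # columns V[:, k] are orthonormal eigenvectors of M
v1 = V[:, 0]       # leading eigenvector

# The two pieces of the reduced-system Lyapunov coefficient
# (common 1/sqrt(N) factor dropped -- only the sign matters):
term_b = np.sum(v1**4)                              # multiplies b; positive
term_a = sum((V[:, k] @ v1**2)**2 * (-lam[k]) / (lam[k]**2 + 9.0)
             for k in range(1, N))                  # multiplies a^2; positive

b = -1.0                                            # supercritical subunits
gamma = np.sqrt(term_b / term_a)                    # threshold on |a|
l0 = lambda a: b * term_b + a**2 * term_a

print(l0(0.5 * gamma * np.sqrt(-b)) < 0)  # below threshold: supercritical
print(l0(2.0 * gamma * np.sqrt(-b)) > 0)  # above threshold: subcritical
```

The sign of `l0` flips exactly at $|a| = \gamma\sqrt{-b}$, so a single scalar parameter moves the whole network between the two bifurcation types while every subunit stays supercritical.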
When $b < 0$, the subunits are always supercritical, but the full system can be either subcritical or supercritical depending on the value of $|a|$: the full system is subcritical (supercritical) iff $|a| > \gamma \sqrt{-b}$ ($|a| < \gamma \sqrt{-b}$), where
$$\gamma = \sqrt{\frac{\sum_\phi V_{\phi,1}^4}{\sum_{k=2}^{N} \left(\sum_\phi V_{\phi,k} V_{\phi,1}^2\right)^2 \frac{-\lambda_k}{\lambda_k^2 + 9}}}.$$
Thus, it is possible for the full system to undergo a subcritical Andronov-Hopf bifurcation despite each individual subunit being supercritical.

ADAPTIVE PATTERN RECOGNITION
We now explain how our results might be applicable to pattern recognition and switching of neural network types. Assuming static input patterns, neural networks can be broadly categorized into three types: multiple attractor (MA), globally asymptotically stable (GAS), and nonhyperbolic (NH) neural networks [Hoppensteadt 2012]. In MA-type networks, which include Hopfield and Cohen-Grossberg networks, each input pattern converges to one of multiple attractors; each attractor corresponds to a different stored memory pattern. GAS-type networks, on the other hand, store memories by associating a given input pattern with the globally asymptotically stable state of the system. In this case, a memory is stored as a specific location of the attractor, and an input pattern is presented to the network as a parameter which controls the attractor location [Hoppensteadt 2012].

Finally, NH-type networks have nonhyperbolic equilibria which affect the global dynamics of the system. Input in NH types is given to the system in the form of a bifurcation parameter that perturbs the nonhyperbolic equilibrium, causing it to lose stability or disappear altogether. Only a local analysis near the equilibrium is needed to understand global behavior, since the equilibrium is a point of convergence of the curves, called separatrices, separating the different attraction domains of the system [Hoppensteadt 2012].
Nonhyperbolic equilibrium dynamics have been the focus of an extensive number of studies in neuroscience. These include entire-hemisphere ECoG recordings [Solovey 2012, Alonso 2014, Solovey 2015], experimental studies in premotor and motor cortex [Churchland 2012], theoretical [Seung 1998] and experimental [Seung 2000] studies of slow manifolds (a specific case of center manifolds) in oculomotor control, slow manifolds in decision making [Machens 2005], Andronov-Hopf bifurcations [Poincaré 1893, Hopf 1942, Andronov 2013] in the olfactory system [Freeman 2005] and cochlea [Choe 1998, Eguíluz 2000, Camalet 2000, Kern 2003, Duke 2003, Magnasco 2003, Hayton 2018b], a nonhyperbolic model of primary visual cortex (V1) [Yan 2012, Hayton 2018a], and theoretical work on regulated criticality [Bienenstock 1998].

Following the lead of [Hoppensteadt 2012], we consider a specific case of (1), where the $N$ damped oscillators in the system, given by (2), are neural relaxation oscillators with input matrix $R = \mathrm{diag}(r_1, \dots, r_N)$, where $r_i$ is the external input to the $i$-th oscillator, and $M = R + C$, where $C$ represents the linear connectivity between the $x_i$'s. In the case of neural relaxation oscillators, $x_i$ and $y_i$ represent rescaled activities of excitatory and inhibitory populations, respectively. As in [Hoppensteadt 2012], we assume Dale's principle holds, $C_{ij} \geq 0 \;\forall\, i, j$, and, because of Hebbian learning mechanisms, that $C$ is symmetric. It has been shown, using the Perron-Frobenius theorem, that for any input matrix $R$, the leading eigenvalue $\lambda$ of $R + C$ is simple, and if $\lambda = 0$, the full system undergoes an Andronov-Hopf bifurcation [Hoppensteadt 2012].
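The Perron-Frobenius statement invoked above can be illustrated with a small numerical example. This is a sketch of our own; the random choices of $C$ and $r$ are purely illustrative and not taken from [Hoppensteadt 2012]:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8

# Hebbian (symmetric) connectivity respecting Dale's principle: C_ij >= 0
H = rng.random((N, N))
C = (H + H.T) / 2.0

r = rng.random(N)     # external inputs r_1, ..., r_N
R = np.diag(r)        # diagonal input matrix

# R + C is symmetric with positive entries, so by the Perron-Frobenius
# theorem its leading eigenvalue is real and simple.
vals = np.linalg.eigvalsh(R + C)  # eigenvalues in ascending order
print(vals[-1] > vals[-2])        # leading eigenvalue is simple -> True
```

Changing the inputs $r_i$ moves this simple leading eigenvalue, which is exactly the bifurcation parameter $\lambda$ of the analysis above.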
Using this result, it has also been shown that such a system has the potential to perform pattern recognition on a set of input vectors. The exact construction of the pattern recognition algorithm is still an open problem, as it requires the construction of a specific matrix for all input vectors; however, once this matrix is constructed, the resulting algorithm will depend heavily on the type of bifurcation the neural network undergoes. If the bifurcation is supercritical, each input vector is mapped to a stable limit cycle with a location corresponding to one of the memorized pattern vectors. This is an example of a GAS-type neural network. On the other hand, if the bifurcation is subcritical, the trajectory corresponding to the initial input vector leaves the vicinity of the equilibrium point along a direction associated with one of the memorized patterns, which is an example of an NH-type neural network.

Our earlier analysis of the coupled supercritical oscillators with normal connectivity matrix directly applies to this setting, since, according to the Perron-Frobenius theorem, a change in input corresponds to moving the leading eigenvalue, as we assumed above in our calculations. Thus, the network parameter $|a|$ can switch the neural network type between GAS (supercritical) and NH (subcritical).

CONCLUSION