A Hopfield neural network in magnetic films with natural learning

Weichao Yu (余伟超),1 Jiang Xiao (萧江),2,3,4,* and Gerrit E. W. Bauer (包格瑞)5,1,6

1 Institute for Materials Research, Tohoku University, Sendai 980-8577, Japan
2 Department of Physics and State Key Laboratory of Surface Physics, Fudan University, Shanghai 200433, China
3 Institute for Nanoelectronics Devices and Quantum Computing, Fudan University, Shanghai 200433, China
4 Shanghai Research Center for Quantum Sciences, Shanghai 201315, China
5 WPI-AIMR, Tohoku University, Sendai 980-8577, Japan
6 Zernike Institute for Advanced Materials, Groningen University, Netherlands
Macroscopic spin ensembles possess brain-like features such as non-linearity, plasticity, stochasticity, self-oscillations, and memory effects, and therefore offer opportunities for neuromorphic computing by spintronic devices. Here we propose a physical realization of artificial neural networks based on magnetic textures, which can update their weights intrinsically via built-in physical feedback, utilizing the plasticity and large number of degrees of freedom of the magnetic domain patterns and without resource-demanding external computations. We demonstrate the idea by simulating the operation of a 4-node Hopfield neural network for pattern recognition.
Introduction.
Tremendous progress in the last decade has propelled neuromorphic computing to the forefront of information technology. However, brain-inspired algorithms are mainly emulated by conventional von Neumann architectures, in which the computing and storage units are physically separated, thereby limiting the power of artificial intelligence algorithms. For example, the power consumption of the AlphaGo processor (~1 MW) is 50,000 times higher than that of a human brain (~20 W) [1, 2]. A more sustainable route towards artificial intelligence is an architecture with hard-wired neuromorphic functions. Spintronic systems based on magnets share features of the brain, such as non-linearity, memory, self-oscillations, stochasticity, plasticity, many degrees of freedom, etc. [3]. These advantages have already led to alternative computing schemes, such as stochastic [4, 5], in-memory logic [6], as well as neuromorphic [7] computing.

The Artificial Neural Network (ANN) is a widely used model for neuromorphic computing, with artificial neurons and synapses that emulate biological systems [8]. Neurons are devices whose output signals spike when the integrated input reaches a certain threshold, while synapses connect the neurons with tunable weights. Both functionalities can be mimicked by spintronic devices. For example, spin-torque oscillators can serve as artificial neurons and recognize spoken digits and vowels [9, 10]. The nonlinear dynamics of skyrmion fabrics can pre-process information for reservoir computing [11-13]. Memristors have a resistance that depends on their history and are widely used as artificial synapses [14-17]. Spintronics offers memristor functionalities by reconfigurable magnetic configurations whose resistance depends on, for example, the positions of magnetic domain walls [18], the number of skyrmions [19-21], or the texture in antiferromagnet/ferromagnet bilayers [22], all of which can be controlled by applied fields or currents.

In most software and hardware realizations of an ANN, the weight-updating process of the synapses is based on external algorithms, such as the back-propagation method, in which external computations allocate new weights based on the results of a previous cycle. Unfortunately, this is more expensive in terms of resource and energy consumption than the inferring process itself [23].
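The two ingredients just described, threshold neurons and history-dependent synapses, can be caricatured in a few lines (a schematic sketch, not a model of any particular device; the class, parameter names, and the linear update rule are illustrative choices):

```python
import numpy as np

class Memristor:
    """Toy synapse: the conductance (the weight) drifts with the history
    of the applied signal, loosely mimicking memristive devices.
    The linear update and the [0, 1] clipping bounds are illustrative."""
    def __init__(self, g=0.5, rate=0.1):
        self.g = g          # conductance = synaptic weight
        self.rate = rate    # plasticity per applied pulse

    def apply(self, v):
        """Pass a voltage pulse; the weight is updated by the pulse itself."""
        self.g = float(np.clip(self.g + self.rate * v, 0.0, 1.0))
        return self.g * v   # current through the synapse

def neuron(inputs, threshold=1.0):
    """Spike (1) when the integrated input crosses the threshold."""
    return 1 if sum(inputs) >= threshold else 0

# History dependence: the same pulse drives a larger current the second time.
syn = Memristor()
i1 = syn.apply(1.0)
i2 = syn.apply(1.0)
```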
In this Letter we propose a platform for neuromorphic computation with superior performance that is based on the plasticity of magnetic textures. We show that an electrically conducting magnetic film may operate as a collection of artificial synapses with weights encoded by the conductances between external electrodes. The network training naturally updates the weights without requiring external computation. As a proof of principle we simulate the on-chip training and inferring (or read-out) of a 4-node Hopfield network implemented on a metallic magnetic thin film with maze spiral domains as shown in Fig. 1(g).
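The hardware demonstration outlined above has a compact software counterpart that is useful to keep in mind: a Hopfield network with Hebbian weights and asynchronous sign updates (a plain numpy sketch of the textbook algorithm, not the micromagnetic simulation itself):

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian weights w_ij = sum_p x_i x_j (zero diagonal): each stored
    pattern deepens its own minimum of the Hopfield energy."""
    n = len(patterns[0])
    w = np.zeros((n, n))
    for x in patterns:
        w += np.outer(x, x)
    np.fill_diagonal(w, 0)
    return w

def recall(w, x, sweeps=5):
    """Asynchronous sign updates until the state reproduces itself
    (a local minimum of the energy function)."""
    x = np.array(x, dtype=float)
    for _ in range(sweeps):
        for i in range(len(x)):
            x[i] = 1 if w[i] @ x >= 0 else -1
    return x

# 4-node network storing {+ - + -}; a corrupted input relaxes back to it.
w = train_hebbian([np.array([1, -1, 1, -1])])
recovered = recall(w, [1, -1, -1, -1])
```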
Modeling.
The dynamics of the magnetization M(r, t) of a ferromagnetic film with saturation magnetization M_s is governed by the Landau-Lifshitz-Gilbert (LLG) equation

∂m/∂t = -γ m × H_eff + α m × ∂m/∂t + τ_d,   (1)

where m = M/M_s, γ is the gyromagnetic ratio, and α is the Gilbert damping constant. The effective magnetic field

H_eff = A ∇²m + K (m · ẑ) ẑ - D ∇ × m   (2)

consists of the exchange interaction, the perpendicular easy-axis anisotropy along ẑ, and a bulk-type Dzyaloshinskii-Moriya interaction (DMI) [24-26], parameterized by A, K, and D, respectively. The current-induced spin-transfer torque [27-29] in Eq. (1),

τ_d = (μ_B P)/(e M_s) (j · ∇) m,   (3)

is proportional to the electric current density j, where μ_B is the Bohr magneton, P the (conductivity) spin polarization, and -e the electron charge. The dipolar interaction is not important for the energetics of sub-micrometer scale textures and may be disregarded for the DMI-stabilized ones considered here.

The electric current density j is proportional to the local electric field E:

j(r) = Σ̂[m(r)] · E(r),   (4)
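For intuition about Eq. (1), a minimal macrospin integrator is sketched below (a single spin, the effective field reduced to a constant easy-axis field, and the spin-transfer torque omitted; γ, α, the time step, and the initial direction are dimensionless illustrative values, not the material parameters of the simulations):

```python
import numpy as np

def llg_rhs(m, h_eff, gamma=1.0, alpha=0.5):
    """Right-hand side of Eq. (1) in the equivalent explicit
    (Landau-Lifshitz) form, without the spin-transfer torque:
    dm/dt = -gamma/(1+alpha^2) [m x H + alpha m x (m x H)]."""
    mxh = np.cross(m, h_eff)
    return -gamma / (1 + alpha**2) * (mxh + alpha * np.cross(m, mxh))

def relax(m0, h_eff, dt=0.01, steps=2000):
    """Crude explicit Euler integration with renormalization |m| = 1."""
    m = np.array(m0, dtype=float)
    for _ in range(steps):
        m = m + dt * llg_rhs(m, h_eff)
        m /= np.linalg.norm(m)
    return m

# A tilted macrospin in a +z field spirals down and relaxes toward +z;
# the large alpha puts it in the overdamped regime discussed below.
m = relax([0.6, 0.0, 0.8], h_eff=np.array([0.0, 0.0, 1.0]))
```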
Figure 1. (a) Snapshot of a maze domain structure at t = 0 ns (color code for out-of-plane magnetization and arrows for in-plane magnetization). (b) Calculated conductance |G| in the longitudinal (x̂) and transverse (ŷ) directions. Snapshots of the magnetization (c),(e) and current density |j| (d),(f) distributions at t = 10 ns and t = 50 ns, respectively. An applied V = 0.05 V drives a current along the x direction. (g) Left: schematic Hopfield network formed by four neurons with values u_i, weights w_ij, and update rule u_i' = Σ_{j≠i} w_ij u_j. Right: implementation of this network by a magnetic thin film with a texture that stores weights that can be trained by currents. The inputs are the voltages V_i at the electrodes and the outputs are the currents I_i = Σ_j G_ij V_j into the film. The conductances G_ij are equivalent to the weights in the Hopfield network.

Here Σ̂[m] is the conductivity matrix of a magnetic thin film with anisotropic magnetoresistance (AMR) [30], i.e., a local resistivity that depends on the angle θ between the current flow and the local magnetization m(r) as ρ(θ) = ρ_∥ cos²θ + ρ_⊥ sin²θ. Inverting this relation yields the Cartesian elements Σ_ij[m] = σ_⊥ δ_ij + σ_δ m_i m_j, with σ_⊥ = 1/ρ_⊥, σ_δ = 1/ρ_∥ - 1/ρ_⊥, and the AMR ratio a = 2(ρ_∥ - ρ_⊥)/(ρ_∥ + ρ_⊥) [11-13, 31]. A large enough current-induced torque in Eq. (1) rotates the magnetization, which in turn modulates the current distribution via Eq. (4). We solve the spatiotemporal Eqs. (1, 4) self-consistently under the constraint ∇ · j = 0 with the COMSOL Multiphysics [32] finite element code.

Magnetic synapse.
In magnetic films with DMI D > 4√(AK)/π [26, 33, 34], a spiral maze domain texture emerges [35], as illustrated in Fig. 1(a) for a 400 nm × 400 nm slab of µm-scale thickness. Parameters are typical for, e.g., Pt/CoFe/MgO (the numerical values are listed in the Supplemental Material (SM) [38]), but with a large AMR ratio a = 150% as found in Sr2IrO4 [36] and a DMI constant D above the threshold for spiral formation. In this system many energetically nearly degenerate textures span a huge configuration space that is accessible by small variations in temperature, field, or voltage. Here we focus on the overdamped regime with a large Gilbert damping constant α, which can be reached by rare-earth doping [37]. The low-energy realizations of magnetic textures are then stable in the absence of applied torques and forces, while the effects of weak pinning may be disregarded (see SM [38]).

Under the action of the spin-transfer torque caused by an electric voltage V applied across the film, the maze-domain texture evolves with time, simultaneously modulating the conductance via the AMR. Figure 1(c-f) shows snapshots of the texture with V = 0.05 V applied in the x direction, as in Fig. 1(a). The domains tend to align perpendicular to the current flow to minimize the spin-transfer torque, as observed in thin films of lanthanum strontium manganite [39], but the sample boundaries prevent perfect alignment. The reorientation of the texture induced by the current leads to a conductance increase in the longitudinal direction (x̂) and a conductance decrease in the transverse direction (ŷ), see Fig. 1(b). The magnetic texture acts as a memristor that saturates only when high voltages are applied for a sufficiently long time.

A self-learning Hopfield network.
The magnetic textures equipped with electrodes establish a Hopfield network [40, 41] of fully connected neurons that can recognize patterns. Figure 1(g) sketches a Hopfield network with 4 neurons, where the synapses between different neurons are realized by the underlying magnetic film with textures. The neuron inputs are the voltages at the electrodes, which take binary values {V_i = ±V}. According to Kirchhoff's law, I_i = Σ_j G_ij[m] V_j, the currents through the electrodes are governed by a (symmetric) conductance matrix G_ij[m] = G0_ij + G'_ij[m] that consists of a texture-independent contribution Ĝ0 and a texture-dependent one Ĝ' [42]. Current conservation implies

G_ii = -Σ_{j≠i} G_ij > 0, with G_ij < 0 for i ≠ j,   (5)

where a current is positive when flowing into the electrodes. Ĝ0 can be measured by saturating the magnetic film with a sufficiently strong magnetic field perpendicular to the film and satisfies the same constraints as Ĝ in Eq. (5). The difference Ĝ' = Ĝ - Ĝ0 is still bound by G'_ii = -Σ_{j≠i} G'_ij, but the non-diagonal elements G'_ij can have either sign.

We may distinguish the current contributions from the texture-independent conductance Ĝ0 and from the texture-dependent conductance Ĝ': I_i = I0_i + I'_i, where

I'_i = Σ_j G'_ij V_j.   (6)

This completes the formulation of the Hopfield network, with V_j = ±V the neuron inputs, G'_ij the tunable weights, and sign(I'_i) the neuron outputs. Analogous to the bipolar Ising spin glass model [43], we define the functional energy

E' = -Σ_i I'_i V_i = -Σ_{i,j} G'_ij V_i V_j = 2V² Σ_{V_i≠V_j} G'_ij.
(7)

Physically, -E' is the power consumption of the magnetic thin film (with conductance matrix G_ij) relative to that of the texture-free thin film (G0_ij): -E' = -E - (-E0), with -E0 = Σ_{i,j} G0_ij V_i V_j. With fixed G0_ij, we know -E0 in advance for arbitrary inputs {V_i}. The energy Eq. (7) therefore measures the additional Joule heating caused by the magnetic texture.

The film can be trained to memorize a pattern encoded by an array of binary values V ≡ {V_i} simply by applying the voltages {V_i} and letting the magnetization evolve. The texture adjusts itself to the current-induced spin-transfer torque such that the conductances (|G_ij| and |G'_ij|) between electrodes i and j increase when V_i ≠ V_j, thereby decreasing the objective function E' in Eq. (7). The conductances between electrodes with the same voltage cannot be directly trained, but they tend to decrease when other conductances grow.

Figure 2 shows examples of training a four-neuron network to memorize three patterns, corresponding to the voltages on the four nodes {+ - + -}, {+ - - +}, and {+ + - +}, respectively. The top panels show the post-training texture and current density distribution. The lower panels of Fig. 2 show the energies Eq. (7) of the trained texture when fed by all possible inputs. The energy is minimized when the trial pattern (up to a global sign change) agrees with the memorized pattern. For instance, Fig. 2(d) shows clear minima for the equivalent {+ - + -} and {- + - +} states. Due to the point symmetry (rotating the pattern about the center) and the mirror symmetry (flipping the voltage sign), there are 8 degenerate patterns for Fig. 2(c) (2 for (a), 4 for (b)), representing 14 out of the 16 possible states (which include the trivial {+ + + +} and {- - - -}) spanned by a 4-node Hopfield network.

The memorized pattern can be retrieved by standard inferring algorithms, for example by feeding the neurons with a random initial pattern of binary voltages {V_i(0)} with amplitudes small enough not to perturb the texture. The voltages are then updated either asynchronously or synchronously [41] with V_i(t+1) = sign(I'_i(t)) V. The self-consistent state with V_i = sign(I'_i) V corresponds to the minimum of the energy function Eq. (7) and the memorized pattern [38].

Natural learning.
In other types of hardware-implemented synapses, such as cross-bar grid memristors [44, 45], the weight-updating requires learning algorithms (such as the back-propagation method) [46] that have to be executed externally. In contrast, the weight-updating in magnetic textures is natural and intrinsic: the positive feedback mechanism between the training current and the texture response does not require any external interference. The reinforcement of the interconnection of two "neurons" activated by different voltages is analogous to Hebb's learning rule in neuroscience [47], which states that simultaneous activation of neurons leads to increased synaptic strength.

By dividing the continuous training process into discrete temporal slices, i.e., regarding the voltages as a train of pulses, the conductance (weight) matrix evolves as

G^{n+1}_ij = G^n_ij + Δ^n_ij[{V_i}].   (8)

Here Δ^n_ij updates, at step n, the weight of the connection between nodes i and j, depending mainly on the voltage difference V_i - V_j. The conductance matrix does not reach overall saturation, since the magnetic strips twist and move but cannot easily be created or destroyed, especially in the presence of topological defects stabilized by the DMI. The strengthening of certain connections therefore weakens others. This competition between the weight modulations is another typical feature of organic synapses that is replicated by the magnetic textures [41].

Discussion.
The functionality and efficiency of the proposed neural network rely on the magnitude of the AMR. The AMR ratio a is of the order of a few percent in Ni80Fe20 thin films [48], depends on the growth direction in single-crystalline CoxFe1-x alloys [49], and is larger in bilayered La-Sr-Mn-O single crystals [50]. In antiferromagnetic Sr2IrO4 it can reach a ~ 150% [36]. A DMI larger than used here (> 0.002 J/m²) reduces the pitch of the spirals and improves the efficiency of the spin-transfer torque. The parameters and voltages are chosen optimistically to keep computation times manageable. The general idea works for less optimal materials, but at the cost of higher power dissipation and longer training times. We show in the SM that the planar and anomalous Hall effects [38, 51] may be disregarded because the in-plane and perpendicular magnetizations vanish on average in the absence of an external field [52].

The bulk-type DMI considered above generates Bloch-type domain walls, which the current-induced torque tends to realign such that the conductance increases. In materials with a dominant field-like spin-transfer torque [29] or with interfacial-type DMI [26, 53], a negative feedback reduces the conductance under applied currents (see SM [38]). Our network scheme can cooperate with such negative feedback as well, by swapping the roles of voltage and current in the inferring process and using resistances instead of conductances as weights in the objective function.

Figure 2. Simulation of a 4-node Hopfield network after training for the states {V_i} = {+ - + -}, {+ - - +}, and {+ + - +}, respectively. (a)-(c) Magnetization distribution (top panels, arrows for in-plane magnetization and color code for out-of-plane magnetization) and current density distribution log|j| (bottom panels, magenta cones for the electric current j on a logarithmic scale). (d)-(f) The effective energy E' (Eq. (7)) is minimal for the trained voltage patterns.

Our proof-of-principle device can only store a 4-pixel pattern. Larger pictures can be stored by increasing the number of nodes, and multiple patterns can be memorized when the energy function has multiple local minima [40]. Scaling up the network increases the storage capacity in the form of more complex and multiple input patterns.
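The storage statement can be made concrete with the bookkeeping of Eqs. (5)-(7): a brute-force scan over all 2^N binary inputs returns the memorized pattern (and its sign-flipped twin) as the minima of E'. The toy texture-dependent conductance matrix below is hand-built to satisfy the row-sum constraint of Eq. (5), not obtained from a simulated texture:

```python
import itertools
import numpy as np

def hopfield_energy(gp, v):
    """E' = -sum_ij G'_ij v_i v_j, Eq. (7), for one input pattern v."""
    return -v @ gp @ v

def minima(gp, amplitude=1.0):
    """Scan all 2^N binary voltage patterns and return those with
    minimal E', i.e., the patterns the conductance matrix memorizes."""
    n = gp.shape[0]
    patterns = [amplitude * np.array(p)
                for p in itertools.product([-1, 1], repeat=n)]
    energies = [hopfield_energy(gp, v) for v in patterns]
    emin = min(energies)
    return [p for p, e in zip(patterns, energies) if np.isclose(e, emin)]

# Toy G' "trained" on {+ - + -}: off-diagonal elements are negative
# (more conductive) where V_i != V_j, positive where V_i == V_j, and
# the diagonal enforces G'_ii = -sum_{j!=i} G'_ij (current conservation).
x = np.array([1, -1, 1, -1])
gp = np.outer(x, x).astype(float)
np.fill_diagonal(gp, 0)
gp -= np.diag(gp.sum(axis=1))
stored = minima(gp)  # the pair {+ - + -} and {- + - +}
```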
Performance can also be optimized via the node positions or by employing three-dimensional textures.

Conclusion.
We propose a neural network formed by electric contacts to conducting magnets with a complex magnetization texture. Because of the plasticity of the magnetic textures, the weights of the synapses are automatically updated during the training process via a positive feedback mechanism between the current-induced spin-transfer torque and the electrical conductance. We numerically simulate training of and retrieval from a 4-node Hopfield network on a spiral magnetic maze stabilized by the DMI. The learning is natural, i.e., based on physical laws without human intervention. The concept can be generalized to other materials with "plasticity", such as reconfigurable ferroelectrics with conducting domain walls [54, 55]. Our work paves the way to hardware-based neuromorphic computing with natural learning.
Acknowledgements.
We are grateful to Shunsuke Fukami, Hangwen Guo, Zhe Yuan, and Ke Xia for fruitful discussions. This work was supported by JSPS Kakenhi (Grant Nos. 20K14369, 19H006450). J.X. was supported by the National Science Foundation of China (Grant No. 11722430) and the Shanghai Municipal Science and Technology Major Project (Grant No. 2019SHZDZX01). W.Y. acknowledges support from the State Key Laboratory of Surface Physics.

* [email protected]

[1] Jacques Mattheij, "Another Way Of Looking At Lee Sedol vs AlphaGo," https://jacquesmattheij.com/another-way-of-looking-at-lee-sedol-vs-alphago/, online; accessed 17 March 2016.
[2] E. Goi, Q. Zhang, X. Chen, H. Luan, and M. Gu, PhotoniX, 3 (2020).
[3] J. Grollier, D. Querlioz, and M. D. Stiles, Proceedings of the IEEE, 2024 (2016).
[4] W. A. Borders, A. Z. Pervaiz, S. Fukami, K. Y. Camsari, H. Ohno, and S. Datta, Nature, 390 (2019).
[5] M. W. Daniels, A. Madhavan, P. Talatchian, A. Mizrahi, and M. D. Stiles, Physical Review Applied, 034016 (2020).
[6] W. Yu, J. Lan, and J. Xiao, Physical Review Applied, 024055 (2020).
[7] J. Grollier, D. Querlioz, K. Y. Camsari, K. Everschor-Sitte, S. Fukami, and M. D. Stiles, Nature Electronics, 1 (2020).
[8] B. Kröse and P. Smagt, An Introduction to Neural Networks (1993).
[9] J. Torrejon, M. Riou, F. A. Araujo, S. Tsunegi, G. Khalsa, D. Querlioz, P. Bortolotti, V. Cros, K. Yakushiji, A. Fukushima, H. Kubota, S. Yuasa, M. D. Stiles, and J. Grollier, Nature, 428 (2017).
[10] M. Romera, P. Talatchian, S. Tsunegi, F. A. Araujo, V. Cros, P. Bortolotti, J. Trastoy, K. Yakushiji, A. Fukushima, H. Kubota, S. Yuasa, M. Ernoult, D. Vodenicarevic, T. Hirtzlin, N. Locatelli, D. Querlioz, and J. Grollier, Nature, 1 (2018).
[11] D. Prychynenko, M. Sitte, K. Litzius, B. Krüger, G. Bourianoff, M. Kläui, J. Sinova, and K. Everschor-Sitte, Physical Review Applied, 014034 (2018).
[12] G. Bourianoff, D. Pinna, M. Sitte, and K. Everschor-Sitte, AIP Advances, 055602 (2018).
[13] D. Pinna, G. Bourianoff, and K. Everschor-Sitte, Physical Review Applied, 054020 (2020).
[14] D. Ielmini and V. Milo, Journal of Computational Electronics, 1121 (2017).
[15] M. A. Zidan, J. P. Strachan, and W. D. Lu, Nature Electronics, 22 (2018).
[16] B. Rajendran and F. Alibart, IEEE Journal on Emerging and Selected Topics in Circuits and Systems, 198 (2016).
[17] S. Yu, Proceedings of the IEEE, 260 (2018).
[18] S. Lequeux, J. Sampaio, V. Cros, K. Yakushiji, A. Fukushima, R. Matsumoto, H. Kubota, S. Yuasa, and J. Grollier, Scientific Reports, 31510 (2016).
[19] Y. Huang, W. Kang, X. Zhang, Y. Zhou, and W. Zhao, Nanotechnology, 08LT02 (2017).
[20] S. Li, W. Kang, Y. Huang, X. Zhang, Y. Zhou, and W. Zhao, Nanotechnology, 31LT01 (2017).
[21] X. Chen, W. Kang, D. Zhu, X. Zhang, N. Lei, Y. Zhang, Y. Zhou, and W. Zhao, Nanoscale, 6139 (2018).
[22] S. Fukami and H. Ohno, Journal of Applied Physics, 151904 (2018).
[23] Rodrigo Ceron, "AI today: Data, training and inferencing," (2019).
[24] I. Dzyaloshinsky, Journal of Physics and Chemistry of Solids, 241 (1958).
[25] T. Moriya, Physical Review, 91 (1960).
[26] S. Rohart and A. Thiaville, Physical Review B, 184422 (2013).
[27] M. D. Stiles and A. Zangwill, Physical Review B, 014407 (2002).
[28] Z. Li and S. Zhang, Physical Review Letters, 207203 (2004).
[29] S.-M. Seo, K.-J. Lee, H. Yang, and T. Ono, Physical Review Letters, 147202 (2009).
[30] W. Thomson, Proceedings of the Royal Society of London, 546 (1857).
[31] B. Krüger, Current-Driven Magnetization Dynamics: Analytical Modeling and Numerical Simulation, PhD dissertation, Universität Hamburg (2012).
[32] "COMSOL Multiphysics®".
[33] , 501 (2016).
[34] J. Lan, W. Yu, and J. Xiao, Nature Communications, 178 (2017).
[35] M. Viret, Y. Samson, P. Warin, A. Marty, F. Ott, E. Søndergård, O. Klein, and C. Fermon, Physical Review Letters, 3962 (2000).
[36] H. Wang, C. Lu, J. Chen, Y. Liu, S. L. Yuan, S.-W. Cheong, S. Dong, and J.-M. Liu, Nature Communications, 2280 (2019).
[37] G. Woltersdorf, M. Kiessling, G. Meyer, J.-U. Thiele, and C. H. Back, Physical Review Letters, 257602 (2009).
[38] See Supplemental Material at http://link.aps.org/ for details of the numerical method, material parameters, a discussion of pinning sites, a demonstration of the inferring process, and more supporting data.
[39] C. Liu, S. Wu, J. Zhang, J. Chen, J. Ding, J. Ma, Y. Zhang, Y. Sun, S. Tu, H. Wang, P. Liu, C. Li, Y. Jiang, P. Gao, D. Yu, J. Xiao, R. Duine, M. Wu, C.-W. Nan, J. Zhang, and H. Yu, Nature Nanotechnology, 691 (2019).
[40] J. J. Hopfield, Proceedings of the National Academy of Sciences, 2554 (1982).
[41] R. Rojas, Neural Networks: A Systematic Introduction (Springer Science & Business Media, 2013).
[42] Note that Ĝ is the effective conductance matrix between nodes, while Σ̂ is the local conductivity matrix.
[43] D. J. Amit, H. Gutfreund, and H. Sompolinsky, Physical Review A, 1007 (1985).
[44] K.-H. Kim, S. Gaba, D. Wheeler, J. M. Cruz-Albrecht, T. Hussain, N. Srinivasa, and W. Lu, Nano Letters, 389 (2012).
[45] M. Prezioso, F. Merrikh-Bayat, B. D. Hoskins, G. C. Adam, K. K. Likharev, and D. B. Strukov, Nature, 61 (2015).
[46] S. G. Hu, Y. Liu, Z. Liu, T. P. Chen, J. J. Wang, Q. Yu, L. J. Deng, Y. Yin, and S. Hosaka, Nature Communications, 7522 (2015).
[47] A. Hartstein and R. H. Koch, in Advances in Neural Information Processing Systems 1, edited by D. S. Touretzky (Morgan-Kaufmann, 1989), pp. 769-776.
[48] T. G. S. M. Rijks, S. K. J. Lenczowski, R. Coehoorn, and W. J. M. de Jonge, Physical Review B, 362 (1997).
[49] F. Zeng, Z. Ren, Y. Li, J. Zeng, M. Jia, J. Miao, A. Hoffmann, W. Zhang, Y. Wu, and Z. Yuan, Physical Review Letters, 097201 (2020).
[50] W. Ning, Z. Qu, Y.-M. Zou, L.-S. Ling, L. Zhang, C.-Y. Xi, H.-F. Du, R.-W. Li, and Y.-H. Zhang, Applied Physics Letters, 212503 (2011).
[51] N. Nagaosa, J. Sinova, S. Onoda, A. H. MacDonald, and N. P. Ong, Reviews of Modern Physics, 1539 (2010).
[52] T. Taniguchi, J. Grollier, and M. Stiles, Physical Review Applied, 044001 (2015).
[53] A. Fert, V. Cros, and J. Sampaio, "Skyrmions on the track," (2013).
[54] D. Meier, J. Seidel, A. Cano, K. Delaney, Y. Kumagai, M. Mostovoy, N. A. Spaldin, R. Ramesh, and M. Fiebig, Nature Materials, 284 (2012).
[55] J. P. V. McConville, H. Lu, B. Wang, Y. Tan, C. Cochard, M. Conroy, K. Moore, A. Harvey, U. Bangert, L.-Q. Chen, A. Gruverman, and J. M. Gregg, Advanced Functional Materials, 30 (2020).