Chapter 1
Fully analog memristive circuits for optimization tasks: a comparison

F. C. Sheldon†, F. Caravelli† and C. Coffrin∗
† T-Division (T4), Center for Nonlinear Studies and ∗ A-Division (A1)
Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA

September 3, 2020
We introduce a Lyapunov function for the dynamics of memristive circuits, and compare the effectiveness of memristors in minimizing the function to widely used optimization software. We study in particular three classes of problems which can be directly embedded in a circuit topology, and show that memristors effectively attempt to (quickly) extremize these functionals.
1. Introduction
As the challenges of scaling traditional transistor-based computational hardware continue to intensify, "Moore's Law", governing the exponential increase of transistor density, is coming to an end. While the first computers were analog, in the past decades digital computing has made incredible progress and our laptops are now more powerful than the supercomputers of just 30 years ago. On the other hand, there remain hard computational problems that still challenge computer scientists and modern digital computers; in particular, many optimization problems. Recently, interest has grown in embedding algorithms directly in analog hardware in the hope that the corresponding hardware speedup could yield a useful specialized processor. In this chapter we focus on the application of analog nanoscale electronic devices with memory, more specifically memristors. Proposals for specialized co-processors formed of memristors show extreme breadth and versatility in computing applications, ranging from optimization to artificial neural networks. Here we focus on understanding how the native dynamics of memristive circuits encode features of optimization problems.

Memristors are two-terminal devices that display pinched (at the origin) hysteretic behavior in their voltage-current diagram. Physical memristors have rather non-trivial voltage-current curves, but many core features are captured by a simple description which we adopt in this paper. In this model, the state of the resistance varies between two limiting values and can be described by a parameter w which depends on the previous history of the device dynamics and thus may be interpreted as a memory. We will refer to w as the internal memory parameter.
In spirit, memristors have the essential property that the underlying dynamics are the result of competition between resistance reinforcement, caused by the flow of currents through the device, and a thermodynamically driven decay. Recent advancements show that there is a deep connection between the asymptotic memory states of the circuits and the solutions of combinatorial optimization problems, and the ground states of the Ising model and spin glasses. Additionally, memristors offer a possible substrate to construct neuromorphic chips, i.e. electronic components that behave similarly to human neuronal cells. Central to all of these applications is that memristors, as we show in this paper, can perform computation without requiring CMOS, thus in a fully analog fashion. As a result, circuits of memristors have been proposed as a potential basis for the next generation of passive and low-energy computational architectures.

Interest in specialized analog co-processors for solving optimization problems has generated a host of possible approaches. While some of these problems can in principle be tackled using quantum computers, it is unlikely that these will be available for mass distribution. One of the proposed alternative paradigms is in-memory computation: removing the separation between memory and computing typical of the von Neumann architecture. In this approach, specialized circuits are designed to utilize active components in concert with memristors to obtain the solution of a specific problem. In this work, we consider a more fundamental question: do the dynamics of circuits of memristors encode optimization problems natively? Understanding their asymptotic behavior requires characterizing the interplay between nonlinear dynamics, interactions and constraints, and as a result the dynamics of memristor networks is still an area of active research, despite the fact that the theory behind a single device was introduced over half a century ago.
With this purpose in mind, in this paper we study a specific optimization problem in the context of fully analog memristive circuits, i.e. circuits composed only of memristors. For these circuits we can take advantage of an exact evolution equation for the internal memory parameters, which will serve as our case study. For these equations, we derive a novel Lyapunov function (which solves some of the problems of a Lyapunov function previously provided in the literature). Since the Lyapunov function is minimized by the memristive network, we compare the results of the minimization to state-of-the-art optimization software.
2. Dynamical equation for memristor circuits

2.1. Single memristor and Lyapunov function
For the case of titanium dioxide devices, a rather simple toy model for the evolution of the resistance is the following:

R(w) = R_on (1 - w) + w R_off ≡ R_on (1 + ξw),
dw(t)/dt = α w(t) - (R_on/β) i(t),     (1)

initially studied for α = 0, and where 0 ≤ w ≤ 1 and ξ = (R_off - R_on)/R_on; in the equation above, i(t) is the current flowing in the device at time t. Physically, w can be interpreted as the level of internal doping of the device, but this is a crude description. The constants α, β and ξ control the decay and reinforcement time scales and the degree of nonlinearity in the equation, respectively, and can be measured experimentally. While ξ is adimensional and depends only on the resistance boundaries, α has the dimension of an inverse time, while β has the dimension of a voltage multiplied by a time. Aside from applications to memory devices, there is interest in these components also because memristors can serve as memory for neuromorphic computing devices.

We first demonstrate that this equation possesses a Lyapunov function that governs its asymptotic behavior. In order to understand the Lyapunov function of the full network, we begin with the case of a single memristor driven by a voltage generator V(t). From the equations above, using i(t) = V(t)/R(w), we have

dw(t)/dt = α w(t) - V(t)/(β (1 + ξ w(t))),     (2)

from which we obtain

(1 + ξ w(t)) dw(t)/dt = α (1 + ξ w(t)) w(t) - V(t)/β = α ( w(t) + ξ w(t)^2 - V(t)/(αβ) ).     (3)

Let us now define

L(w) = a w(t)^2 + b w(t)^3 + c w(t) V(t).     (4)
We have

dL(w)/dt = ( 2a w(t) + 3b w(t)^2 + c V(t) ) dw/dt + c w(t) dV/dt.     (5)

Now assume that V(t) = V is constant. If we choose

a = -1/2,   b = -ξ/3,   c = 1/(αβ),     (6)

then, using eqn. (3),

dL(w)/dt = ( -w(t) - ξ w(t)^2 + V/(αβ) ) dw/dt = -(1/α) (1 + ξ w(t)) (dw/dt)^2.     (7)

Thus, if α > 0,

dL/dt ≤ 0, with dL/dt < 0 whenever dw/dt ≠ 0,     (8)

with

L(w) = (V/(αβ)) w(t) - w(t)^2/2 - (ξ/3) w(t)^3.     (9)

Now, for α = 0 the equation can be integrated directly and the solution is of square-root form in t, so that dw/dt = 0 can be satisfied only at w = 1 or w = 0. For α ≠ 0 there is no explicit analytical solution, although an implicit solution involving logarithmic and inverse-tangent terms can be written down, whose analysis goes beyond the scope of this paper.

However, a way to see that the system must eventually reach one of the boundary points w ∈ {0, 1} is the fact that there is a fixed point for the dynamics, defined by the equation

w* (1 + ξ w*) = V/(αβ).     (11)

The analysis of the stability of this fixed point reveals that it is unstable. From this fact we can intuitively understand that if w(0) > w* we necessarily have w(∞) = 1, while if w(0) < w* we obtain w(∞) = 0. A similar analysis applies to the case of a network of connected memristors, as we will see shortly.

Given the fact that w(∞) ∈ {0, 1}, we have w^n(∞) = w(∞) for any integer n, and we can simplify the asymptotic form of the Lyapunov function to L(w_∞) = (V/(αβ)) w_∞ - w_∞^2/2 - (ξ/3) w_∞^3 = ( V/(αβ) - 1/2 - ξ/3 ) w_∞. This function takes the asymptotic values

{ V/(αβ) - 1/2 - ξ/3, 0 } = { w*(1 + ξ w*) - 1/2 - ξ/3, 0 }.
The dynamics of a memristor are thus connected to an optimization problem of the form

L* = min { V/(αβ) - 1/2 - ξ/3, 0 },     (13)

however, we have no guarantee that the dynamics will "pick" the correct minimum of the Lyapunov function; from our analysis above, we see that this depends on the initial conditions. It is easy to perform simulations of the system above. For instance, we find that for ξ = 10 and V = 0.92 (with fixed values of α and β), the system ends in the true minimum of the asymptotic function 70% of the time, yet the system can still have a macroscopic portion of asymptotic states not in the minimum of the Lyapunov "energy". This fact shows that while the Lyapunov function is being minimized along the dynamics of the memristors, the system can effectively be trapped in local minima. This is why in this paper we focus on the minimization of a continuous Lyapunov function, for which we can compare the observed asymptotic states from the memristor dynamics to minima obtained via state-of-the-art optimization software.
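The single-memristor statements above are easy to check numerically. The sketch below Euler-integrates eqn. (2) and verifies that L of eqn. (9) never increases and that w terminates on a boundary. It assumes, beyond the ξ = 10 and V = 0.92 quoted in the text, illustrative values α = 0.1 and β = 1 that are not taken from any experiment.

```python
import numpy as np

# Single-memristor model of eqn. (2); parameter values are illustrative
# (xi = 10 and V = 0.92 follow the text; alpha, beta are assumptions).
alpha, beta, xi, V = 0.1, 1.0, 10.0, 0.92

def lyapunov(w):
    # Eqn. (9): L(w) = V/(alpha*beta) w - w^2/2 - (xi/3) w^3
    return V / (alpha * beta) * w - w**2 / 2 - xi * w**3 / 3

def simulate(w0, dt=1e-3, steps=20_000):
    # Forward-Euler integration of eqn. (2) with hard walls at w = 0, 1
    w, traj = w0, [w0]
    for _ in range(steps):
        dw = alpha * w - V / (beta * (1 + xi * w))
        w = min(1.0, max(0.0, w + dt * dw))
        traj.append(w)
    return np.array(traj)

# w(0) = 0.3 lies below the unstable fixed point w* ~ 0.91 of eqn. (11),
# so the trajectory must terminate at the w = 0 boundary.
traj = simulate(w0=0.3)
L = lyapunov(traj)
print(traj[-1])                      # 0.0: the boundary is reached
print(np.all(np.diff(L) <= 1e-12))   # True: L never increases
```

Repeating the run with w(0) above w* drives the device to the opposite boundary w = 1, which is the basin-of-attraction picture described in the text.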
2.2. Circuits
We now wish to extend the analysis we did for a single memristor to a circuit. We consider a graph in which each edge contains a memristor and a voltage generator in series. The state of the internal memory parameters is thus a vector w, in which each entry corresponds to an edge, and each edge is driven by a voltage generator in the vector s(t). Memristors in the graph will now interact due to shared currents at the nodes/electrical junctions of the graph.
The extension of eqn. (1) to a circuit can be carried out, and is given by

dw(t)/dt = α w(t) - (1/β) (I + ξ Ω W(t))^{-1} Ω s(t),     (14)

with the constraints 0 ≤ w_i ≤ 1, where W(t) = diag(w(t)) is a diagonal matrix containing the internal memory parameters. The projection operator Ω_ij contains the information about the topology of the graph and can be thought of as picking out configurations consistent with Kirchhoff's voltage law. As we will discuss shortly, components of Ω_ij may also be considered as the interaction strength between memristors in the graph. We note that because Ω is a projection operator, Ω^2 = Ω, we can always write s = Ω s + (I - Ω) s, and it is straightforward to show that we can add to s any vector s̃ = (I - Ω) k, which will not affect the dynamics. This freedom arises from the Kirchhoff constraints from which the differential equation has been derived.

The set of coupled differential equations above incorporates all dynamical and topological constraints of the circuit exactly. Kirchhoff's laws manifest themselves via the projection operator Ω, which intervenes in the dynamics. Such a projection operator also emerges for purely resistive circuits with edges of the graph containing voltage generators S_i in series with resistors r_i. For the case of constant resistance r_i = r, the equilibrium currents can be written in vectorial form as

i(t) = -(1/r) Ω S(t),     (15)

where Ω = A^t (A A^t)^{-1} A is a non-orthogonal projector on the cycle space of the graph. The matrix A has dimension Cycles × Edges of the graph (each row designates a fundamental cycle of the graph), and thus Ω has the correct dimension (i.e., the number of memristors).
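The construction of Ω from a circuit graph is mechanical. A minimal sketch (the graph, a square with one chord, is an illustrative choice, not one of the instances studied later): build the fundamental cycle matrix A, form Ω = A^t (A A^t)^{-1} A, and check the projector property.

```python
import numpy as np

# Fundamental cycle matrix A (Cycles x Edges) for a small illustrative
# circuit: square 1-2-3-4 with chord (1,3); edge list
# e0=(1,2), e1=(2,3), e2=(3,4), e3=(4,1), e4=(1,3).
A = np.array([[1, 1, 0, 0, -1],   # cycle 1-2-3 closed by the chord
              [0, 0, 1, 1,  1]],  # cycle 1-3-4 closed by the chord
             dtype=float)

# Projector on the cycle space, as in eqn. (15)
Omega = A.T @ np.linalg.inv(A @ A.T) @ A

print(np.allclose(Omega @ Omega, Omega))  # True: Omega^2 = Omega
print(np.linalg.matrix_rank(Omega))       # 2 = E - N + 1 independent cycles
```

The rank of Ω equals the number of independent cycles E - N + 1, while its dimension is Edges × Edges, matching the number of memristors as stated above.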
We can also generalize equation (14) to various forms of driving, including current generators in parallel with memristors or current/voltage generators driving the nodes of the circuit. We cast these in a general form using a generic source vector x as

dw(t)/dt = α w(t) - (1/β) (I + ξ Ω_A W(t))^{-1} x,     (16)

where we have

x = Ω_A s                      voltage sources in series,
x = A (A^T A)^{-1} s_ext       voltage sources at nodes,
x = Ω_B j                      current sources in parallel,
x = B^T (B B^T)^{-1} j_ext     current sources at nodes.

The purpose of this chapter is to further the understanding of the asymptotic dynamics of a circuit of memristors, and specifically the statistics of the resistive states. An analysis of the asymptotic states can be done via Lyapunov functions, as we did for the case of a single memristor. A first attempt at deriving a Lyapunov function was plagued by constraints on the external fields; here we provide a novel yet similar Lyapunov function free of these requirements. From the point of view of optimization with analog dynamical systems, different Lyapunov functions provide different ways of embedding a computational problem in a physical system. We follow the same prescription as in the single memristor case, but where the interaction matrix is a projection operator on the cycle basis of the circuit.

2.3. Lyapunov function for memristor circuits
We begin with the equations of motion, multiplied through by (I + ξΩW):

(I + ξΩW) dw/dt = α w + αξ ΩW w - (1/β) x.     (17)

Consider

L = -(α/3) w^T W w - (αξ/4) w^T W Ω W w + (1/(2β)) w^T W x.     (18)

In this case, we have

dL/dt = ẇ^T ( -αW w - αξ WΩW w + (1/β) W x )
      = -ẇ^T (W + ξ WΩW) ẇ
      = -ẇ^T √W (I + ξ √W Ω √W) √W ẇ
      = -||√W ẇ||^2_{I + ξ√W Ω √W},     (19)

and thus dL/dt ≤ 0, since I + ξ√W Ω √W is positive definite. We therefore have that L in equation (18) is a Lyapunov function for a circuit of memristors.

An asymptotic form can be obtained by replacing w_i^k = w_i for integer k, as asymptotically one has w_i ∈ {0, 1}. Thus, the asymptotic form of the Lyapunov function is given by

L(w) = -(αξ/4) w^T Ω w + w^T ( (1/(2β)) x - (α/3) 1 ),

where 1 is the vector of ones; this form is familiar from physics in the context of spin systems. We can re-express it in terms of spin variables σ_i = 2w_i - 1:

L̃(σ) ≡ (8/α) L = σ · ( 2x/(αβ) - (4/3) 1 - ξ Ω 1 ) - (ξ/2) σ^T Ω̃ σ + const.,     (20)

where Ω̃ contains only the off-diagonal terms of Ω. The structure of the Lyapunov function above is very similar to the one described before, but it only contains the spectral condition I + ξ√W Ω √W ≥ 0, which is natural. We can thus identify an effective local field h = 2x/(αβ) - (4/3) 1 - ξ Ω 1. As individual memristors reach their boundaries and their dynamics is halted, the corresponding components of the derivative in equation (19) go to 0. As a test of the fact that the Lyapunov function above works when including boundary effects, in Fig. 1 we plot dL/dt evaluated numerically for 100 instances (Ω, h), in which Ω was obtained from random circuits and h is a Gaussian-distributed vector.

We now wish to show that the Lyapunov function converges asymptotically only on the boundary of the set [0, 1]^N, which is what one observes numerically.

2.4. Number of fixed points and stability
As for the case of the one-dimensional model, the fixed points of the dynamics are important in order to understand the stability of the system. In the previous section we assumed that our Lyapunov function can be replaced with an asymptotic form defined on the binary set w_i ∈ {0, 1}. We wish to justify this feature in this section.

The fixed points are determined via

w* = (1/(αβ)) (I + ξ Ω W*)^{-1} Ω s.     (21)

Let us assume that w = w* + δw, where w* is a fixed point. Then we have

d(δw)/dt = ∂_w f(w*) δw.     (22)

For memristors one has

f_i(w) = α w_i - (1/β) Σ_k (I + ξΩW)^{-1}_{ik} (Ω s)_k,     (23)

from which, using ∂_x A^{-1} = -A^{-1} (∂_x A) A^{-1},

∂_{w_j} f_i = α δ_ij + (ξ/β) Σ_{krts} (I + ξΩW)^{-1}_{ik} Ω_{kr} (∂_{w_j} W)_{rt} (I + ξΩW)^{-1}_{ts} (Ω s)_s.     (24)

Fig. 1.: Derivative of the Lyapunov function of eqn. (29) for 100 random initial conditions and instances (Ω, h). We see that the derivative is always negative, and thus L is decreasing.

Evaluating this at the fixed point, where

αβ w* = (I + ξΩW*)^{-1} Ω s,     (25)

we obtain

J_ij = ∂_{w_j} f_i = α ( δ_ij + ξ Σ_k (I + ξΩW)^{-1}_{ik} Ω_{kj} W_j ) = α ( δ_ij + ξ (I + ξΩWΩ)^{-1}_{ij} W_j ),     (26)

where the last equality can be derived from the Neumann representation of the inverse and the projection condition Ω^2 = Ω. We now aim to prove that J ≻ 0; for α > 0 and ξ > 0, this will follow from the positivity of (I + ξΩWΩ)^{-1}_{ij} W_j.

Now, for any matrix A and invertible P we have A ∼ P A P^{-1}, from which we obtain A D ∼ √D A √D for D ≻ 0. Thus (I + ξΩWΩ)^{-1}_{ij} W_j ∼ √W_i (I + ξΩWΩ)^{-1}_{ij} √W_j. This matrix is clearly positive, as it is symmetric and (I + ξΩWΩ)^{-1} is positive because ΩWΩ is positive. This implies that J ≻ 0, i.e. every interior fixed point is unstable.

Let us denote by C(Σ) the cardinality of the set Σ of fixed points, to be compared with that of [0, 1]^N. We can write the fixed point equation without loss of generality as

w + ξ Ω w₂ = s/(αβ) ≡ b,     (27)

where (w₂)_i = w_i^2. The equation above can be written as a set of N constraints of the form

w_i + ξ Ω_ii w_i^2 - b_i + ξ Σ_{j≠i} Ω_ij w_j^2 = 0,     (28)

which defines a set of N intersecting quadrics. The intersection of these quadrics defines an algebraic variety of degree 2. According to Bézout's theorem, for a system of well-behaved polynomial equations (N equations in N variables) of degree d we have at most d^N solutions, which is exactly 2^N in our case. However, 2^N discrete points are a set of measure zero in [0, 1]^N. Naturally, this implies that if one initializes the memristors at a random initial condition w_i(0) ∈ [0, 1]^N, the dynamics will almost surely avoid the (unstable) fixed points and converge to the boundary, i.e. to {0, 1}^N.
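Both claims of this section, that L decreases along the flow and that the dynamics terminates on the boundary {0, 1}^N, can be checked in a few lines. The sketch below assumes a toy cycle-space projector (a square circuit with one chord) and illustrative parameters; with such weak driving all memristors happen to end at w_i = 1, while stronger drives select nontrivial binary patterns.

```python
import numpy as np

# Cycle matrix of a toy circuit (square with one chord): 2 fundamental
# cycles x 5 edges. Graph and all parameter values are illustrative only.
A = np.array([[1, 1, 0, 0, -1],
              [0, 0, 1, 1,  1]], dtype=float)
Omega = A.T @ np.linalg.inv(A @ A.T) @ A          # projector on the cycle space

alpha, beta, xi = 1.0, 1.0, 1.0
s = 0.05 * np.array([1.0, -1.0, 1.0, -1.0, 1.0])  # weak driving voltages
x = Omega @ s

def lyapunov(w):
    # L of eqn. (18): -(a/3) w^T W w - (a xi/4) w^T W Omega W w + w^T W x/(2b)
    W = np.diag(w)
    return (-alpha / 3 * w @ W @ w
            - alpha * xi / 4 * w @ W @ Omega @ W @ w
            + w @ W @ x / (2 * beta))

w = 0.5 * np.ones(5)
L0 = lyapunov(w)
for _ in range(20_000):  # Euler integration of eqn. (14), walls at 0 and 1
    rhs = alpha * w - np.linalg.solve(np.eye(5) + xi * Omega @ np.diag(w), x) / beta
    w = np.clip(w + 1e-3 * rhs, 0.0, 1.0)

print(w)                  # every component ends on the boundary {0, 1}
print(lyapunov(w) < L0)   # True: L decreased along the trajectory
```

This is a miniature of the numerical test reported in Fig. 1, with the interior initial condition w_i(0) = 1/2 flowing away from the (unstable) fixed points onto a corner of the hypercube.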
3. Analysis and comparisons
In this section we provide evidence of the capability of memristors to significantly lower the energy as measured by the Lyapunov function. While we have demonstrated a particular form of optimization problem that is 'native' to circuits of memristors, it is common across analog systems that embedding an arbitrary problem into this form is difficult. For this reason we focus on problem instances that are directly embeddable in memristor circuits, i.e. that arise from different circuit structures.

3.1. The instances
To generate instances native to memristor circuits, we formalize the optimization algorithm as a map from a circuit graph G to a projection operator Ω(G). This becomes the coupling matrix of our objective function. The underlying graphs G we chose are an Erdos-Renyi random graph (ER), a 2-dimensional lattice (Lattice2d) and a 3-dimensional lattice (Lattice3d). Given these graphs, we then obtain the projection operator Ω(G) = A^t (A A^t)^{-1} A (which is a dense matrix), based on the cycle space of the graph. A graphical representation of the underlying circuits is shown in Fig. 2.

Fig. 2.: The three circuit instances we consider: an Erdos-Renyi underlying circuit (left), a 2-dimensional lattice (center) and a 3-dimensional lattice (right). Given these, we build the cycle matrix A of the circuit and calculate the projection operator Ω = A^t (A A^t)^{-1} A, which is a dense matrix and enters the Lyapunov function of eqn. (29).

3.2. Minimization of the continuous Lyapunov function
We compare the result of the minimization of the function

L = -(α/3) w^T W w - (αξ/4) w^T W Ω W w + (1/(2β)) w^T W x     (29)

using memristive circuits to other optimization algorithms. Specifically, we compare the memristive algorithm, in which the dynamical equation (14) is evolved numerically until it reaches a steady state, to an interior-point nonlinear optimization algorithm. As a solver we use Ipopt, an open-source (second-order) software package for large-scale nonlinear optimization.
The software is state of the art for nonlinear problems of the form

min_{w ∈ R^d} f(w)     (30)
s.t. g_L ≤ g(w) ≤ g_U,     (31)
     w_L ≤ w ≤ w_U,     (32)

where f(w) is the function of interest (in our case equation (29)), w_L and w_U are 0 and 1 respectively in this work, and where we introduce no function constraints g(w) in the optimization. The results of the two algorithms for 15 specific instances are shown in Table 1, for the case of the ER circuits and lattices of 2 and 3 dimensions. The number of variables we consider is fairly large, starting from N = 112. For each instance we run the solvers from many random initializations and measure the distributions of runtime and solution quality. In the interest of breadth, a first-order optimization algorithm based on gradient descent and a random assignment algorithm (i.e., we generate random values in [0, 1]^N) are also included in the comparison. The results are shown in Figs. 3, 4 and 5, which compare optimization via memristor networks (mem, light blue), random assignment (rand, brown), Ipopt (nlp, purple) and gradient descent (grad, red).

First and foremost, we note that overall Ipopt yields the best solution quality among the optimization algorithms we considered, for each specific instance. In Figs. 3, 4 and 5 we plot examples of the distribution of energy states for the ER, Lattice2d and Lattice3d cases. We see that gradient descent and Ipopt are typically close to each other for these cases, and in particular in the ER case the memristive optimization is also close to the best known solutions. For comparison, we plot in all these cases the results of a naive random optimization, from which it can be observed that the memristive circuit results are always well below the random assignment. For each class of problems we generated 5 instances. The minimum energy and average time per execution for each instance and class are shown in Table 1. We see that memristors have a runtime advantage over Ipopt (a factor of 100), as they run much faster, and one can initialize the system many more times in an equal amount of time. Also, we observe that the density of the Ω matrix places a significant computational burden on computing derivatives in second-order methods such as Ipopt, a problem feature that the memristor-based approach avoids.

The optimal solutions found by Ipopt for the Lyapunov function confirm (within the solver's numerical tolerance) that the solutions are to be found near the boundary of [0, 1]^N. These results confirm that the asymptotic states of a memristive circuit are to be found in local minima of a Lyapunov function, and in the discrete set {0, 1}^N.

Fig. 3.: Distribution of the minima obtained with random sampling (rand), memristors (mem), gradient descent (grad) and Ipopt (nlp) for the Erdos-Renyi class (Instance 1). We see that the distribution of minima for memristors is rather close to the NLP and Grad results.
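The flavor of this comparison can be reproduced in miniature without an Ipopt installation. In the sketch below (an illustrative toy circuit and parameters, with projected gradient descent standing in for the interior-point solver), the Lyapunov function of eqn. (29) is minimized by (i) the memristor dynamics, (ii) projected gradient descent on [0, 1]^N, and (iii) random assignment.

```python
import numpy as np

# Toy stand-in for the Section 3 comparison (illustrative graph/parameters;
# Ipopt is replaced by projected gradient descent as the classical baseline).
A = np.array([[1, 1, 0, 0, -1],
              [0, 0, 1, 1,  1]], dtype=float)
Omega = A.T @ np.linalg.inv(A @ A.T) @ A
N = 5
alpha, beta, xi = 1.0, 1.0, 1.0
x = Omega @ (0.05 * np.array([1.0, -1.0, 1.0, -1.0, 1.0]))

def L(w):  # continuous Lyapunov function of eqn. (29)
    W = np.diag(w)
    return (-alpha/3 * w @ W @ w - alpha*xi/4 * w @ W @ Omega @ W @ w
            + w @ W @ x / (2*beta))

def grad_L(w):  # gradient of eqn. (29) with respect to w
    return -alpha*w**2 - alpha*xi*w*(Omega @ w**2) + w*x/beta

# (i) memristor dynamics, eqn. (14), integrated to its steady state
w = 0.5*np.ones(N)
for _ in range(20_000):
    rhs = alpha*w - np.linalg.solve(np.eye(N) + xi*Omega @ np.diag(w), x)/beta
    w = np.clip(w + 1e-3*rhs, 0.0, 1.0)
e_mem = L(w)

# (ii) projected gradient descent on the box [0, 1]^N
v = 0.5*np.ones(N)
for _ in range(5_000):
    v = np.clip(v - 1e-2*grad_L(v), 0.0, 1.0)
e_gd = L(v)

# (iii) random assignment baseline, as in the chapter
rng = np.random.default_rng(0)
e_rand = np.mean([L(rng.uniform(0, 1, N)) for _ in range(200)])

print(e_mem, e_gd, e_rand)  # both dynamics end well below the random baseline
```

On this tiny instance both dynamics land on the same corner of the hypercube; on the large dense-Ω instances of Table 1, it is the relative cost per iteration of first-order, second-order and memristive approaches that separates the methods.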
4. Conclusions
In the present paper we have discussed the properties of memristive circuits from an optimization perspective. In particular, we have derived a new Lyapunov function for a memristive circuit of arbitrary topology, and shown that, with each memory parameter constrained to [0, 1], the asymptotic states lie on the boundary {0, 1}^N, a feature we studied for a single device analytically. These results have a variety of implications.

Fig. 4.: Distribution of the minima obtained with random sampling (rand), memristors (mem), gradient descent (grad) and Ipopt (nlp) for the Lattice2d class (Instance 1). We see that the system's absolute minimum is close to the tail of the non-linear programming optimization code (Ipopt), while on average the memristor results lie half way between the random and nlp results.
In primis, this shows that it is possible to overcome some of the problems of previously proposed Lyapunov functions in the literature. Moreover, we have tested (from the standpoint of optimization) whether analog circuits of memristors can be used for minimizing non-linear functions. We have tested three indicative classes of circuits, and found that while in none of these cases does the memristor dynamics obtain better minima than state-of-the-art software (Ipopt), it is nonetheless able to obtain good-quality minima when the system is initialized multiple times. From this point of view, memristive dynamics has the advantage of providing fast, good-quality solutions. For instance, in the case of non-planar circuits Ipopt took two orders of magnitude longer than memristive circuits to provide a plausible minimum. In this sense, we have confirmed that memristive dynamics is naturally associated with the minimization of a Lyapunov function. When run in hardware, we expect this speed advantage to increase substantially.

Fig. 5.: Distribution of the minima obtained with random sampling (rand), memristors (mem), gradient descent (grad) and Ipopt (nlp) for the Lattice3d class (Instance 1). We see that the system's absolute minimum is close to the tail of the non-linear programming optimization code (Ipopt).

Some comments about the difficulty of the instances we considered are in order. The class we consider, drawn from circuit structures, is previously unexplored, and thus the difficulty of optimization problems in this class is unknown. We can however draw a few inferences about this class from our results. We note first that the solvers we test produce a range of potential solutions, giving evidence that these instances are not simply convex and contain a range of local minima. The software Ipopt takes considerable time to find minima in the case of ER and Lattice3d, but not for Lattice2d, in which the underlying circuit is planar. We emphasize that, while the circuit is planar, the matrix Ω is dense (none of the elements are zero). This said, it has been proven that for the case of planar circuits the matrix Ω has exponentially small support on the underlying graph. From this point of view, our results suggest that the class Lattice2d is not as hard as the other two we consider, due to such planarity hidden in the matrix Ω; this can also be seen from the fact that Ipopt takes significantly less time in finding good-quality minima for this class.

While we proposed a Lyapunov function for a continuous set of variables, it is still an open question whether there exists an efficient embedding of a QUBO functional in a memristive circuit such that the QUBO functional is minimized along the dynamics as well. The key issue is that while memristors reach the boundaries of the space M = [0, 1]^N, it is unknown if an efficient embedding exists. This is left for future investigations.

Table 1: Result of the optimization of each instance (1-5) of the three classes considered in this article: Erdos-Renyi (ER, random), Lattice2d and Lattice3d. The number of nodes of the circuit is given in the nodes column (n), while the number of edges of the graph (the variables) is given in the edge column (e). The results of the optimization using Ipopt, gradient descent, memristors and random assignment are in the (nlp, grad, mem, rand)-en columns respectively, while the average time (in seconds) for the solution to be obtained is in the (nlp, grad, mem, rand)-tm columns.
5. Acknowledgments
This work was carried out under the auspices of the NNSA of the U.S. DoE at LANL under Contract No. DE-AC52-06NA25396, in particular via DOE-ER grant PRD20190195. FCS is also supported by a CNLS Fellowship.
References
1. F. Caravelli and J. P. Carbajal, Memristors for the curious outsiders, Technologies 6(4), 118 (2018).
2. L. Chua and S. M. Kang, Memristive devices and systems, Proceedings of the IEEE 64(2), 209-223 (1976). doi: 10.1109/PROC.1976.10092.
3. J. J. Yang, D. B. Strukov, and D. R. Stewart, Memristive devices for computing, Nature Nanotechnology 8(1), 13-24 (2013). doi: 10.1038/nnano.2012.240.
4. L. Chua, If it's pinched it's a memristor, Semiconductor Science and Technology 29(10), 104001 (2014). doi: 10.1088/0268-1242/29/10/104001.
5. D. B. Strukov, G. S. Snider, D. R. Stewart, and R. S. Williams, The missing memristor found, Nature 453(7191), 80-83 (2008). doi: 10.1038/nature06932.
6. K. G. Johnsen, An introduction to the memristor, a valuable circuit element in bioelectricity and bioimpedance, Journal of Electrical Bioimpedance 3, 20-28 (2012). doi: 10.5617/jeb.305.
7. F. L. Traversa and M. Di Ventra, Memcomputing: Leveraging memory and physics to compute efficiently, Journal of Applied Physics 123, 180901 (2018). doi: 10.1063/1.5026506.
8. F. L. Traversa and M. Di Ventra, Polynomial-time solution of prime factorization and NP-complete problems with digital memcomputing machines, Chaos 27, 023107 (2017). doi: 10.1063/1.4975761.
9. F. C. Sheldon and M. Di Ventra, Conducting-insulating transition in adiabatic memristive networks, Physical Review E 95(1), 012305 (2017). doi: 10.1103/PhysRevE.95.012305.
10. A. Adamatzky and B. D. L. Costello, Physarum attraction: Why slime mold behaves as cats do?, Communicative & Integrative Biology 5(3), 297-299 (2012). doi: 10.4161/cib.19924.
11. C. Coffrin, H. Nagarajan, and R. Bent, Evaluating Ising processing units with integer programming, in: Integration of Constraint Programming, Artificial Intelligence, and Operations Research (CPAIOR 2019), Lecture Notes in Computer Science (2019). doi: 10.1007/978-3-030-19212-9_11.
12. E. G. Rieffel and W. H. Polak, Quantum Computing: A Gentle Introduction, MIT Press, Cambridge, MA (2011).
13. M. Di Ventra and Y. V. Pershin, The parallel approach, Nature Physics 9(4), 200-202 (2013). doi: 10.1038/nphys2566.
14. F. L. Traversa et al., Evidence of an exponential speed-up in the solution of hard optimization problems, Complexity 2018, 7982851 (2018). doi: 10.1155/2018/7982851.
15. F. L. Traversa, C. Ramella, F. Bonani, and M. Di Ventra, Memcomputing NP-complete problems in polynomial time using polynomial resources and collective states, Science Advances 1(6), e1500031 (2015). doi: 10.1126/sciadv.1500031.
16. L. Chua, Memristor: The missing circuit element, IEEE Transactions on Circuit Theory 18(5), 507-519 (1971). doi: 10.1109/TCT.1971.1083337.
17. I. Gupta, A. Serb, R. Berdan, A. Khiat, and T. Prodromakis, Volatility characterization for RRAM devices, IEEE Electron Device Letters 38(1) (2017). doi: 10.1109/LED.2016.2631631.
18. F. Caravelli, F. L. Traversa, and M. Di Ventra, Complex dynamics of memristive circuits: Analytical results and universal slow relaxation, Physical Review E 95(2), 022140 (2017). doi: 10.1103/PhysRevE.95.022140.
19. F. Caravelli, Locality of interactions for planar memristive circuits, Physical Review E 96, 052206 (2017). doi: 10.1103/PhysRevE.96.052206.
20. A. Zegarac and F. Caravelli, Memristive networks: from graph theory to statistical physics, EPL (Europhysics Letters) 125, 10001 (2019).
21. F. Caravelli, Asymptotic behavior of memristive circuits, Entropy 21(8) (2019).
22. F. Caravelli, The mise en scène of memristive networks: effective memory, dynamics and learning, International Journal of Parallel, Emergent and Distributed Systems 33(4), 350-366 (2018). doi: 10.1080/17445760.2017.1320796.
23. W. Fulton, Algebraic Curves, Mathematics Lecture Note Series, W. A. Benjamin (1974).
24. A. Wächter and L. T. Biegler, On the implementation of a primal-dual interior point filter line search algorithm for large-scale nonlinear programming, Mathematical Programming 106(1), 25-57 (2006).