Paraconsistent Foundations for Quantum Probability
Ben Goertzel

January 20, 2021
Abstract
It is argued that a fuzzy version of 4-truth-valued paraconsistent logic (with truth values corresponding to True, False, Both and Neither) can be approximately isomorphically mapped into the complex-number algebra of quantum probabilities. I.e., p-bits (paraconsistent bits) can be transformed into close approximations of qubits. The approximation error can be made arbitrarily small, at least in a formal sense, and can be related to the degree of irreducible "evidential error" assumed to plague an observer's observations. This logical correspondence manifests itself in program space via an approximate mapping between probabilistic and quantum types in programming languages.
Introduction
The mathematics of quantum mechanics has been viewed and analyzed from a huge variety of different perspectives, each shedding light on different subtleties of its underlying structure and its connection to our everyday reality. Here we add an additional thread to this conceptual polyphony, demonstrating a close connection between fuzzy paraconsistent logic and quantum probabilities. This connection suggests new variations on existing interpretations of quantum reality and measurement. It also provides some tantalizing connections between the probabilistic and fuzzy logic used in modern AI systems and quantum probabilistic reasoning, which may have implications for quantum-computing implementations of logical-inference-based AI.

The ideas here arose as a spin-off from the work reported in [Goe21], which uses a variety of paraconsistent intuitionistic logic called Constructible Duality (CD) Logic as a means for giving a rigorous logical foundation to the PLN (Probabilistic Logic Networks) logic [GIGH08] that has been used in the OpenCog AI project [GPG13a, GPG13b] for well over a decade now. Notation and concepts from [Goe21] are used liberally here, so the reader is basically required to ingest the relevant parts of that paper before this one.

Constructible Duality Logic features four-valued truth values, which in [Goe21] are called p-bits, each of which may take any of the values: True, False, Both and Neither.
In [Goe21], an uncertain version of CD logic is created, involving tabulation and normalization of positive and negative evidence for a proposition in a novel way with elements of both probabilistic and fuzzy reasoning. Here one more step is taken, leveraging an extension of Knuth, Skilling and Goyal's [GKS10] work on the foundations of quantum inference to approximatively map these uncertain CD truth values (p-bits) into complex numbers with a quantum-theory-friendly complex-probability interpretation.

Appropriately enough given the paraconsistent logic theme, our conclusions regarding the relation between uncertain paraconsistent logic and quantum probability arithmetic are both discouraging and exciting. On the discouraging side, the approach reiterates the familiar reasons why there cannot ever be an exact isomorphism between fuzzy or probabilistic logic (in any of their current forms) and quantum logic or probability. The crux of the matter (to simplify just a bit) is: quantum algebra must be distributive, and fuzzy or probabilistic logic must use t-norm/conorm pairs for conjunction/disjunction – but the only distributive t-norm/conorm pair is min/max, which is not additively generative and thus can't be the basis of a morphism between paraconsistent logic operations and complex number operations.

On the positive side, though, we do show that one can map between uncertain CD logic and quantum probability theory with small error – most likely arbitrarily small error, though other strange things may possibly arise as the error is shrunk all the way toward zero – via a route that starts with assuming that a certain small percentage of the observations underlying the truth values are erroneous.
There are t-norm/conorm pairs that closely approximate min/max (and are thus approximately distributive) and are also additively generative – so using these to map fuzzy paraconsistent truth values into the complex plane, one leverages Knuth et al's work to obtain a result that fuzzy disjunction and conjunction approximately isomorphically map into complex number addition and multiplication.

Conceptually, this means one can interpret the use of complex arithmetic in quantum mechanics – which is where most of the much-discussed "quantum weirdness" comes from – as a different mathematical perspective on the use of p-bits to quantify observations. If one admits that a proposition evaluated in a particular situation may sometimes be both True and False, or neither True nor False, rather than always being clearly on the True side or the False side – and if one admits that every one of one's observations has a certain potential to be illusory or deceptive – then one concludes that the algebra of one's truth values is approximately isomorphic to that of the complex plane.

Whether the fuzzy paraconsistent logic view or the complex-probability view is the best way to look at a given situation then depends on what one needs to do. Obviously, for very many physics calculations the complex-probability view is directly what one wants. On the other hand, if one is thinking about quantum AI or quantum biology, the story is less clear. One of the tricky issues in quantum biology is the difficulty of drawing boundaries between quantum and classical portions of a system; the paraconsistent treatment of boundaries given in [Web10] and fuzzified/probabilized in [Goe21] may become relevant.

Regarding quantum computing for AI, the considerations here suggest it may be useful to look at mappings on the programming-language side that correspond to the mappings between logics given here.
Uncertain CD logic expressions map into pairs of types in dependent-type-based programming languages which include probabilistic types. It seems fairly clear how to create quantum types by analogy to these probabilistic types; and it also seems clear that, lifting the core ideas from this paper to the program world using the Curry-Howard correspondence, one obtains an approximate mapping from pairs of classical-probability-incorporating types to pairs of quantum-probability-incorporating types. How this mapping cashes out in terms of practical quantum computing device, programming language or algorithm design is a wide open question.
Uncertain Constructible Duality Logic

CD logic works with 4-valued truth values that we have in [Goe21] called p-bits – "paraconsistent bits" – each of which has 4 possible values: True (1,0), False (0,1), Both (1,1) or Neither (0,0).

Given any logical language featuring constructs for True, False, ∧, ∨, → and ¬, and a mapping from expressions in the language into a Heyting algebra H, the Heyting algebra operations of meet, join and complement form an intuitionistic logic. Patterson [PPA98] shows that, similarly, if one has a mapping h from expressions in the language into the product algebra H × H^op (where H^op denotes the opposite algebra to H), then one obtains a CD logic with rules as follows.

• Basic mapping of logic operations
  – h(α ∧ β) = h(α) ⊓ h(β)
  – h(α ∨ β) = h(α) ⊔ h(β)
  – h(α → β) = h(α) → h(β)
  – h(¬α) = ¬h(α)

• Mapping of the four units
  – h(True) = (1, 0)
  – h(False) = (0, 1)
  – h(Neither) = (0, 0)
  – h(Both) = (1, 1)

• Logical operations across coordinates
  – (x, x′) ⊓ (y, y′) = (x ⊓ y, x′ ⊔ y′)
  – (x, x′) ⊔ (y, y′) = (x ⊔ y, x′ ⊓ y′)
  – (x, x′) → (y, y′) = (x → y, x ⊓ y′)
  – ¬(x, x′) = (x′, x)

CD logic is crisp, but we show in [Goe21] that it can naturally be probabilized, via considering an ensemble of N micro-situations, in each of which a certain proposition may be evaluated to have any of the four CD truth values. One then associates with the proposition a 2D count value (n⁺, n⁻), where n⁺ denotes the number of micro-situations in which there is positive evidence for the proposition (True or Both truth values), and n⁻ denotes the number of micro-situations in which there is negative evidence for the proposition (False or Neither truth values).
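As a concrete illustration, the crisp p-bit operations and the evidence-tabulation step just described can be sketched in a few lines of Python. This is a toy rendering under our own naming, not code from [Goe21]:

```python
# Crisp CD-logic p-bits as pairs (positive, negative) over {0, 1},
# following the product-algebra rules above. Names are illustrative.

TRUE, FALSE = (1, 0), (0, 1)
BOTH, NEITHER = (1, 1), (0, 0)

def meet(a, b):   # h(alpha AND beta): meet positives, join negatives
    return (min(a[0], b[0]), max(a[1], b[1]))

def join(a, b):   # h(alpha OR beta): join positives, meet negatives
    return (max(a[0], b[0]), min(a[1], b[1]))

def neg(a):       # h(NOT alpha): swap positive and negative evidence
    return (a[1], a[0])

# Paraconsistency: a contradiction does not explode
assert meet(BOTH, neg(BOTH)) == BOTH
assert meet(TRUE, FALSE) == FALSE and join(TRUE, FALSE) == TRUE

# Tabulating evidence over an ensemble of micro-situations:
# True/Both count as positive evidence, False/Neither as negative.
def evidence_counts(ensemble):
    n_plus = sum(1 for t in ensemble if t in (TRUE, BOTH))
    n_minus = sum(1 for t in ensemble if t in (FALSE, NEITHER))
    return (n_plus, n_minus)

assert evidence_counts([TRUE] * 6 + [BOTH] * 2 + [FALSE, NEITHER]) == (8, 2)
```

Note that on crisp pairs the coordinatewise min/max operations coincide with the Heyting meet and join over {0, 1}, which is all the sketch needs.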
The 2D count can be normalized into (w⁺, w⁻) = (n⁺/N, n⁻/N), yielding a paraconsistent uncertain truth value where the first component measures the amount of positive evidence and the second measures the amount of negative evidence, and the sum of the two components may vary from 0 to 2.

Given a t-norm/conorm pair F = (⊤, ⊥), one can define the operations ⊓_F and ⊔_F via

(w₁⁺, w₁⁻) ⊓_F (w₂⁺, w₂⁻) = (⊤(w₁⁺, w₂⁺), ⊥(w₁⁻, w₂⁻))
(w₁⁺, w₁⁻) ⊔_F (w₂⁺, w₂⁻) = (⊥(w₁⁺, w₂⁺), ⊤(w₁⁻, w₂⁻))
¬_F(w⁺, w⁻) = (w⁻, w⁺)

Note that here we carry over the negation operator from CD logic rather than using ¬_F(w⁺, w⁻) = (1 − w⁺, 1 − w⁻). This means we don't have a nice algebraic relationship within each coordinate, but we get (⊓_F, ⊔_F, ¬_F) to be a De Morgan triplet on 2D pairs, as ⊓_F and ⊔_F form a t-norm/conorm pair, ¬_F is an involutive negator, and for all pairs u, v ∈ [0, 1]²:

¬_F(u ⊔_F v) = ¬_F(u) ⊓_F ¬_F(v)

From Primitive Symmetries to Quantum Probabilities
Knuth, Skilling and Goyal [GKS10] present an elegant argument starting from some basic symmetries on combinations of observation sequences and ending up with quantum probabilities as the uniquely appropriate way to manage 2D truth values describing observation sets. Conceptually and in some formal respects these arguments are a 2D extension of those in Knuth and Skilling's paper "Foundations of Inference" [KS00], which derives probability theory and information theory via basic symmetry arguments.

They consider an n-ary tree whose nodes above the leaf level represent composite objects, and where the objects represented by the children of a node are interpreted as a partition of the object represented by the node. Looking at a function p that measures the size of a node in the tree or a descending path in the tree, they explore the algebraic implications of some basic symmetry properties (commutativity, associativity, nontrivial dependency of a binary function on both variables, etc.) for the functional form of p.

First they argue that a few basic symmetry properties imply the rule

p(B ∨ C) = p(B) + p(C)

for disjoint destinations B and C from a binary source node A = B ∨ C, where e.g. p(B) denotes the uncertainty quantification associated with B. This is roughly the same as the argument from [KS00], except that here the uncertainty quantifications are assumed 2D rather than 1D.

This part of their argument rests on a set of classical theorems arguing that any combinational operator one might use for ∨, if it satisfies a few reasonable-looking properties, must be isomorphic to the standard component-wise vector arithmetic operator +, in the sense that

p(B ∨ C) = f⁻¹(f(p(B)) + f(p(C)))

The argument is that if ∨ is isomorphic to + in this sense, then we may as well consider it as actually being +, acting on a space of uncertainty values rescaled by f.
Interestingly, not all conorm operators used to implement ∨ (disjunction) operations in fuzzy logic actually obey the properties needed to make this sort of isomorphism work. But many do, and these are typically called "additively generative" ones [KMP13].

The next part of their argument moves on from addition to multiplication. Where −→UV denotes a descending path from U to V in the tree, and ◦ denotes the concatenation of paths, they show that basic symmetry properties imply

p(−→UV ◦ −→VW) = p(−→UV) ∗ p(−→VW)

p(B) = p(−→BA) p(−→AO) p(O)

where O is the root object of the tree. (Philosophically, this is an interesting application of the Univalence Principle from homotopy type theory, that "equals is equal to equivalence"; there is likely also a formal connection, but we will not explore this here.)

Key among these basic symmetry properties are left and right distributivity between ◦ and ∨ – which, notably, do not hold for any of the t-norm/t-conorm pairs used to quantify conjunction and disjunction in fuzzy logic, except for min/max (which are not additively generated). So again we see that the "basic" symmetry properties used are actually quite restrictive in some relevant contexts. This point gets at the heart of our unique contribution here, which is a way to partially dodge these issues via using a fuzzy conjunction/disjunction quantification that approximatively fulfills all the symmetries one wants, even if it fails to do so exactly.

Applying these arguments and further related ones to functions p mapping paths into ordered pairs of reals, they arrive at the conclusion that the addition and multiplication involved in the above relationships must be the standard ones used in the complex number system. From this point they proceed in a similar vein to Youssef [You94] (though with different particulars), moving forward to Hilbert space and quantum mechanics from the assumption of complex-valued uncertainty quantifications.

A final point to make in reviewing the Knuth/Skilling/Goyal work is that explicit consideration of negation plays no role in their derivations – they are concerned fundamentally with symmetries regarding conjunction and disjunction and their interrelationship. This is an interesting contrast to intuitionistic logic, where negation is the most subtle and vexed of the logical operations. It is also convenient in the context of mapping paraconsistent logic into quantum probabilities, as we'll see below.
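To make the "additively generative" notion concrete: a t-norm ⊤ is additively generated when ⊤(x, y) = f⁻¹(f(x) + f(y)) for some generator f. A minimal numerical sketch (our own illustration; the product t-norm with generator f(x) = −log x is the textbook example):

```python
import math

# The product t-norm is additively generated by f(x) = -log(x):
# T_prod(x, y) = f_inv(f(x) + f(y)), turning conjunction into addition.

def f(x):
    return -math.log(x)

def f_inv(u):
    return math.exp(-u)

x, y, z = 0.3, 0.7, 0.5
assert abs(x * y - f_inv(f(x) + f(y))) < 1e-12

# min/max, by contrast, are mutually distributive -- the property the
# Knuth/Skilling/Goyal symmetries demand -- but have no additive generator.
assert min(x, max(y, z)) == max(min(x, y), min(x, z))
```

The tension noted in the text is visible here: the generated (product) pair fails distributivity, while the distributive (min/max) pair has no generator.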
Approximately Mapping p-Bits into Quantum Probabilities

What do we mean by an approximate mapping from fuzzy paraconsistent truth values into quantum probabilities? Suppose (w₁⁺, w₁⁻) and (w₂⁺, w₂⁻) represent bodies of evidence drawn from distinct sets of micro-situations; our aim here is to find a mapping function σ defined by some t-norm/conorm pair F = (⊤, ⊥) with additive generating functions (f, f∗) so that

σ(a, b) = (f(a), f∗(b))

and

σ((w₁⁺, w₁⁻) ⊓_F (w₂⁺, w₂⁻)) ≈ σ(w₁⁺, w₁⁻) + σ(w₂⁺, w₂⁻)   (1)
σ((w₁⁺, w₁⁻) ⊔_F (w₂⁺, w₂⁻)) ≈ σ(w₁⁺, w₁⁻) ∗ σ(w₂⁺, w₂⁻)   (2)
σ(¬_F(w₁⁺, w₁⁻)) ≈ ¬σ(w₁⁺, w₁⁻)   (3)

where the operations on the lhs are defined as above and the operations on the rhs are standard complex number arithmetic, except for ¬, which is a simple reversal of coordinates (¬(a, b) = (b, a), or in complex number notation ¬z = iz∗). (Indeed Youssef's [You94] analysis of quantum probabilities in terms of the Frobenius Theorem could be used as an alternate basis for the discussion here, with some technical differences but leading to the same basic point.)

It should be clear from the discussion in the previous section why it's impossible to reduce the ≈ to a true equality. The algebra on the right hand side is distributive, and the only t-norm/t-conorm pair that's distributive is min/max. But for Equation 1 to work as a precise equality, the operator ⊓_F needs to be additively generated, and min/max aren't.

However, this argument doesn't prevent us from making the relationships hold with ≈ and a small approximation error. In [Goe] we have presented an approximative version of Dupré and Tipler's [DT06] derivation of the rules of standard 1D probability theory from basic symmetries, showing that if one has operators that approximately obey the symmetries, then these operators must generally be the same as the standard probabilistic operators within a close degree of approximation.
Though the formal details will be different, it seems clear that a similar methodology will work with regard to Goyal, Skilling and Knuth's arguments going from basic symmetries to complex-arithmetic-based 2D probability theory rules.

To see how this sort of approximative argument can be leveraged here, we turn to the Schweizer-Sklar (SS) t-norms [SS11], defined via

⊤^SS_p(x, y) =
  ⊤_min(x, y)                      if p = −∞
  (x^p + y^p − 1)^(1/p)            if −∞ < p < 0
  ⊤_prod(x, y)                     if p = 0
  (max(0, x^p + y^p − 1))^(1/p)    if 0 < p < +∞
  ⊤_D(x, y)                        if p = +∞

where

• ⊤_min(a, b) = min{a, b}
• ⊤_prod(a, b) = a · b
• ⊤_D(a, b) = b if a = 1; a if b = 1; 0 otherwise.

An additive generator for ⊤^SS_p for −∞ < p < +∞ is

f^SS_p(x) = − log x        if p = 0
            (1 − x^p)/p    otherwise.

Note that as p → −∞, ⊤^SS_p(x, y) → min(x, y). Since for negative p that are large in absolute value ⊤^SS_p(x, y) ≈ min(x, y), we also have that ⊤^SS_p(x, y) approximately obeys the laws that min(x, y) obeys exactly – such as obeying distributive laws when considered together with its t-conorm. For these p, therefore, the SS t-norm/t-conorm pair defines a mapping σ via

σ(a, b) = (f^SS_p(a), −f^SS_p(1 − b))

which has the properties:

1. Equation 1 holds precisely, since ⊤^SS_p is additively generated
2. Equation 2 holds approximately, since ⊤^SS_p is approximately distributive
3. Equation 3 holds precisely

So if we take p that approaches −∞ but doesn't get there yet, then we have a mapping function σ that maps the CD logic of p-bits with SS t-norm/conorm approximately isomorphically into the complex arithmetic operators.

There is however an important subtlety regarding negation. The CD negation operation maps into the reflection operator z → iz∗, whereas the complex addition operator obviously works with the negation z → −z. If we map the latter back into the CD domain, it maps into the coordinatewise negation operator ¬_c(w⁺, w⁻) = (1 − w⁺, 1 − w⁻) – which does not properly form a De Morgan triple with ⊤^SS_p, and doesn't agree with crisp CD negation in the case where the truth values are crisp.

The mapping from conjunction/disjunction to complex number multiplication/addition doesn't require any considerations involving negation, so as far as this approximate isomorphism is concerned, one is free to consider negation however one wishes. If one takes an intuitionistic-logic-like perspective, one can say that on both sides there are multiple relevant negation-type operators, each of which has different useful algebraic properties.

One more relevant detail to note is that Goyal, Skilling and Knuth [GKS10] initially find three different multiplication operators that seem almost acceptable as conjunctions for 2D probability values, based on basic algebraic requirements like associativity, commutativity and so forth. They then narrow down to the traditional complex-number multiplication based on some modestly subtle arguments regarding the results that the three candidates give in some example cases of multiple sequential and parallel measurements with natural symmetries (varying on standard Stern-Gerlach experiments).

So summing up, the situation is that: the SS t-norm/co-norm approximates distributivity closely, and also supports additive generativity, thus allowing an isomorphic mapping that approximately takes paraconsistent disjunction into complex number addition, and approximately takes paraconsistent conjunction into what, after considering some additional technical criteria derived from physical sensibleness, is required to be complex number multiplication.
Conceptual Interpretation: Evidential Error

We've presented the approximate mapping from the paraconsistent world into the quantum world from a formal math perspective – but what sense does it make conceptually?

There is an intriguing connection between the approximate mapping presented above and the semantics of uncertainty as regards existential and universal quantifiers leveraged in deriving PLN truth value formulas for these operators [GIGH08]. To model the uncertainty of these quantifiers in PLN, what has been done is to assume that every observation underlying every truth value estimate has a certain probability of being erroneous.

So for instance, suppose every single one of one's 100 observations of rocks concurs with the proposition that rocks are hard. There are multiple sorts of uncertainty involved here. One is that these 100 rocks might turn out not to be representative of the overall population of rocks. This kind of uncertainty is taken into account in the imprecise and indefinite probabilities used in the PLN framework, which involve confidence-weights attached to probability estimates, and formulas for assessing these weights based on evidence counts n. There is also another kind of uncertainty, though, which we may call "evidential error (EE) uncertainty" – there is the possibility that some of the observations of rocks being hard were actually wrong ... that sometimes one perceived a soft rock to actually be hard, due to some sort of error in one's perceptual systems, or a hack into the simulation running our physical reality, or whatever. The probability of this kind of error pushes one to consider one's 100 observations of hard rocks as perhaps actually reflecting an observation of 100 rocks of which n − k are hard and k are not hard, where generally k/n is small but not zero.

In PLN theory the consideration of EE uncertainty pushes one to model the uncertainties of universal and existential quantifiers using third-order probability distributions.
A similar mode of thinking seems to make sense in the current context.

If some of the observations underlying a p-bit (w⁺, w⁻) are incorrect, then one views the pair (n⁺, n⁻) as actually representing a distribution P(n⁺,n⁻) over a set of values (n⁺ − k, n⁻ + k), where k may be positive or negative and generally one needs k ≪ n = n⁺ + n⁻ to have meaningful inferences. Under the simplest assumptions the mean of the distribution P(n⁺,n⁻) will be µ_P(n⁺,n⁻) = (n⁺, n⁻).

To cash out the implications of this expanded view of (n⁺, n⁻) semantics for conjunction and disjunction operators, one could define e.g. a conjunction operator

P(n₁⁺,n₁⁻) ⊓_P P(n₂⁺,n₂⁻)

which maps distributions to distributions. One could then define an intersection operator on p-bits via projecting to and from these distributions,

(n₁⁺, n₁⁻) ⊓∗_P (n₂⁺, n₂⁻) = µ_[P(n₁⁺,n₁⁻) ⊓_P P(n₂⁺,n₂⁻)]

(with some attention to make sure ⊓∗_P fulfills the t-norm requirements). (While we have used (n⁺, n⁻) language here, porting this to (w⁺, w⁻) language is immediate.)

Exactly what this distribution will look like will of course depend on how the distributions P(n⁺,n⁻) are constructed. However, qualitatively, as k/n → 0, ⊓∗_P will behave similarly to ⊤^SS_p as p → −∞. I.e., roughly speaking, p′ = −n/k for ⊓∗_P will behave much like p does for the SS t-norm.

From this view, the SS t-norm can be viewed as a heuristic approximation for ⊓∗_P, valuable for initial exploration because of its simple analytical form. The path from EE uncertainty to ⊓∗_P provides a conceptual framework in which the approximate mapping between paraconsistent and quantum logic naturally emerges.

The conceptual story then looks like: four-valued paraconsistent logic under small amounts of evidential-error uncertainty maps into quantum probability, within a close degree of approximation.
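A Monte Carlo sketch of the EE idea (our own toy model; here each observation's polarity independently flips with a small probability eps, one simple way of constructing the distribution P(n⁺,n⁻)):

```python
import random

def ee_sample(n_plus, n_minus, eps, rng):
    """Resample counts assuming each observation flips polarity w.p. eps."""
    flips_pm = sum(rng.random() < eps for _ in range(n_plus))   # + becomes -
    flips_mp = sum(rng.random() < eps for _ in range(n_minus))  # - becomes +
    return (n_plus - flips_pm + flips_mp, n_minus - flips_mp + flips_pm)

rng = random.Random(0)
samples = [ee_sample(95, 5, 0.02, rng) for _ in range(10_000)]
mean_plus = sum(s[0] for s in samples) / len(samples)
mean_minus = sum(s[1] for s in samples) / len(samples)

# Total evidence is conserved, and for small eps the mean of the
# distribution stays close to the observed counts (95, 5).
assert all(s[0] + s[1] == 100 for s in samples)
assert abs(mean_plus - 95) < 5 and abs(mean_minus - 5) < 5
```

Defining ⊓∗_P would then amount to applying a distribution-level conjunction to two such sampled ensembles and taking the mean of the result, as in the projection formula above.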
From Probabilistic Types to Quantum Types

Warrell [War16] has proposed a simple and appealing way of extending standard dependent type theory to include additional probabilistic types of interest for advanced AI. The basic concept is to augment standard constructive type theory with a new primitive random_ρ, which denotes sampling from a Bernoulli distribution, and which comes along with some (obvious) rules for probabilistically-weighted beta reduction on pseudo-expressions in the dependent type language:

random_ρ(τ) →_ρ β(τ true)
random_ρ(τ) →_(1−ρ) β(τ false)

which has the meaning that the type of the expression random_ρ(τ) inherits from τ true with probability ρ and inherits from τ false with probability 1 − ρ. (These are referred to as pseudo-expressions because they are not necessarily type-correct; the reduction rule can be applied even if the "type pseudo-expression" τ is not type-correct.)

In [Goe21] we have extended Warrell's approach via introducing constructs like

random_S(τ)
random_M(τ)

where S represents a situation randomly chosen from an ensemble thereof, or M represents a sub-metagraph drawn from a larger assumed metagraph.

The mapping described here would allow one to approximatively replace e.g. random_S(τ) with qrandom_S(τ), which chooses S from a (complex-number) amplitude distribution rather than a conventional real-number probability distribution. The approximate isomorphism explored here would lead to an approximate isomorphism between programs with probabilistic types corresponding to paraconsistent logic expressions, and programs with quantum types corresponding to quantum logical expressions.

One may speculate that this provides a potentially interesting novel source of quantum algorithms – by taking classical algorithms and "quantizing" them using EE-based or similarly-behaving transformations.

Conclusion

The argument given here, while conceptually compelling and apparently mathematically sound, is also somewhat sketchy as presented.
One suspects that filling in all the details in a fully rigorous way will lead to some additional discoveries, as well as perhaps more clarity on the limitations of the approach.

The core message – that complex-valued quantum probability can be approximatively viewed as a transformed version of 4-valued paraconsistent logic, that qubits can be viewed essentially as noisy transformed p-bits – is one that seems to have some subtlety to it from a philosophical and formal-logic view. Whether this approach to the quantum world has any concrete practical value remains to be seen, and there is clearly a fairly long road toward mining any such value. We have the intuition that quantum biology, quantum computing and especially quantum AI may be domains in which such exploration could be fruitful.
References

[DT06] Maurice J. Dupré and Frank J. Tipler. The Cox theorem: Unknowns and plausible value. arXiv preprint math/0611795, 2006.

[GIGH08] B. Goertzel, M. Iklé, I. Goertzel, and A. Heljakka. Probabilistic Logic Networks. Springer, 2008.

[GKS10] Philip Goyal, Kevin H. Knuth, and John Skilling. Origin of complex quantum amplitudes and Feynman's rules. Physical Review A, 81(2):022109, 2010.

[Goe] Ben Goertzel. Probability theory ensues from assumptions of approximate consistency: A simple derivation and its implications for AGI.

[Goe21] Ben Goertzel. Paraconsistent foundations for probabilistic reasoning, programming and concept formation, 2021.

[GPG13a] Ben Goertzel, Cassio Pennachin, and Nil Geisweiller. Engineering General Intelligence, Part 1: A Path to Advanced AGI via Embodied Learning and Cognitive Synergy. Springer: Atlantis Thinking Machines, 2013.

[GPG13b] Ben Goertzel, Cassio Pennachin, and Nil Geisweiller. Engineering General Intelligence, Part 2: The CogPrime Architecture for Integrative, Embodied AGI. Springer: Atlantis Thinking Machines, 2013.

[KMP13] Erich Peter Klement, Radko Mesiar, and Endre Pap. Triangular Norms, volume 8. Springer Science & Business Media, 2013.

[KS00] K. Knuth and J. Skilling. Foundations of inference. Axioms, 1(1), 2000.

[PPA98] Anna L. Patterson, Vaughan Pratt, and Gul Agha. Implicit Programming and the Logic of Constructible Duality. 1998.

[SS11] Berthold Schweizer and Abe Sklar. Probabilistic Metric Spaces. Courier Corporation, 2011.

[War16] Jonathan H. Warrell. A probabilistic dependent type system based on non-deterministic beta reduction. arXiv preprint arXiv:1602.06420, 2016.

[Web10] Zach Weber. A paraconsistent model of vagueness. Mind, 119(476):1025–1045, 2010.

[You94] Saul Youssef. Quantum mechanics as complex probability theory. Modern Physics Letters A, 1994.