The Past Hypothesis and the Nature of Physical Laws
Eddy Keming Chen * September 21, 2020
Forthcoming in Barry Loewer, Eric Winsberg, and Brad Weslake (eds.),
Time’s Arrows and the Probability Structure of the World, Harvard University Press.

Therefore I think it is necessary to add to the physical laws the hypothesis that in the past the universe was more ordered, in the technical sense, than it is today—I think this is the additional statement that is needed to make sense, and to make an understanding of the irreversibility.

Richard Feynman (1964 Messenger Lectures)
Abstract
If the Past Hypothesis underlies various arrows of time, what is the status of the Past Hypothesis? In this paper, I examine the role of the Past Hypothesis in the Boltzmannian account and defend the view that the Past Hypothesis is a candidate fundamental law of nature. Such a view is known to be compatible with Humeanism about laws, but as I argue, it is also supported by a minimal non-Humean “governing” conception of laws. Some worries arise from the non-dynamical and time-dependent character of the Past Hypothesis as a boundary condition, the intrinsic vagueness in its specification, and the nature of the initial probability distribution. I show that these worries do not have much force, and in any case, they become less relevant in a new quantum framework for analyzing time’s arrows—the Wentaculus. Hence, the view that the Past Hypothesis is a candidate fundamental law should be more widely accepted than it is now.
Keywords: time’s arrow, counterfactuals, laws of nature, vagueness, objective probabilities, typicality, scientific explanation, Past Hypothesis, Statistical Postulate, Humeanism, non-Humeanism, minimal primitivism, the Mentaculus, the Wentaculus, quantum statistical mechanics, density matrix realism

One of the hardest problems in the foundations of physics is the problem of the arrows of time. If the dynamical laws are (essentially) time-symmetric, what explains the irreversible phenomena in our experiences, such as the melting of ice cubes, the decaying of apples, and the mixing of cream in coffee? Macroscopic systems display an entropy gradient in their temporal evolutions: their thermodynamic entropy is lower in the past and higher in the future. But why does entropy have this temporally asymmetric tendency? Following Goldstein (2001), let us distinguish between the two parts of the problem of irreversibility:

1. The Easy Part: if a system is not at maximum entropy, why should its entropy tend to be larger at a later time?
2. The Hard Part: why should there be an arrow of time in our universe that is governed by fundamental reversible dynamical laws?

The Easy Part was studied by Boltzmann (1964)[1896]. Crucial to Boltzmann’s answer is this:

• Key to the Easy Part: states of larger entropy occupy much larger volume in the system’s phase space than those states of lower entropy.

So far, answering the Easy Part does not require any time-asymmetric postulates. Boltzmann’s program is primarily focused on closed subsystems of the universe. But its success leads us to expect that a Boltzmannian account can work at the universal level. If we model the universe as a mechanical system, we expect that typically, the non-equilibrium state of the universe will evolve towards higher entropy at later times. However, why is the entropy lower in the past? That is the Hard Part.
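The Key to the Easy Part can be made vivid with a toy combinatorial model (my own illustration, not from the text): take N particles, each of which can sit in the left or right half of a box, and count how many microstates realize each macrostate.

```python
from math import comb

# Toy model (my own illustration, not from the text): N particles, each in
# the left or right half of a box. A "microstate" assigns a side to every
# particle; a "macrostate" is the number k of particles on the left. The
# "volume" of macrostate k is the binomial coefficient C(N, k), which peaks
# sharply around k = N/2, the equilibrium-like macrostates.
N = 100
volumes = {k: comb(N, k) for k in range(N + 1)}
total = 2 ** N  # total number of microstates

# Fraction of all microstates lying in the near-equilibrium band 40 <= k <= 60:
near_eq = sum(v for k, v in volumes.items() if 40 <= k <= 60)
print(near_eq / total)  # about 0.96: higher-entropy macrostates dominate
```

Even at N = 100, a narrow band of near-equilibrium macrostates swallows almost the entire phase space; for thermodynamic N the dominance is overwhelmingly more extreme.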
A proposed answer suggests that it has to do with the initial condition of the universe:

• Key to the Hard Part: the universe had a special beginning.

We can introduce this as an explicitly time-asymmetric postulate in the theory, by using the Past Hypothesis:
Past Hypothesis (PH)
At the initial time of the universe, the microstate of the universe is in a low-entropy macrostate.

Given that some microstates are anti-entropic, it is standard to introduce a probability distribution over the microstates compatible with the low-entropy macrostate:
Statistical Postulate (SP)
The probability distribution of the initial microstate of the universe is given by the uniform one (according to the natural measure) that is supported on the macrostate of the universe.

However, a detailed probability distribution may be unnecessary. In the typicality framework, we just need to be committed to a typicality measure:
Typicality Postulate (TP)
The initial microstate of the universe is typical inside the macrostate of the universe.

Unlike SP, TP is compatible with a variety of measures that agree on what is typical. PH, SP, and TP are physical postulates that have an empirical status. The answer to the Hard Part of the problem of irreversibility requires PH and either SP or TP. In fact, we also need to assume (an unconditionalized) notion of probability or typicality to answer the Easy Part. I call the answers to the Easy Part and the Hard Part the Boltzmannian account of the arrow of time.

[Footnote: Boltzmann’s Stosszahlansatz (hypothesis of molecular chaos) is often blamed for introducing an illicit time asymmetry. But it is an innocent theoretical postulate if we understand it correctly—as a typicality or probability measure over initial conditions. See (Goldstein et al.).]

[Footnote: The Past Hypothesis was originally suggested in (Boltzmann 1964)[1896] (although he seems to favor another postulate that can be called the Fluctuation Hypothesis) and discussed in (Feynman 2017)[1965]. For recent discussions, see (Albert 2000), (Goldstein 2001), (Callender 2004, 2011), (Lebowitz 2008), (North 2011), (Loewer 2020), and (Goldstein et al.).]

[Footnote: For more on the notion of typicality and its application in statistical mechanics, see (Goldstein 2012), (Lazarovici & Reichert 2015), and (Wilhelm 2019).]

How to characterize the initial macrostate of PH remains an open question. We know that the matter distribution is more or less uniform in the early universe, which is contrary to the usual conception of low entropy. However, the initial gravitational degrees of freedom may be in a state of very low entropy, as suggested by Penrose’s (1979)
Weyl Curvature Hypothesis: the Weyl curvature vanishes at the initial singularity. The urgent question is, of course, how to understand this in terms of quantum theory or quantum gravity. Some steps have been taken in the Loop Quantum Cosmology framework by Ashtekar & Gupt (2016). This is compatible with a Boltzmannian account, but the final details will depend on the exact theory of quantum gravity, which is currently absent.

In the philosophical literature, a number of objections have been raised against the Boltzmannian account. First, some criticize the answer to the Easy Part: the explanation is too hand-wavy and not completely rigorous (Frigg 2007). Second, some criticize the answer to the Hard Part, such as that PH is not even false because the entropy of the early universe is not well-defined (Earman 2006), or that PH is not sufficient to explain the thermodynamic behaviors of subsystems (Winsberg 2004), or that it is ad hoc, and therefore, not explanatory (Price 2004; Carroll 2010, p.346). These are responses in the literature, and more work on these issues is certainly welcome. However, my interest here is different. I take the Boltzmannian account as a starting point. My aim is to explore the conceptual and scientific ramifications of accepting PH and its explanation of time’s arrow.

In this paper, I focus on the connection between PH and our concept of fundamental laws of nature. What is the status of PH if the Boltzmannian program turns out to be successful? Can PH be accepted as a candidate fundamental law even though it is a boundary condition of the universe? What differences does it make to our concepts of laws, chances, and possibilities? Can PH be completely expressed in mathematical language? What is the relevance of quantum theory to these issues? I argue for the following theses:

Nomic Status
The Past Hypothesis is a candidate fundamental law of nature.

Axiomatic Status
The Past Hypothesis is a candidate axiom of the fundamental physical theory.

Relevance
Whether the Past Hypothesis has nomic status (and/or axiomatic status) is relevant to the success of explaining time’s arrows, the metaphysical account of laws, the nature of objective probability, and the mathematical expressibility of fundamental physical theory.

Some of these ideas have been defended along Humean lines, but I think we should accept them regardless of whether we think fundamental laws supervene on the matter distribution or are part of the fundamental facts of the world.

[Footnote: A candidate fundamental law of nature has all it takes to be a fundamental law of nature, but it may not turn out to be the true law of the actual world if it makes false predictions. For example, Newton’s dynamical law F = ma is a candidate fundamental law of nature, but it is not the actual fundamental law.]

[Footnote: The status of a candidate fundamental law of nature and the status of a candidate axiom of the fundamental physical theory may be equivalent. I distinguish the two theses because some people may be happy to accept one but deny the other. They may be reluctant to call PH a fundamental law, perhaps due to its being a boundary condition.]

My methodology below is naturalistic and functionalist.
Whatever plays the role of a fundamental law can be a fundamental law. Whatever cannot be derived from more fundamental laws and plays the right roles—guiding our inferences about the past and the future, underlying various scientific explanations, high-level regularities, our manifest image of influence and control, and so on—is a candidate fundamental law. To borrow a phrase from Loewer (2020), PH “looks, walks and talks” like a fundamental law. So, we should interpret it as such. Hence, I disagree with people who think that even if PH is true and plays all the roles we suggest, it still cannot be a fundamental law—it may just be a special but contingent initial condition. I also disagree with people who think that whether or not PH is a fundamental law makes no substantive differences.

Here is the roadmap. In §2, I provide more details of the Boltzmannian account and discuss the modifications to PH when we move from classical mechanics to quantum mechanics. The variations result in three types of physical theories: the Classical Mentaculus, the Quantum Mentaculus, and the Wentaculus theories. In §3, I provide positive arguments for the nomic and axiomatic statuses of PH. Some of these have been mentioned in the literature, but it is worth emphasizing and clarifying the exact argumentative structure. I also put forward a novel argument based on considerations about the nature of the quantum state. In §4, I discuss some apparent obstacles to recognizing PH as a fundamental law. These have to do with its nature as a boundary condition, the status of the Statistical Postulate and the Typicality Postulate, and the intrinsic vagueness in their specifications. I argue that these worries do not have much force even on some non-Humean views, and they become even less worrisome in the Wentaculus theory.
Before we get into the philosophical and conceptual issues, let us be more explicit about what the Boltzmannian account is and how to state PH in that account. Although the Boltzmannian account is more or less the same in classical and in quantum theories, the exact form of PH is subtly different. I will exploit this difference in §4 to dissolve some of the worries about the classical version of PH. Readers familiar with standard Boltzmannian statistical mechanics can jump to §2.3, where a new framework called the Wentaculus is introduced.

[Footnote: This view is, I think, in the same spirit as the suggestion made by Demarest (2019). I do not claim that we should reduce laws to these roles; that would be the strategy of metaphysical functionalism about laws. Rather, I am merely appealing to the methodology in naturalistic metaphysics of science that I think many people accept independently of the issue of the arrows of time.]

[Footnote: Maudlin (2007) §4 seems to regard PH as an important boundary condition but does not think of it as a fundamental law. The disagreement is based on a different view about how laws govern that I call Dynamical Law Primitivism, which I discuss in §3.4. Carroll (2010) (p.345) suggests that there is no substantive difference between the statements “the early universe had a low entropy” and “it is a law of physics that the early universe had a low entropy.” Carroll seems to be worried about the distinction between boundary conditions and laws; I discuss this in §4.1.]

2.1 The Classical Mentaculus

Let us start with the Boltzmannian account in classical statistical mechanics and summarize its basic elements from the “individualistic viewpoint.” Let us consider a classical-mechanical system with N particles in a box of volume Λ = [0, L]³ ⊂ ℝ³ and a Hamiltonian H = H(X) = H(q_1, ..., q_N; p_1, ..., p_N) that specifies the standard interactions in accord with Newtonian gravitation, Coulomb’s law, and other forces obeyed by the classical system.
1. Microstate: at any time t, the microstate of the system is given by a point in a 6N-dimensional phase space,

X = (q_1, ..., q_N; p_1, ..., p_N) ∈ Γ_total ⊆ ℝ^{6N},  (1)

where Γ_total is the total phase space of the system.

2. Dynamics: the time dependence of X_t = (q_1(t), ..., q_N(t); p_1(t), ..., p_N(t)) is given by the Hamiltonian equations of motion:

dq_i(t)/dt = ∂H/∂p_i,  dp_i(t)/dt = −∂H/∂q_i.  (2)

3. Energy shell: the physically relevant part of the total phase space is the energy shell Γ ⊆ Γ_total, defined as:

Γ = {X ∈ Γ_total : E ≤ H(X) ≤ E + δE}.  (3)

We only consider microstates in Γ.
4. Measure: the measure μ_V is the standard Lebesgue measure on phase space, which is the volume measure on ℝ^{6N}. The Lebesgue measure on a finite volume can be normalized to yield a probability distribution.

5. Macrostate: with a choice of macro-variables, the energy shell Γ can be partitioned into macrostates Γ_ν:

Γ = ⋃_ν Γ_ν.  (4)

A macrostate is composed of microstates that share similar macroscopic features (i.e., similar values of the macro-variables), such as volume, density, and pressure. Caveat: the partition of microstates into macrostates is exact only after we stipulate some choices of the parameters for coarse-graining (the size of the cells) and correspondence (between functions on phase space and thermodynamic quantities). We call these C-parameters. Without exact choices of the C-parameters, the partition is inexact and the boundaries between macrostates are vague. See Figure 1. It is also expected that, given the nature of the actual forces, some partitions will be superior to others in supporting generalizations in the special sciences.

[Footnote: I follow the discussion in (Goldstein & Tumulka 2011). These are not intended to be rigorous axiomatizations of classical statistical mechanics. What is presented here differs in emphasis from (Chen 2018), as here I am explicit about the sources of vagueness, which will be discussed in §4.]

[Footnote: For more discussion, see (Chen 2020b).]

[Figure 1: The partition of microstates into macrostates on phase space: (a) without exact choices of the C-parameters (vague boundaries); (b) with exact choices of the C-parameters (exact boundaries). X represents the actual microstate of the universe at t_0. M represents the vague boundaries of the PH macrostate. Γ represents an admissible precisification of M, and Γ′ represents another admissible precisification. The diagrams are not drawn to scale.]
6. Unique correspondence: given exact choices of the C-parameters, the macrostates partition the energy shell, and as a consequence, every phase point X belongs to one and only one Γ_ν.

7. Thermal equilibrium: typically, there is a dominant macrostate Γ_eq that has almost the entire volume with respect to μ_V:

μ_V(Γ_eq) / μ_V(Γ) ≈ 1.  (5)

A system is in thermal equilibrium if its phase point X ∈ Γ_eq.
8. Boltzmann Entropy: the Boltzmann entropy of a classical-mechanical system in microstate X is given by:

S_B(X) = k_B log(μ_V(Γ(X))),  (6)

where Γ(X) denotes the macrostate containing X. The thermal equilibrium state thus has maximum entropy. Caveat: without exact values of the C-parameters, there will be many admissible choices of the Γ(X)’s. Moreover, what is admissible is also vague. Since k_B is a scaling constant that plays no direct dynamical role, its value is also vague. Hence, the Boltzmann entropy of a microstate should be understood as a vague quantity. If we stipulate some C-parameters and the value of k_B, we can arrive at an exact boundary for the macrostate that contains X and an exact value of Boltzmann entropy for the system.

9. Low-Entropy Initial Condition: on the assumption that we can model the universe as a classical-mechanical system of N point particles, we postulate a special low-entropy boundary condition, which Albert (2000) calls the Past Hypothesis (PH):

X_{t_0} ∈ Γ_PH,  μ_V(Γ_PH) ≪ μ_V(Γ_eq) ≈ μ_V(Γ),  (7)

where Γ_PH is the PH macrostate with a volume much smaller than that of the equilibrium macrostate. Hence, S_B(X_{t_0}), the Boltzmann entropy of the microstate at the boundary, is very small compared to that of thermal equilibrium. Here, Γ_PH is underspecified; we can add further details to specify the macroscopic profile (temperature, pressure, volume, density) of Γ_PH.

The answer to the Easy Part of the problem of irreversibility lies in the first eight bullet points, which make plausible the hypothesis of the typical tendency for a system to evolve to higher entropy towards the future. Even though microstates are “created equal,” macrostates are not. Their volumes are disproportionate and uneven. Macrostates with higher entropy have much larger volume in the energy shell. Furthermore, the largest macrostate is by far that of thermal equilibrium.
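The Hamiltonian dynamics of equation (2) is deterministic and time-reversible, which is why the uneven volumes of macrostates, and not the equations of motion alone, must carry the explanatory asymmetry. Here is a minimal numerical sketch of that reversibility (my own toy example, not from the text, assuming a 1-D harmonic oscillator):

```python
# Minimal sketch (my own toy example, not from the text): Hamilton's
# equations (2) for a 1-D harmonic oscillator H = (p^2 + q^2)/2, integrated
# with a leapfrog scheme. The dynamics is time-reversible: evolve forward,
# flip the momentum, evolve forward again, and the trajectory retraces
# itself back to the starting point.
def leapfrog(q, p, dt, steps):
    for _ in range(steps):
        p -= 0.5 * dt * q   # half kick:  dp/dt = -dH/dq = -q
        q += dt * p         # full drift: dq/dt =  dH/dp =  p
        p -= 0.5 * dt * q   # half kick
    return q, p

q0, p0 = 1.0, 0.0
q1, p1 = leapfrog(q0, p0, dt=0.01, steps=1000)    # forward evolution
q2, p2 = leapfrog(q1, -p1, dt=0.01, steps=1000)   # momentum-reversed run
print(abs(q2 - q0) < 1e-6, abs(-p2 - p0) < 1e-6)  # True True
```

The momentum-reversed run returns the system to its starting point (up to floating-point roundoff), a small-scale analogue of the time symmetry that generates the Hard Part.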
It is plausible that, unless the dynamics are extremely contrived, a typical microstate starting from a medium-entropy macrostate will find its way through larger and larger macrostates and eventually arrive at thermal equilibrium. That is a process in which a system’s entropy gradually increases until it reaches the entropy maximum. Of course, for the actual universe, there can be exceptions to the entropy increase, such as short-lived fluctuations in which entropy decreases.

However, this solves only half the problem. If typical microstates compatible with a medium-entropy macrostate will, at most times, increase in entropy towards the future, then typical microstates compatible with the same macrostate will also, at most times, increase in entropy towards the past. Hence, given the resources so far, we have shown that the medium-entropy macrostate is overwhelmingly likely to be at an entropy minimum produced by a thermodynamic fluctuation from equilibrium. We are led to the Hard Part of the problem: why is the entropy so much lower in the past direction of time?

Enter the Past Hypothesis. Given PH, the actual microstate starts in an atypical region of the energy shell, in a low-entropy macrostate M. Suppose we choose a precisification Γ. Given the Easy Part, typical initial microstates compatible with Γ will evolve towards macrostates of higher entropy in the future direction. But there is nothing earlier than t_0, as it is stipulated to be the initial time—say, the time of the Big Bang.

[Footnote: Boltzmann’s original H-theorem (Boltzmann 1964)[1896] is an attempt to show this. Lanford (1975) produces an exact result for a simple system of hard spheres where the Boltzmann equation is shown to be satisfied for a short duration of time, and hence, Boltzmann entropy is shown to be increasing towards the future. However, it is plausible that the equation continues to be valid and Boltzmann entropy continues to rise afterwards.]
Therefore, if we find the universe to be in a medium-entropy macrostate Γ_t, say the state we are in right now, then the actual microstate is not like a typical microstate inside Γ_t, but a special one that is compatible with Γ. The reason that the entropy was lower in the past is that the universe started in a special macrostate, a state of very low Boltzmann entropy. Assuming PH, it is reasonable to expect with overwhelming probability that entropy will be higher in the future and was lower in the past, where the sense of probability is given by the natural measure of bullet point 4. There are two ways to interpret that measure:

• A measure of probability: the natural measure picks out the correct probability measure of the initial condition. This interpretation yields the Statistical Postulate.

• A measure of typicality: the natural measure is a simple representer of a vague “collection” of measures that are equivalent as the measure of typicality of the initial condition. This interpretation yields the Typicality Postulate.

PH together with the Statistical Postulate supports the following classical-mechanical version of the Second Law of Thermodynamics (this is adapted from the Mathematical Second Law described in (Goldstein et al.)):
The Second Law for X

At t_0, the actual phase point of the universe X starts in a low-entropy macrostate and, with overwhelming probability, it evolves towards macrostates of increasingly higher entropy until it reaches thermal equilibrium, except possibly for entropy decreases that are infrequent, shallow, and short-lived; once X_t reaches Γ_eq, it stays there for an extraordinarily long time, except possibly for infrequent, shallow, and short-lived entropy decreases.

The Second Law can also be stated in the language of typicality. For simplicity, I will conduct the discussion below mostly in the language of probability.

The Second Law above is stated for the behavior of the universe, but it also makes plausible what Goldstein et al. (2020) call a “development conjecture” about isolated subsystems in the universe:

Development Conjecture
Given PH, an isolated system that, at a time t before thermal equilibrium of the universe, has macrostate ν appears macroscopically in the future, but not the past, of t like a system that at time t is in a typical microstate compatible with ν.

Classical mechanics with just the fundamental dynamical laws (expressed in equations (2)) is time-symmetric. Introducing the probability measure takes care of the Easy Part of the problem of irreversibility, but to solve the Hard Part of the problem—the retrodiction to the past, we need to explicitly introduce something that breaks the time symmetry. PH is a simple postulate that does the job.

[Footnote: It can also be stipulated that t_0 is some time close to the Big Bang, in which case some anti-thermodynamic behavior can be displayed in the short duration before t_0.]

The Classical Mentaculus

1. Fundamental Dynamical Laws (FDL): the classical microstate of the universe is represented by a point in phase space that obeys the Hamiltonian equations of motion described in equations (2).
2. The Past Hypothesis (PH): at a temporal boundary of the universe, the microstate of the universe lies inside M, a low-entropy macrostate that, given a choice of C-parameters, corresponds to Γ, a small-volume set of points on phase space that are macroscopically similar.

3. The Statistical Postulate (SP): given the macrostate M, we postulate a uniform probability distribution (with respect to the standard Lebesgue measure) over the microstates compatible with M.

This account therefore assigns probability 1 to the initial macrostate. If the probability distribution is given the status of objective probability, it delivers more than just the Second Law. It provides an exact probability for any proposition formulable in the language of phase space. This is the reason that Albert and Loewer regard the Mentaculus as providing a “probability map of the world.” Hence, the Mentaculus has an ambitious scope: it is possible to recover all the non-fundamental regularities, including the special science laws (such as laws of economics), and other arrows of time such as the epistemic arrow, the records arrow, the influence arrow, and the counterfactual arrow. Whether the Albert-Loewer project can succeed in its ambitious goal of recovering all the non-fundamental regularities and arrows of time is an interesting question. Nonetheless, the Classical Mentaculus as formulated provides an underpinning for the thermodynamic arrow of time. Given the universality and importance of the Second Law, the Mentaculus should be taken as a serious contender for a promising framework of the structure of a fundamental physical theory. In the next subsection, I examine how to adapt the Classical Mentaculus to the quantum domain. In §3, I discuss the suggestion that PH should be taken as a candidate fundamental law and some ramifications of the more ambitious project.
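SP's uniform distribution is just the Lebesgue measure restricted to the macrostate and normalized, so the probability of any sub-region is its volume fraction. A hedged sketch of what that means operationally (my own toy illustration, with a two-dimensional disk standing in for a macrostate region):

```python
import random

# Toy sketch (my own, not the paper's): treat the unit disk in a 2-D
# "phase space" as the macrostate M, and the normalized Lebesgue (area)
# measure on it as the SP distribution. Rejection sampling from the
# bounding square yields uniform samples; the probability of a sub-region
# is then its area fraction.
def sample_disk(rng):
    while True:
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            return x, y

rng = random.Random(0)
n = 100_000
hits = 0
for _ in range(n):
    x, y = sample_disk(rng)
    if x > 0:  # sub-region: the right half-disk, half the total area
        hits += 1
print(hits / n)  # close to 0.5, the area fraction of the half-disk
```

The design point is that nothing beyond the volume measure is being postulated: once M and the measure are fixed, every probability of the form "the microstate lies in region R" is determined.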
2.2 The Quantum Mentaculus

Let us turn to the Boltzmannian account of quantum statistical mechanics from the “individualist viewpoint.” Consider a quantum-mechanical system with N fermions in a box Λ = [0, L]³ ⊂ ℝ³ and a Hamiltonian Ĥ. (Here, I follow the discussions in (Goldstein et al.).)

1. Microstate: at any time t, the microstate of the system is given by a normalized (and anti-symmetrized) wave function:

ψ(q_1, ..., q_N) ∈ H_total = L²(Λ^N, ℂ^k),  ‖ψ‖_{L²} = 1,  (8)

where H_total = L²(Λ^N, ℂ^k) is the total Hilbert space of the system.

2. Dynamics: the time dependence of ψ(q_1, ..., q_N; t) is given by the Schrödinger equation:

iħ ∂ψ/∂t = Ĥψ.  (9)

3. Energy shell: the physically relevant part of the total Hilbert space is the subspace (“the energy shell”):

H ⊆ H_total,  H = span{φ_α : E_α ∈ [E, E + δE]}.  (10)

This is the subspace (of the total Hilbert space) spanned by energy eigenstates φ_α whose eigenvalues E_α belong to the range [E, E + δE]. Let D = dim H, the number of energy levels between E and E + δE. We only consider wave functions ψ in H.

4. Measure: given a subspace H, the measure μ_S is the surface area measure on the unit sphere S(H) in that subspace.
5. Macrostate: with a choice of macro-variables, the energy shell H can be orthogonally decomposed into macro-spaces (subspaces):

H = ⊕_ν H_ν,  ∑_ν dim H_ν = D.  (11)

Each H_ν corresponds to small ranges of values of macro-variables that are chosen in advance. Caveat: similarly to the classical case, the decomposition of Hilbert space into macrostates requires some stipulation of the exact values of the C-parameters. But in the quantum case, these parameters include coarse-graining sizes, correspondences of functions, and also the cut-off values of how much support a quantum state needs to have inside a subspace to count as belonging to the macrostate (see the next bullet point). Without exact choices of the C-parameters, the decomposition is inexact and it is vague which microstate belongs to which macrostate. Again, it is also expected that, given the nature of the actual forces, some decompositions will be superior to others for supporting generalizations in the special sciences.

[Footnote: For simplicity, let us assume that the subspaces we deal with are finite-dimensional. In cases where the Hilbert space is infinite-dimensional, it is an open and challenging technical question. For example, we could use Gaussian measures in infinite-dimensional spaces, but we no longer have uniform probability distributions.]

[Footnote: For technical reasons, von Neumann (1955) suggests that we round up these macro-variables (represented by quantum observables) so as to make the observables commute. See (Goldstein et al.).]
6. Non-unique correspondence: typically, a wave function is in a superposition of macrostates and is not entirely in any one of the macrostates (even if we represent macrostates with exact subspaces). However, we can make sense of situations where ψ is (in the Hilbert space norm) very close to a macrostate H_ν:

⟨ψ | P_ν | ψ⟩ ≈ 1,  (12)

where P_ν is the projection operator onto H_ν. This means that |ψ⟩ lies almost entirely in H_ν. In this case, we say that |ψ⟩ is in macrostate ν.

7. Thermal equilibrium: typically, there is a dominant macrostate H_eq that has a dimension almost equal to D:

dim H_eq / dim H ≈ 1.  (13)

A system with wave function ψ is in equilibrium if ψ is very close to H_eq in the sense of (12): ⟨ψ | P_eq | ψ⟩ ≈ 1.
8. Boltzmann Entropy: the Boltzmann entropy of a quantum-mechanical system with wave function ψ that is in macrostate ν is given by:

S_B(ψ) = k_B log(dim H_ν),  (14)

where H_ν denotes the subspace containing almost all of ψ in the sense of (12). The thermal equilibrium state thus has the maximum entropy:

S_B(eq) = k_B log(dim H_eq) ≈ k_B log(D),  (15)

where eq denotes the equilibrium macrostate.

9. Low-Entropy Initial Condition: on the assumption that we can model the universe as a quantum-mechanical system, let us postulate a special low-entropy boundary condition on the universal wave function—the quantum-mechanical version of PH:

Ψ(t_0) ∈ H_PH,  dim H_PH ≪ dim H_eq ≈ dim H,  (16)

where H_PH is the PH macro-space with dimension much smaller than that of the equilibrium macro-space. Hence, the initial state has very low entropy in the sense of (14). More details can be added to narrow down the range of choices of H_PH.

The quantum Boltzmannian account is similar to the classical one. The higher-entropy macrostates have much higher dimensions than lower-entropy ones, and the equilibrium macrostate has by far the largest dimension.
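The quantum notions of macrostate membership (12) and Boltzmann entropy (14) can be illustrated in a small toy Hilbert space (my own example with assumed dimensions, not the paper's):

```python
import math

# Toy sketch (my own, not the paper's): a 6-dimensional "energy shell"
# decomposed into macro-spaces H_1 = span{e0, e1} and H_2 = span{e2,...,e5}.
# For a real state vector, <psi|P_nu|psi> is the sum of squared amplitudes
# over the basis vectors spanning H_nu, as in (12).
psi = [0.0, 0.0, 0.5, 0.5, 0.5, 0.5]      # normalized: squares sum to 1
weight_H2 = sum(a * a for a in psi[2:6])  # <psi|P_2|psi>
print(weight_H2)                          # 1.0: psi lies in macrostate 2

# Boltzmann entropy (14) with k_B set to 1 for illustration:
S_1, S_2 = math.log(2), math.log(4)       # log of dim H_1, dim H_2
print(S_2 > S_1)  # True: the larger macro-space has higher entropy
```

A state with support spread across several macro-spaces would give every ⟨ψ|P_ν|ψ⟩ a value well below 1, which is exactly the non-unique correspondence flagged in bullet point 6.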
It is plausible that, unless the dynamics is very contrived, a medium-entropy wave function will find its way through macro-spaces of higher and higher dimensions and eventually arrive at thermal equilibrium. With overwhelming probability the entropy is higher in the future and lower in the past, with the probability measure specified in bullet point 4.

[Footnote: Again, we assume that H_PH is finite-dimensional, in which case we can use the surface area measure on the unit sphere as the typicality measure.]

The Second Law for Ψ

At t_0, the actual wave function of the universe Ψ starts in a low-entropy macrostate, and with overwhelming probability, it evolves towards macrostates of increasingly higher entropy until it reaches thermal equilibrium, except possibly for entropy decreases that are infrequent, shallow, and short-lived; once Ψ_t reaches H_eq, it stays there for an extraordinarily long time, except possibly for infrequent, shallow, and short-lived entropy decreases.

(This makes plausible a similar Development Conjecture for typical isolated subsystems.)

Note again that we stipulate three basic postulates in the quantum version of the Boltzmannian account: the fundamental dynamical laws, PH, and SP. Let us call this the Quantum Mentaculus:

The Quantum Mentaculus

1. Fundamental Dynamical Laws (FDL): the quantum microstate of the universe is represented by a wave function Ψ that obeys the Schrödinger equation (9).

2. The Past Hypothesis (PH): at a temporal boundary of the universe, the wave function Ψ of the universe lies inside a low-entropy macrostate that, given a choice of C-parameters, corresponds to H_PH, a low-dimensional subspace of the total Hilbert space.

3. The Statistical Postulate (SP): given the subspace H_PH, we postulate a uniform probability distribution (with respect to the surface area measure on the unit sphere of H_PH) over the wave functions compatible with H_PH.

The Quantum Mentaculus, as a candidate fundamental theory of physics, faces the quantum measurement problem.
To solve the measurement problem, there arethree promising options: Everettian quantum mechanics, Bohmian mechanics, andGRW spontaneous collapse theories. We have three distinct kinds of the QuantumMentaculus.First, the Everettian version is completely the same as the original QuantumMentaculus in terms of the basic postulates. However, it diverges greatly from common13ense: we have to give up the expectation that experimental outcomes are unique anddeterminate. Instead, our experiences are to be understood as experiences of agents inan emergent multiverse (Wallace 2012).Second, the Bohmian version posits that in addition to the wave function, whichevolves unitarily according to the Schrödinger equation, particles have precise locations,and their configuration Q = ( Q , Q , ..., Q N ) follows the guidance equation, which is anadditional law in the theory: dQ i dt = ̵ hm i Im ∇ i ψ ( q ) ψ ( q ) ( q = Q ) (17)Moreover, the initial particle distribution is given by the quantum equilibrium distribu-tion: ρ t ( q ) = ∣ ψ ( q , t )∣ (18)Adding the above two postulates to the Quantum Mentaculus completes the BohmianMentaculus.Third, the GRW version requires revisions to the linear evolution represented by theSchrödinger equation. The wave function typically obeys the Schrödinger equation,but the linear evolution is interrupted randomly (with rate N λ , where N is the numberof particles and λ is a new constant of nature of order 10 − s − ) by collapses: Ψ T + = Λ k ( X ) / Ψ T − ∣∣ Λ k ( X ) / Ψ T − ∣∣ , (19)where Ψ T − is the pre-collapse wave function, Ψ T + is the post-collapse wave func-tion, the collapse center X is chosen randomly with probability distribution ρ ( x ) =∣∣ Λ k ( x ) / Ψ T − ∣∣ dx , k ∈ { , , ... 
N} is chosen randomly with uniform distribution on that set of particle labels, and the collapse rate operator is defined as:

Λ_k(x) = (2πσ²)^{−3/2} exp(−(Q̂_k − x)²/(2σ²)), (20)

where Q̂_k is the position operator of "particle" k, and σ is another new constant of nature, of order 10⁻⁷ m, postulated in current GRW theories. The GRW Mentaculus replaces the deterministic Schrödinger evolution of the wave function by this stochastic process. It still requires PH. However, as Albert (2000) §7 points out, it is plausible (though not proven) that SP is no longer needed, and that the GRW collapses suffice to make anti-entropic trajectories unlikely (through the quantum probabilities stipulated by the GRW stochastic process). (See Ismael's contribution in this volume.)

In this subsection, let us consider the Boltzmannian account of quantum statistical mechanics with a very special "fundamental density matrix." This account is inspired by (Dürr et al. 2005), in which the density matrix plays the same dynamical role as the wave function does in the previous theories. In a quantum system represented by a density matrix W, W is the complete characterization of the quantum state; it does not necessarily refer to a statistical state representing our ignorance of the underlying wave function. In general, W can be a pure state or a mixed state. A density matrix Ŵ is pure if Ŵ = |ψ⟩⟨ψ| for some |ψ⟩. Otherwise it is mixed. For a spinless N-particle quantum system, a density matrix of the system is a positive, bounded, self-adjoint operator Ŵ: H → H with tr Ŵ =
1, where H is the Hilbert space of the system. In terms of the configuration space R^{3N}, the density matrix can be viewed as a function W: R^{3N} × R^{3N} → C. In the unitary case, Ŵ always evolves deterministically according to the von Neumann equation:

iħ dŴ(t)/dt = [Ĥ, Ŵ]. (21)

Equivalently:

iħ ∂W(q, q′, t)/∂t = Ĥ_q W(q, q′, t) − Ĥ_{q′} W(q, q′, t), (22)

where Ĥ_q means that the Hamiltonian Ĥ acts on the variable q. The von Neumann equation generalizes the Schrödinger equation (9).

Importantly, now the "fundamental" quantum state can be either pure or mixed. Even when it is mixed, there is no underlying pure state that is more basic. The mixed state is completely objective. This perspective, called Density Matrix Realism, is in sharp contrast to the prevalent view called
Wave Function Realism. In the density-matrix realist framework, we need to modify the Boltzmannian quantum statistical mechanics described in the earlier section. Here are the key changes:

• Microstate: at any time t, the microstate of the system is given by a density matrix Ŵ(t) that can be pure or mixed. (Macrostates are still represented by orthogonal subspaces of the energy shell.)

• Dynamics: in the unitary case, the density matrix Ŵ(t) evolves according to the von Neumann equation (21).

• Being in a macrostate: typically, a density matrix is a superposition of macrostates and is not entirely in any one of the macrospaces. However, we can make sense of situations where Ŵ(t) is very close to a macrostate H_ν:

tr(Ŵ(t) I_ν) ≈ 1, (23)

where I_ν is the projection operator onto H_ν. This means that almost all of Ŵ(t) is in H_ν. In this situation, we say that Ŵ(t) is in macrostate H_ν.

• Thermal equilibrium: typically, there is a dominant macro-space H_eq whose dimension is almost equal to the dimension of the total Hilbert space:

dim H_eq / dim H ≈ 1. (24)

A system with density matrix Ŵ(t) is in equilibrium if Ŵ(t) is very close to H_eq in the sense of (23): tr(Ŵ(t) I_eq) ≈ 1.

• Boltzmann entropy: the Boltzmann entropy of a quantum-mechanical system with density matrix Ŵ(t) that is very close to a macrostate H_ν is given by:

S_B(Ŵ(t)) = k_B log(dim H_ν), (25)

where Ŵ(t) is in macrostate H_ν in the sense of (23).

Next, let us consider how to adapt PH in a density-matrix realist framework. The wave-function version of PH says that every initial wave function is entirely contained in the PH subspace H_PH. Similarly, for density-matrix theories, we can propose that every initial density matrix is entirely contained in the PH subspace:

tr(Ŵ(t_0) I_PH) = 1, with dim H_PH ≪ dim H_eq ≈ dim H, (26)

where I_PH is the projection operator onto the PH subspace.
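The macrostate-membership criterion (23) and the Boltzmann entropy (25) can be checked on a toy density matrix (a sketch of my own, not from the text; the dimension and weights are arbitrary illustrative values):

```python
import numpy as np

D = 16  # toy dimension of the Hilbert space (assumption)

# Projector I_nu onto a toy 3-dimensional macro-space H_nu,
# spanned by the first three basis vectors.
I_nu = np.zeros((D, D))
I_nu[:3, :3] = np.eye(3)

# A mixed density matrix concentrated almost entirely in H_nu:
# positive, self-adjoint, and of unit trace.
w = np.array([0.5, 0.3, 0.199, 0.001] + [0.0] * 12)
W = np.diag(w)
assert abs(np.trace(W) - 1) < 1e-9

# Criterion (23): tr(W I_nu) ≈ 1, so W counts as being in macrostate H_nu.
overlap = np.trace(W @ I_nu).real
assert overlap > 0.99

# Boltzmann entropy (25): S_B = k_B log(dim H_nu).
k_B = 1.380649e-23  # J/K
S_B = k_B * np.log(3)
```

The point of the sketch is that "being in a macrostate" is a matter of trace overlap, not of strict containment: W here has a small tail outside H_ν yet still satisfies (23).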
Assuming H_PH is finite-dimensional, there is also a natural probability distribution over all density matrices inside this subspace. See (Chen & Tumulka 2020) for a mathematical characterization. The probability distribution and the density-matrix Past Hypothesis support a Second Law for W, which is similar to the Second Law for Ψ.

However, there is an even more natural way to implement the idea of PH in the density-matrix framework, which I favor. PH picks out a particular subspace H_PH. It is canonically associated with its projection I_PH. In matrix form, I_PH can be represented as a block-diagonal matrix that has a k × k identity block, with k = dim H_PH, and zeros everywhere else. There is a natural density matrix associated with I_PH, namely the normalized projection I_PH / dim H_PH. Hence, we have picked out the natural density matrix associated with the PH subspace. I propose that the initial density matrix is the normalized projection onto H_PH:

Ŵ_IPH(t_0) = I_PH / dim H_PH. (27)

I call this postulate the Initial Projection Hypothesis (IPH) in (Chen 2018). Crucially, it is different from (16) and (26); while IPH picks out a unique quantum state given PH, the other two permit infinitely many possible quantum states inside the PH subspace. Remarkably, we no longer need a fundamental postulate about probability or typicality for the quantum state. We know that we can decompose a density matrix non-uniquely into a probability-weighted average of pure states, and in the canonical way we can decompose Ŵ_IPH(t_0) as an integral of pure states on the unit sphere of H_PH with respect to the uniform probability distribution:

Ŵ_IPH(t_0) = ∫_{S(H_PH)} μ(dψ) |ψ⟩⟨ψ|. (28)

The decomposition here is not an intrinsic expression of what Ŵ_IPH(t_0) is, as witnessed by the non-uniqueness.
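A toy numerical check of IPH (my own sketch, with an arbitrary stand-in for dim H_PH): the normalized projection (27) has unit trace, is mixed whenever dim H_PH > 1, and is recovered, in the spirit of decomposition (28), as the Monte Carlo average of pure states drawn uniformly from the unit sphere of H_PH.

```python
import numpy as np

rng = np.random.default_rng(1)
k = 4  # toy value of dim H_PH (assumption)

# IPH: the initial density matrix is the normalized projection (27),
# working directly in the k-dimensional subspace H_PH.
W_IPH = np.eye(k) / k

assert abs(np.trace(W_IPH) - 1) < 1e-12            # a valid density matrix
assert abs(np.trace(W_IPH @ W_IPH) - 1 / k) < 1e-12  # mixed: purity 1/k < 1

# Decomposition (28), Monte Carlo version: averaging |psi><psi| over wave
# functions drawn uniformly from the unit sphere of H_PH recovers W_IPH.
acc = np.zeros((k, k), dtype=complex)
n = 20000
for _ in range(n):
    z = rng.normal(size=k) + 1j * rng.normal(size=k)
    psi = z / np.linalg.norm(z)
    acc += np.outer(psi, psi.conj())
assert np.allclose(acc / n, W_IPH, atol=0.05)
```

The Monte Carlo average converges to I_PH / dim H_PH by the unitary invariance of the uniform measure, which is exactly why (28) holds; the non-uniqueness of such decompositions is untouched by the check.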
But the expression can nonetheless be used fruitfully in statistical analysis (Chen 2020d, section 3.2.3).

By doing away with the need for an extra postulate about initial quantum states, we only need two basic postulates. I call the theory the Wentaculus:

The Wentaculus
1. Fundamental Dynamical Laws (FDL): the quantum state of the universe is represented by a density matrix Ŵ(t) that obeys the von Neumann equation (21).
2. The Initial Projection Hypothesis (IPH): at a temporal boundary of the universe, the density matrix is the normalized projection onto H_PH, a low-dimensional subspace of the total Hilbert space. (That is, the initial quantum state of the universe is Ŵ_IPH(t_0) as described in equation (27).)

Similar to the Quantum Mentaculus, the Wentaculus also suffers from the quantum measurement problem. There are three promising solutions, each of which gives rise to a distinct version of the Wentaculus.

First, there is the Everettian Wentaculus, which looks exactly like the basic Wentaculus. For this theory, we need to embrace the idea that there is a (vague) multiplicity of emergent worlds, similar to the original Everettian theory. What is interesting about the Everettian Wentaculus is that it suggests an astonishing possibility. If IPH is interpreted as a fundamental law, then the theory is strongly deterministic, in the sense of (Penrose 1989) that the laws pick out a unique micro-history of the fundamental ontology (represented by W(t)). This theory does not postulate any objective probability. This is another step towards the Everettian aspiration of constructing a theory without any fundamental contingency.

Second, the Bohmian Wentaculus postulates that, in addition to the universal density matrix W that evolves unitarily according to the von Neumann equation, there are actual particles that have precise locations in physical space, represented by R³.
The particle configuration Q = (Q_1, Q_2, ..., Q_N) ∈ R^{3N} follows the guidance equation (written for the i-th particle):

dQ_i/dt = (ħ/m_i) Im [∇_{q_i} W(q, q′, t) / W(q, q′, t)] (q = q′ = Q), (29)

(This version of the guidance equation was first proposed by Bell (1980), then recast as the dynamical equation for the fundamental density matrix in Dürr et al. (2005).) Moreover, the initial particle distribution is given by the density-matrix version of the quantum equilibrium distribution:

P(Q(t_0) ∈ dq) = W(q, q, t_0) dq. (30)

Third, the GRW Wentaculus postulates that the universal density matrix typically obeys the von Neumann equation, but the linear evolution is interrupted randomly by collapses (with rate Nλ, where N is the number of particles and λ is a new constant of nature of order 10⁻¹⁶ s⁻¹):

W_{T+} = Λ_k(X)^{1/2} W_{T−} Λ_k(X)^{1/2} / tr(W_{T−} Λ_k(X)), (31)

where W_{T−} is the pre-collapse density matrix, W_{T+} is the post-collapse density matrix, with k uniformly distributed in the N-element set of particle labels and X distributed by ρ(x) = tr(W_{T−} Λ_k(x)), with the collapse rate operator defined as before in (20).

In this section, we presented several versions of PH. They can all be traced back to the original Boltzmannian idea that the initial state of the universe is special and has low entropy. Differences arise when we move to the Wentaculus framework, where IPH selects a unique and simple initial microstate of the universe. The microstate is given by a mixed-state density matrix. The reason a mixed state can play the role of a microstate is that it enters directly into the fundamental dynamical equations, such as (21), (29), and (31). The different initial-condition postulates—(7), (16), and (26)—form a family, and I shall continue using the generic label "Past Hypothesis" to refer to them, using specific labels only when their differences are relevant.

It is clear that PH has a special status in the Boltzmannian account.
(A note on the GRW Wentaculus above: to my knowledge, the W-GRW equations first appear in (Allori et al.).)

It has been suggested that PH is like a law of nature. For example, this is emphasized by Feynman (2017)[1965], as quoted in the epigraph. Making a similar point about classical statistical mechanics, Goldstein et al. (2020) suggest that PH is an interesting kind of law:

The past hypothesis is the one crucial assumption we make in addition to the dynamical laws of classical mechanics. The past hypothesis may well have the status of a law of physics—not a dynamical law but a law selecting a set of admissible histories among the solutions of the dynamical laws.

In this section, I offer four types of positive arguments to support the view that PH is a candidate fundamental law of nature. These arguments also support the weaker thesis that PH is a candidate axiom in the fundamental theory. My methodology is naturalistic and functionalist. I argue for the Nomic Status of PH by locating the roles of the fundamental laws in our physical theories and by showing that PH plays such roles. These roles include backing scientific explanations, constraining nomological possibilities, and supporting objective probabilities.

Some of these arguments (§3.1–3.3) are related to ideas that have appeared in the literature. I try to make the premises explicit, with the hope that the arguments are clear enough for others who disagree to examine and criticize. Usually, the arguments are made in the Humean framework, but as I argue, they can also be made on behalf of non-Humeans who have a minimalist conception of what it is for laws to really govern. That is the account I favor. Of course, the minimalist account is at odds with the following idea about "dynamical governing":

Dynamical Governing: Only dynamical laws can be fundamental laws of nature.

In §3.4, I offer a new argument for the Nomic Status of PH based on considerations of the nature of quantum entanglement.
The (fundamental) nomic status of PH is supported by the nomic status of the Second Law of Thermodynamics. The Second Law is a law of nature; whatever underlies a law is a law. A law that cannot be derived from other laws is a fundamental law; therefore, PH is a fundamental law. Let us spell out the argument in more detail:

P1. The Second Law of Thermodynamics is a law of nature.
P2. A law of nature can be scientifically explained only by appealing to more fundamental laws of nature and laws of mathematics.
P3. The Second Law of Thermodynamics is scientifically explained (in part) by PH, and PH is not a law of mathematics.
C1. So, PH is a law of nature and is more fundamental than the Second Law.
P4. PH is not scientifically explained by fundamental laws.
P5. A law of nature that is not scientifically explained by fundamental laws is a fundamental law.
C2. So, PH is a fundamental law of nature.

Comments on P1. First, the Second Law of Thermodynamics summarizes an important regularity: the tendency for things to become more chaotic and more decayed as time passes. It is part of our concept of lawhood that this irreversible tendency is law-like. We learn about this law much more directly from our experiences than we learn about the microscopic equations of motion. Second, nature's irreversible tendency is encoded in our concept of physical necessity. For example, we learn that it is physically impossible for a metal rod to spontaneously heat up on one side and then cool down on the other; it is physically impossible to create a perpetual motion machine of the second kind (a machine that violates the Second Law), and this is impossible no matter who tries to do it, whenever and wherever. Hence, the Second Law is not an accidental feature of the world. Of course, the usual formulation of the Second Law in terms of the absolute monotonic increase of entropy is too strong.
It should be modified in two ways: it holds for the overwhelming majority of nomologically possible initial conditions, and for each entropic trajectory, there can be short-lived, shallow, and infrequent decreases of entropy (see the Second Laws for X and for Ψ). (This becomes more complicated if an "Albertian demon" turns out to be physically possible. See (Albert, 2000, §5) and Maudlin's contribution in this volume.) Recognizing the importance of the Second Law, Eddington (1928) suggests:
If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations—then so much the worse for Maxwell's equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

That may be too strong. Nevertheless, we should accept that the Second Law is nomologically necessary. It is not merely an accidental feature of the world, such as the contingent fact that all gold spheres are less than one mile in diameter. Third, counterfactuals are backed by laws of nature; laws are what we hold fixed when evaluating counterfactuals. The Second Law backs counterfactuals about macroscopic processes that display a temporal asymmetry: if there were a half-mixed ink drop in my water cup right now, it would have been more separated in the past and more evenly mixed in the future. (See §3.2 for more on the counterfactual arrow.)

Comments on P2. The notion of scientific explanation here is not a fully analyzed notion. What is relevant to this argument is that the Second Law is supposed to be derived as a theorem from the basic postulates of the Mentaculus or the Wentaculus. It is expected that, assuming the laws of mathematics, the fundamental dynamical laws, PH, and SP, an initial microstate starting from the initial macrostate will, with overwhelming probability, travel to macrostates of increasingly higher entropy until it reaches thermal equilibrium (except possibly for short-lived and infrequent decreases of entropy). So, it is the Second Law's mathematical derivation from the basic postulates of the physical theory that is the relevant notion here.

A standard response to P2 is that a non-fundamental law (such as those in the special sciences) can be explained (in part) by some contingent boundary conditions.
Examples may include the laws of genetics and the laws of economics. However, it is not clear what does the explanatory work in those cases. Let us suppose that some special-science law S arises from boundary conditions B. Suppose B obtains. Then there is a high objective probability that S obtains. What is the origin of these objective probabilities? What is the physical explanation? If everything is ultimately physical, and the physical theory is informationally complete insofar as the motion of objects goes, then it seems that the objective probabilities are ultimately backed by some postulates in physics. The probabilities in physics may supply conditional probabilities on which Pr(S|B) is high. What does the explaining, then, is the probability supplied by physics, and the real law should be the high probability of S obtaining given B, which is consistent with the physicalist picture we started with.

Winsberg (this volume) offers another potential counterexample to P2. He suggests that due to the near certainty of the existence of Boltzmann brains and large fluctuations in future epochs of the universe, it is important to postulate also a Near Past Hypothesis (see also (Chen 2020c)):

Near Past Hypothesis (NPH)
We are inside the first epoch of the universe, between the …

(The notion of scientific explanation used here is different from the notion of metaphysical explanation. See (Loewer 2012) and (Hicks & van Elswyk 2015); on their views, a fundamental law of nature is metaphysically explained, but not scientifically explained, by the matter distribution.)

However, consider the Second Law for X, the Second Law for Ψ, and the Second Law for W. In those versions, fluctuations are already taken into account (in a non-indexical way). Hence, we do not need to invoke NPH to derive those versions of the Second Law from PH, SP, and the dynamical laws.

Comments on P3. This premise is true if we grant the explanatory success of the Boltzmannian account, which is assumed in this paper. It is clear that PH is not a law of mathematics.

Comments on P4. P4 is an open scientific question. Perhaps some future theory (e.g. along the lines of (Carroll & Chen 2004)) can explain PH using some simple and satisfactory dynamical laws. Still, it is also a scientific possibility that PH remains a fundamental law in the final theory and is not explained further. Given the openness of P4, we should accept C2 only to the degree of acknowledging PH as a candidate fundamental law.

Comments on P5. This follows from our concept of a fundamental law of nature.

The argument above supports the (fundamental) nomic status of PH. If Nomic Status implies Axiomatic Status, then the argument also supports the idea that PH is an axiom in the fundamental physical theory. But there is another, more straightforward argument for the Axiomatic Status. The predictive consequences of a physical theory should come entirely from its axioms and their deductive consequences. A good physical theory aims at capturing as many regularities as possible using simple axioms. The Second Law describes an important regularity. Therefore, we postulate PH and SP in addition to the fundamental dynamical laws.
These postulates have an axiomatic status in the Mentaculus.

The Mentaculus is a good theory; it is better than "Mentaculus-," the Mentaculus minus PH and SP. The Mentaculus predicts not only the motion of planets, but also the overwhelming probability that my table will not spontaneously rearrange itself into the shape of a statue. The Mentaculus- can tell us everything about the motion of planets but is silent about many macroscopic regularities we see around us. Even so, the Mentaculus is a pretty simple theory. Someone might suggest that to achieve maximal predictive power, we can add a statement about the exact microstate of the universe at t_0 as an additional axiom to the Mentaculus-. But the exact microstate is a detailed fact that complicates the Mentaculus- such that its axioms will no longer be simple enough. (In the Wentaculus, IPH pins down a quantum microstate, but its informational content and simplicity level are the same as those of PH in the Mentaculus.)

3.2 Arguments from Other Asymmetries

The thermodynamic arrow of time described by the Second Law is best explained by PH. That provides strong support for the Nomic Status and the Axiomatic Status of PH. What about other arrows of time? In this subsection, I present arguments based on the counterfactual arrow, the records arrow, the epistemic arrow, and the intervention arrow. The upshot is that they can also be traced back to the nomic status of PH, without which they would be left completely mysterious. Many of these ideas can be found in (Albert 2000, 2012) and (Loewer 2007), and they are also discussed in (Frisch 2005, 2007), (Demarest 2019), (Fernandes, this volume), (Callender 2004), (Horwich 1987), and (Reichenbach 1956).
The records arrow.
We have photographs and videos of WWII but no photographs of the next major world war. We have detailed accounts of the life of President Washington but no detailed accounts of the life of the 65th president of the United States. There are craters on the moon indicating past meteorite impacts but no craters indicating future meteorite impacts. Similarly, there are fossils, rocks, and ice sheets, all of which tell us the state of our planet in the past, but we do not have similarly abundant records that tell us the state of our planet in the future.

What is it about our world such that there are abundant records of the past but few, if any, records of the future? One could appeal to some A-theory of time, according to which the future does not yet exist and the past has already happened. So there cannot be records about the future because there are no facts about the future. But this does not seem to provide a satisfactory scientific explanation, as the temporally asymmetric probabilistic correlations have to be accepted as brute facts. In any case, in a block-universe picture compatible with a B-theory of time, the past, present, and future are all equally real; all events exist tenselessly. There are strong probabilistic correlations between physical records (e.g. fossils) that exist at a particular time and physical systems (e.g. dinosaurs) that exist at an earlier time, but no strong correlations between physical records and events at a later time.

PH offers an explanation. Albert (2000) suggests that a record is a relation between two temporal ends of a physical process. A record enables us to infer what happens inside the temporal interval. For example, in a lab, the record of an electron passing through a small slit is the relation between the "ready" state of the measuring instrument at t_1 and the "click" state of the measuring instrument at t_2. If the instrument moves from "ready" to "click," then we can infer that an electron has passed through the slit between t_1 and t_2.
But if the instrument was not at "ready" at t_1, we cannot infer that. However, to know that the instrument was indeed "ready," we also need to rely on an earlier record. This seems to go back in time ad infinitum, to records about the lab, to records about the larger environment, and eventually to records about earlier states of the universe. To know that the cosmic microwave background (CMB) data is reliable, we also need to postulate that there is some "ready" state at the beginning of the universe. PH, stipulated at (or around) t_0, is the "mother of all ready states." (The notions of "earlier" and "later" here can be fleshed out in a way that does not refer to an intrinsic arrow of time. What matters here is not that there are facts about earlier times or later times but simply that the probabilistic correlations are temporally asymmetric.)

However, it is not enough that PH be true. We also need to justify the important fact that physical records are reliable. For this, we need PH to have the status of a law. (If PH is not derived from other laws, it will have the status of a fundamental law.) If a theory predicts that it is unlikely for physical systems that look like records to reliably indicate past events, then the theory would undermine the rational justification for believing in it. Such a theory would be epistemically self-undermining, because we believe in physical theories based on records of past experiments and observations. The Mentaculus without PH is such a theory. It would predict that most "records" come about from random fluctuations. If we dig out a shoe of Napoleon's, most likely it came about from random fluctuations and not from a low-entropy past state. Postulating PH as a law avoids that. The uniform probability distribution conditionalized on PH will predict that most physical systems that look like "records" are reliable records of the past (here we set aside the problems of large future fluctuations).
The epistemic arrow.
Given some information about the present, there is some sense in which our knowledge of the past is more vast, detailed, and easily gained than our knowledge of the future. We know that the sun will (likely) rise tomorrow, but we do not know who will win the US presidential election of 2028 or when exactly the stock market will crash over the next 20 years. But we know exactly who won the election of 1860, when exactly the stock market crashed in the last 20 years, and so on. Similar to the records arrow, the epistemic arrow is especially puzzling in a block-universe picture compatible with a B-theory of time. All facts about the past, present, and future are out there. Why is our knowledge so skewed towards one temporal direction?

Albert (2000) explains the epistemic arrow in terms of the records arrow, which in turn is explained by the Nomic Status of PH. The basic idea is this. We distinguish inferences based on records from inferences based on predictions or retrodictions. The latter use only the current macrostate together with the dynamical laws plus SP (construed as an unconditionalized uniform probability distribution on the energy shell of phase space), applied to the past (retrodictions) and to the future (predictions). Inferences based on predictions will tell us that, with overwhelming probability, the sun will rise tomorrow and the ice cubes in my coffee will melt in the next hour, but inferences based on retrodictions will get most things wrong about the past. For example, retrodictions will tell us that the ice cubes in my coffee were actually smaller in the past (they spontaneously got larger in my coffee), and that all the books about someone named Lincoln winning the 1860 election came about from random collisions of particles. However, inferences based on records are much more powerful and demand much less detailed information about the current macrostate of the world. We can infer to past states reliably by assuming that records are reliable.
Such an inference is backed by the assumption that the recording device was "ready" at a time before the event, and there was another recording device measuring the "ready" state of that one, and so on. (More details are needed to fully explain the records asymmetry; Rovelli (2020) provides an interesting analysis that adds additional constraints on the initial condition. This is somewhat parallel to the situation of empirical adequacy and records in Everettian quantum mechanics; see (Barrett 1996) on the latter.)
The counterfactual arrow.
I am at home right now. If I had been in my office, the future would be somewhat different but the past would have been pretty much the same. Trump did not press the nuclear button on Independence Day this year. If he had, events on Labor Day would be dramatically different but events on Memorial Day would have been pretty much the same. Why is there a temporal asymmetry of counterfactual dependence? The semantics of counterfactuals is a controversial issue. It is not clear whether there is a unified theory that explains every instance of counterfactuals in ordinary language. But if we focus on the counterfactuals that are important for control, decision, and action, it is often accepted that such counterfactuals are backed by laws. This is made explicit in Lewis (1979)'s metric for comparing similarity relations among worlds, but it should also be compatible with a strict-conditional approach. If the laws are temporally asymmetric, and if the laws entail that changes in the present macrostate would lead to vast differences in the future but not much in the past, then the counterfactual arrow has an explanation.

However, given equations (2) in classical mechanics or equations (9, 17) in unitary quantum mechanics, changes to the current state (such as the location of Trump's index finger and the location of the nuclear button) will lead to macroscopic differences in both directions of time. It is only by assuming PH as a fundamental law that we can explain the following: most physically possible trajectories compatible with the current macrostate are such that if Trump had pressed the button, most future trajectories would be macroscopically different from the actual ones but the past trajectories would be pretty much the same. This is also due to the records arrow. Assuming PH, there will be an abundance of records about the past.
Insofar as PH makes it very likely that those records are reliable, PH constrains the past histories of the trajectories even if certain macroscopic features are changed in the present state. However, few, if any, records exist about the future, so there is no such constraint on future macrostates. The correct counterfactual semantics will no doubt involve context sensitivity and other parameters. Still, it is hard to deny that laws of nature play an important role in determining the truth values of counterfactuals.

The intervention arrow.
We can exert influence on future events, but we cannot act to bring about changes to the past. The intervention arrow is intimately connected to the counterfactual arrow, and it is not clear which is conceptually prior. Some contemporary analyses of influence and intervention are couched in the causal modeling framework. In that framework, we have directed acyclic graphs, with variables representing events and arrows representing the direction of effect. But if the fundamental dynamical laws are time-symmetric, what is the scientific explanation for these arrows? Often, the arrows are taken to be primitives in the causal model, left unexplained. However, if PH explains the counterfactual arrow, then it can also explain the arrow of intervention. We can flesh out the language of intervention in terms of intervention counterfactuals, and the arrow of intervention counterfactuals can be explained in a similar way by the records arrow and PH. Loewer (2007) provides such an account.

The arguments from the entropic arrow (the Second Law) and the other arrows can be taken together as an inference to the best explanation. PH (and SP) ground these asymmetries of time. Moreover, to explain them satisfactorily, we need to postulate PH as a fundamental law, and we need SP to provide objective probabilities. An opponent may take all of these arrows to be fundamental features of the world and postulate them as primitives in the theory. But that would strike many as a fragmented and unsatisfactory view. Postulating PH and SP in addition to the dynamical laws is a much simpler and more unified way to think about the various arrows of time: from a set of simple axioms, we can derive all of the temporally asymmetric regularities—we get a big bang for the buck.

Humeanism provides a natural home for PH to be a fundamental law and for SP to specify objective probabilities.
According to Lewis (1983), the fundamental laws and the postulates of objective probabilities are the axioms of the best system: the system that is true of the mosaic and optimally balances various theoretical virtues such as simplicity, informativeness, and fit. On this account, dynamical equations such as (2), (9), and (17) could be axioms of their respective best systems, and the GRW chances could be the objective probabilities in a GRW world. Can the classical Mentaculus, the Quantum Mentaculus, and the Wentaculus count as axiomatizations of the best system? This depends on whether PH can count as a fundamental Lewisian law and whether SP can count as objective probabilities on the best-system account. Anticipating the need to add a boundary condition to the best system, Lewis (1983) writes,

A law is any regularity that earns inclusion in the ideal system. (Or, in case of ties, in every ideal system.) The ideal system need not consist entirely of regularities; particular facts may gain entry if they contribute enough to collective simplicity and strength. (For instance, certain particular facts about the Big Bang might be strong candidates.) But only the regularities of the system are to count as laws. (p. 367)

But if a statement such as PH is axiomatic in the best system, why not count it as a law? In the same paper (p. 368), Lewis distinguishes between fundamental laws and derived laws. He suggests that fundamental laws are those statements that the ideal system takes as axiomatic and that invoke only perfectly natural properties. But PH certainly is axiomatic in the Mentaculus and the Wentaculus. Moreover, PH can be stated in the fundamental language of the respective theory. (There will be some residual vagueness, which we discuss in §4.3.) So it seems that Lewis should be open to the idea that PH is a fundamental law according to the best-system account.
Hence, if we are committed to the Humean conception that laws supervene on the mosaic in the way specified by the best-system account, we are led to accept the nomic status of PH. (For a related point, see Callender 2004.) On this view, facts about the mosaic metaphysically determine what the laws are. They are constitutive of laws. Laws are just certain ideal summaries of facts in the world. Laws are nothing over and above the mosaic.

On non-Humean theories, however, laws do not supervene on the mosaic. Laws may be as fundamental as the mosaic itself. Following Hildebrand (2013), we can distinguish between two types of non-Humean theories:

1. Primitivism: fundamental laws are primitive facts in the world.
2. Reductionism: fundamental laws are analyzed in terms of something outside the mosaic.

Carroll (1994) and Maudlin (2007) maintain primitivist versions of non-Humeanism. Hildebrand (2013) provides a survey of the reductionist versions, according to which laws are further explained by relations among universals, dispositions, essences, or some other more fundamental entities. It is not clear what the further analysis buys us. It is not clear to me how to reformulate various modern physical laws and objective probabilities in terms of those entities, and it is even less clear what advantages there are to reducing laws to something further. Here I agree with Maudlin that the concept of laws seems more familiar to us than the concepts employed in the further analysis (such as dispositions, universals, and the like). Maudlin's version of primitivism is influential in contemporary discussions of the metaphysics of laws in philosophy of physics. However, Maudlin (2007) favors a more restrictive version of primitivism that I call:
Dynamical Law Primitivism: Fundamental laws are primitive facts in the world, and only dynamical laws can be fundamental laws.

This view is connected to Maudlin's view about the intrinsic and primitive arrow of time. In contrast, the spirit of the present project is to analyze time's arrow in terms of something else. In fact, the extra commitment about the primitive arrow of time can be disentangled from the basic idea about how laws govern.

The basic non-Humean idea is simply that laws really govern. They metaphysically explain why the nomological possibilities are the way they are and why things are as constrained as they are. The metaphysical explanation can take the form of constraints: given S(t), some complete specification of the state of the universe at some time, there is a constraint on what the history of the world is like. If the theory is deterministic, then there is only one microscopic history compatible with S(t). A fundamental dynamical law is a kind of conditional constraint. Constraints can take other forms, such as selecting a space of possible histories. This is the form of certain equations in general relativity and Maxwellian electrodynamics. (Hildebrand (2013) suggests that reductionist theories have an advantage over primitivist theories for answering the problem of induction. I disagree, but I set that aside for future work.) It is also true of PH, as it selects
a set of admissible histories among the solutions of the dynamical laws. There is conceptual space for a minimalist conception of primitivism that places no restriction on the form of fundamental laws; in particular, not all of them have to be dynamical laws. The basic view is this:
Minimal Primitivism: Fundamental laws are primitive facts in the world; there is no restriction on the form of fundamental laws. In particular, boundary conditions can be fundamental laws.

Even though fundamental laws can take on any form, we expect them to be relatively simple and informative. These theoretical virtues are no longer constitutive of what laws are, but they can serve as our best guides to find the primitive laws:

Epistemic Guides:
Even though theoretical virtues such as simplicity and informativeness are not constitutive of fundamental laws, they are good epistemic guides for discovering the fundamental laws.

Chen & Goldstein (2020) develop this idea in more detail. It seems to me that Minimal Primitivism is a good version of non-Humeanism, and it may well be one that best fits our scientific practice and the actual conception of laws. The minimal primitivist view does not commit to an intrinsic and irreducible arrow of time, making it compatible with the Boltzmannian project of analyzing time's arrow in terms of the entropy gradient. PH, as we have discussed already, is virtuous in the right ways. It provides a simple explanation for the restrictions of physical possibilities and the overwhelming probability of irreversibility. According to the Epistemic Guides on the Minimal Primitivist conception, we have found strong evidence that PH is a candidate fundamental law.

Hence, both Humeanism and non-Humeanism (in the minimalist form) support the idea that PH is a candidate fundamental law.
(A note on the Epistemic Guides: why is the expectation that the laws are simple and informative rational? I do not know of any non-circular justification. We can appeal to the success of physics and the discovery of simple and informative laws in the past. We may also appeal to a deeper "meta-law" that metaphysically constrains the physical laws.)

I suggest that there is a new reason to take PH as a fundamental law: it can help us solve a long-standing puzzle in the foundations of quantum mechanics. One of the chief innovations of quantum theory that has no classical analog is quantum entanglement. It is also the origin of the quantum measurement problem. Even if we solve the measurement problem using one of the three strategies discussed in §2 (along the lines of Everett, Bohm, and GRW), we are still left with the quantum state, which plays an important dynamical role in the respective theories. Hence, the puzzle of quantum entanglement can be traced to the nature of the quantum state. What does the quantum state represent physically? What is it in the world? Given the role of Ψ in formulating well-posed initial-value problems in EQM and BM, and its role in dynamical collapses in GRW, it is reasonable to think that Ψ represents something objective. Here are some options for a realist interpretation (see (Chen 2019b) for more detail):
1. High-dimensional field: on this view, the fundamental space is isomorphic to the 3N-dimensional configuration space; the wave function represents a physical field on that space (Albert 1996). Even its defenders acknowledge that this view has highly counter-intuitive consequences. It is also an open question whether it really succeeds in recovering the manifest image of lower-dimensional objects.

2. Low-dimensional multi-field: on this view, the fundamental space is the physical space(time); the wave function represents a "multi-field" that assigns physical quantities to every spatial region composed of N points (Forrest 1988, Belot 2012, Chen 2017, Hubert & Romano 2018). However, this view also appears to have undesirable consequences. In the multi-field interpretation of Everettianism, since entanglement relations are still in the 4-dimensional mosaic, its Lorentz-invariance comes at a surprising cost: the failure of what Albert (2015) calls narratability. This also arises in Wallace & Timpson (2010)'s spacetime state realist interpretation of Everett. In the Bohmian framework, the multi-field guides particles, but there is no influence (back-reaction) of the particles on the multi-field, even though both the particles and the multi-field are fundamental material entities.

3. Nomological interpretation: on this view, the fundamental space is the physical space(time) and the fundamental ontology consists in particles, fields, or flashes on that space; the wave function represents a physical law (Dürr et al. 1996). However, this interpretation faces problems of a different kind. First, Ψ_t is time-dependent. Can nomological entities change in time? I do not see why not. Moreover, as defenders of this view have long recognized, if the universal quantum state obeys the Wheeler-DeWitt equation ĤΨ = 0, Ψ will be time-independent and not changing. Second, the universal wave function is a very detailed function and may be too complicated to be a law. This problem seems much more serious.
Call this the problem of complexity.

Taking PH to be a fundamental law provides a solution to the problem of complexity in the nomological interpretation of the quantum state. This solution works in the Wentaculus framework, where IPH is given a fundamental nomic status. For IPH, the normalized projection onto H_PH contains no more and no less information than H_PH, the subspace specified by PH in the Mentaculus. If H_PH is simple and informative enough to be nomological, then so is its normalized projection, which is W_IPH(q, q′, t). That is, we can afford the same status of a fundamental law to W_IPH(q, q′, t). In the Everettian Wentaculus, we can interpret W_IPH(q, q′, t) as a law that determines the "local beables," such as a matter-density field in physical space. In the Bohmian Wentaculus, we can interpret W_IPH(q, q′, t) as similar to the Hamiltonian function H(p, q): providing a velocity field for particle trajectories. In the GRW Wentaculus, we can interpret W_IPH(q, q′, t) as providing conditional probabilities for the configurations of "local beables," such as a matter-density field or flashes in spacetime.

However, to solve the complexity problem, it is not sufficient for IPH to be a contingent initial condition; it is crucial that IPH has the status of a fundamental law. If it were nomologically possible for W to differ from the state specified by IPH, then the initial quantum state could well be (i.e., it would be physically possible for it to be) too complicated to be regarded as nomological. Hence, only by assuming the nomic status of IPH do we obtain a solution to the problem of complexity, thereby arriving at a satisfactory way to understand the nature of the quantum state. In contrast to the first two interpretations, according to which quantum entanglement relations are in the material ontology, the nomological interpretation of the quantum state locates the origin of entanglement in the laws.
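The normalized-projection construction used here can be illustrated with a toy numerical sketch (not from the paper; the 4-dimensional Hilbert space and the 2-dimensional subspace standing in for H_PH are hypothetical):

```python
import numpy as np

# Toy stand-in for the PH-subspace: a 2-dimensional subspace of a
# 4-dimensional Hilbert space, spanned by the first two basis vectors.
dim, d_PH = 4, 2
P = np.zeros((dim, dim))
P[:d_PH, :d_PH] = np.eye(d_PH)   # projector onto the subspace

# IPH-style initial state: the normalized projection onto the subspace.
W_IPH = P / np.trace(P)

# W_IPH is a valid density matrix (positive, unit trace), and it is fixed
# uniquely by the subspace: it encodes exactly the information in the
# subspace, no more and no less.
assert np.isclose(np.trace(W_IPH), 1.0)
print(np.diag(W_IPH))   # uniform weight 1/2 on the subspace, 0 outside
```

The point of the sketch is that once the subspace is fixed, the normalized projection carries no extra information beyond it, which is why its nomological credentials stand or fall with those of H_PH itself.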
And by taking a nomological interpretation of the quantum state in the Wentaculus framework, we see a unified solution to two problems in the foundations of physics: the problem of irreversibility and the nature of the quantum state. Again, this is compatible with both Humeanism and (the minimal form of) non-Humeanism about laws. (For more detail, see (Chen 2018, 2020a).)

In the previous section, I provided positive arguments for the fundamental, nomic status of PH. In this section, I discuss three apparent conflicts between these arguments and ordinary conceptions of laws of nature. These apparent conflicts may explain some people's hesitation in accepting the fundamental, nomic status of PH. However, as I argue, these conflicts are merely apparent if we adopt the Mentaculus framework, and at any rate, they become even less worrisome in the Wentaculus framework.
It is often said that PH is merely a boundary condition. The contrast between boundary conditions and fundamental laws can be seen in the differences between the paradigm cases of dynamical laws, such as equations (2), (9), (17), (21), and (29), and boundary conditions that select subclasses of the dynamically possible trajectories. Boundary conditions do not directly play a dynamical role.

It is not clear how to make this worry precise. First, it is unclear why every law has to be dynamical. Second, by restricting the possible initial conditions, a boundary-condition law can be an important ingredient in the theory, as in the case of the Mentaculus. In fact, in the Wentaculus framework, the boundary condition IPH plays a direct dynamical role, akin to the dynamical role of the Hamiltonian function in classical mechanics. For example, in the Bohmian version, IPH (27), the von Neumann equation (21), and the guidance equation (29) can be combined into one equation:

$$\frac{dQ_i}{dt} = \frac{\hbar}{m_i} \, \mathrm{Im} \, \frac{\nabla_{q_i} W_{IPH}(q, q', t)}{W_{IPH}(q, q', t)} \bigg|_{q = q' = Q} = \frac{\hbar}{m_i} \, \mathrm{Im} \, \frac{\nabla_{q_i} \langle q | \, e^{-i\hat{H}t/\hbar} \, \hat{W}_{IPH}(t_0) \, e^{i\hat{H}t/\hbar} \, | q' \rangle}{\langle q | \, e^{-i\hat{H}t/\hbar} \, \hat{W}_{IPH}(t_0) \, e^{i\hat{H}t/\hbar} \, | q' \rangle} \bigg|_{q = q' = Q} \tag{32}$$

Hence, in the Bohmian version, IPH does not just select a subclass of velocity fields; it pins down a unique velocity field. In the Everettian version, IPH does not just select a subclass of possible multiverses; it pins down a unique one. In the GRW version, IPH is directly involved in setting the chances of collapses.

There is a related worry about admitting boundary-condition laws. Typically we distinguish between dynamical laws and initial conditions. If initial conditions can be laws, how can we distinguish between laws and contingent data? It seems that the distinction would collapse. However, that is not the case. Some boundary conditions, such as PH, have the elite status of fundamental laws, but it does not follow that all boundary conditions are similarly elite.
This is because not all boundary conditions exemplify the optimal balance of the required theoretical virtues (either as constitutive of what laws are or as epistemic guides to primitive laws), including simplicity, informativeness, and fit.

One may worry that it is odd to have a fundamental law that refers to a particular time (t₀). Our most familiar fundamental laws are general statements that do not refer to a particular time or place. But why is it a requirement that laws cannot refer to particular events? Suppose some places or times are in fact physically distinguished; then it seems appropriate for laws to refer to them. Think about the Aristotelian tendency for things to move towards the center of the (geocentric) universe. If that is indeed the case, then we should have a fundamental law describing that motion, and a simple candidate would just be to state that the center of the world, C, is the place towards which things evolve. Similarly, if the initial time, t₀, is indeed special (as the initial state accounts for a great many regularities), then it is appropriate to postulate a law that refers to t₀.

Another worry concerns the nature of objective probabilities. The Mentaculus postulates both PH and SP. PH and SP share the explanatory burden, so they should have the same status. It is important that the probabilities of SP be objective. However, if the dynamics are deterministic (as in the Bohmian and the Everettian versions), objective probabilities seem to be either 0 or 1. How can non-trivial probabilities be objective and represent something beyond subjective credences? (Insofar as typicality plays a similar explanatory role as probability, TP may face the same prima facie problem as SP.)

Humeanism has the resources to solve this problem. Loewer (2001) suggests that deterministic "chances" can gain entry into the Lewisian best system by the informativeness they bring and the simplicity of the postulate, such as the uniform measure specified by SP.
On non-Humeanism, how to understand deterministic "chances" is an open problem. On Minimal Primitivism, perhaps the notion of primitive constraining (of the initial state) can come in degrees that can be represented by probabilities. Another way to allow deterministic "chances" is to use the notion of typicality specified by TP, which can be interpreted as only allowing the initial conditions that are typical. On this understanding, abnormal, anti-entropic initial conditions are not nomologically possible. (A similar problem arises in Bohmian mechanics, which requires a quantum equilibrium distribution, in addition to the deterministic dynamical laws, to deduce the usual Born rule for subsystems.)

In the Wentaculus framework, it is clear that there is an objective anchor for SP (and TP). Since IPH selects a unique initial quantum state of the universe, we no longer need a probability distribution over initial quantum states. However, as a purely mathematical fact, W_IPH(t) induces a "uniform" probability distribution over pure states. This is not a fundamental postulate of probability in the theory, unlike SP or TP in the Mentaculus. It arises as a mathematical consequence of the objective quantum microstate of the universe. This emergent probability distribution, though not fundamental, can play the same role in statistical analysis of typical behaviors. For example, the usual conjecture about typical pure states approaching equilibrium can be translated into the following: most parts of the density matrix will be very close to the equilibrium subspace, which is equivalent to the claim that the density matrix will approach equilibrium. Hence, the Wentaculus framework provides an objective anchor for SP and TP.
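The mathematical fact behind this emergent distribution can be checked in a small simulation (an illustrative sketch, not from the paper; the dimensions and the observable are hypothetical): averaging an observable over uniformly (Haar-)distributed pure states within a subspace reproduces the prediction of the normalized projection onto that subspace, Tr(WA).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a 2-dimensional subspace of a 4-dimensional Hilbert space,
# with W the normalized projector onto it.
dim, d = 4, 2
P = np.zeros((dim, dim))
P[:d, :d] = np.eye(d)
W = P / d

A = np.diag([1.0, 2.0, 3.0, 4.0])   # a hypothetical observable

def random_pure_state():
    """A uniformly (Haar-)distributed pure state within the subspace."""
    z = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    psi = np.zeros(dim, dtype=complex)
    psi[:d] = z / np.linalg.norm(z)
    return psi

# Average of <psi|A|psi> over uniformly random pure states in the subspace:
mean = np.mean([np.real(np.conj(p) @ A @ p)
                for p in (random_pure_state() for _ in range(20000))])

print(mean)                   # approaches Tr(W A) = 1.5
print(np.trace(W @ A).real)   # 1.5
```

This is the sense in which the density matrix "induces" the uniform distribution: no separate probability postulate is needed to recover the statistical predictions one would get from that distribution.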
The final worry about the fundamental nomic status of PH has to do with the fact that PH is vague, and an exact version of PH would be arbitrary in an unprecedented way. In Figure 1 of §2.1, I made clear that the boundaries of macrostates are fuzzy, and the macrostates only form an exact partition if we stipulate some arbitrary choices of the parameters for coarse-graining. These are the C-parameters: the size of coarse-graining cells, the exact correspondence between macroscopic quantities and functions on phase space, and (in the quantum case) the cut-off value for macrostate inclusion. There are better or worse ways to choose the C-parameters. But it is implausible that there are some exact values of the C-parameters known to Nature. The vagueness comes up in the Classical Mentaculus and the Quantum Mentaculus in how PH selects an initial macrostate that constrains the initial microstate. In the quantum case, PH only selects an exact subspace in Hilbert space when we choose some arbitrary C-parameters.

In the Quantum Mentaculus, can we stipulate that there is an exact subspace H_PH known to Nature? This leads to what I call untraceable arbitrariness. There is an infinity of admissible changes to the boundary of H_PH that do not change the nomological status of most microstates compatible with H_PH. This is unlike the kind of arbitrariness of natural constants or other fundamental laws. For example, any change to the value of the gravitational constant in Newtonian theory will make most worlds (compatible with the original Newtonian theory) impossible. What about the case of stochastic theories? Are the dynamical chances traceable? Yes they are, though not by changing the status of worlds from physically possible to physically impossible: the traceable changes are reflected in the probabilistic likelihood of most worlds.

I discuss this in more detail in (Chen 2020b). Here I provide a simple illustration by considering mechanisms for flipping a coin (see Figure 2(a)).
(A note on the preceding discussion of deterministic chances: this solution does not answer the related question about the nature of the quantum equilibrium distribution in Bohmian theories. But that is a different issue from the status of SP and TP.)

(A note on admissibility: admissibility here is vague, and rightly so. It can be interpreted as some measure of the simplicity of theories. We want PH to be simple enough, and different ways of carving out the boundary will lead to different exact versions of PH. But we only want to consider those versions that are sufficiently simple and not too gerrymandered.)

Suppose we have a stochastic coin and it is flipped three times. It landed Heads, Tails, and Heads. In the color version of the diagram, the possible sequences are marked in black and the actual sequence is marked in red. The simplest chance hypotheses are of the form:

• H_α: the chance of landing Heads at each flip is the same, and it is α.

H_{0.5} is the hypothesis that the coin is fair, H_1 is the hypothesis that the coin always lands Heads, and so on. Among these hypotheses, a sequence of coin flips will select exactly one chance hypothesis from this class based on which one receives the highest likelihood value. In this case, it is H_{2/3}, with the highest likelihood value being 4/
27. Conversely, H_{2/3} will assign a determinate chance to every sequence of coin flips. If Nature stochastically acts according to H_{2/3}, then changing the chance of Heads even slightly, say to 0.6, will change the likelihood of every sequence. A gerrymandered, time-dependent chance hypothesis could assign the same likelihood as H_{2/3} to every sequence, but it is far less simple. The gerrymandered chance hypothesis is not a serious competitor to H_{2/3}. The actual hypothesis is by far simpler and more fit than any competitor.

Traceability is lost in the case of the deterministic coin. In this case, there are no dynamical transition chances. The objective probabilities come from probabilities over initial conditions. Suppose a deterministic coin is flipped and it lands Heads, Tails, and Heads. In the color version of Figure 2(b), the red dots represent the initial conditions of the coin (which also include details about the flipping mechanism) that deterministically lead to the sequence HTH, and the black dots represent initial conditions that lead to other outcomes (such as HHH, TTT, and so on). To simplify things, suppose the sample space is finite, so there are only finitely many initial conditions to consider. Then we can draw different probabilistic hypotheses as different "circles" over initial conditions. Suppose the Black Circle encloses 4 red dots and 23 black dots. Then it represents a probabilistic hypothesis that all and only the dots within the Black Circle are possible initial conditions and each dot has equal probability. The Black Circle has the highest likelihood given the data of HTH. If the Green Circle encloses 4 red dots and 25 black dots, then the Green Circle is less likely than the Black Circle given HTH. However, it is easy to have a nearby circle, say the Blue Circle, that (like the Black Circle) has 4 red dots and 23 black dots.
This is possible if the red dots are sufficiently localized in state space, such that it is easy to preserve their proportion to the black dots while changing the boundary of the circle.

Moreover, specifying the Blue Circle need not be more complicated than specifying the Black Circle. They are the same kind of probabilistic hypothesis, and there is no reason to think that one is more gerrymandered than the other, unlike the situation with the stochastic coin, where to recover the same likelihood one has to resort to time-dependent chances. This reasoning can be generalized as the state space gets richer and the sequence of coin flips gets longer. There may be infinitely many ways to slightly change the boundary of the Black Circle and keep the relative proportions constant. This means that there will be a large class of probabilistic hypotheses that have the same likelihood given a particular history of coin flips. No particular hypothesis is far simpler and more fit than all competitors. Hence, super-empirical virtues such as simplicity will become more relevant. Furthermore, since the variation of the boundary is incremental, comparing simplicity can generate a sorites series: is there a determinate class of hypotheses that are simple enough? It is implausible for there to be a sharp line. Hence, we have a vague "collection" of hypotheses that pass the simplicity bar.

The contrast between the stochastic coin and the deterministic coin is analogous to the comparison between GRW theories and a vague law such as PH. GRW chances are traceable, but the boundary of the PH macrostate and the exact probability distribution are untraceable. Hence, there are reasons to think that if PH is a fundamental law in the Mentaculus, then it is a vague law. I call this phenomenon nomic vagueness.

However, it is not clear why vagueness disqualifies a statement from being a fundamental law.
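The contrast in the coin example can be recapped in a toy calculation (an illustrative sketch with hypothetical helper names, not from the paper): the stochastic coin's chance parameter is traceable in the likelihood of the data, while differently drawn "circles" with the same composition assign identical likelihoods.

```python
from math import prod

# Stochastic coin: likelihood of the observed sequence HTH under the
# hypothesis H_alpha that each flip lands Heads with chance alpha.
def likelihood(alpha, seq="HTH"):
    return prod(alpha if c == "H" else 1 - alpha for c in seq)

# H_{2/3} maximizes the likelihood, at (2/3)^2 * (1/3) = 4/27.
# Any slight change to alpha changes the likelihood: the chance is traceable.
print(likelihood(2 / 3))   # 0.1481... = 4/27
print(likelihood(0.6))     # ~0.144

# Deterministic coin: a "circle" of equiprobable initial conditions.
# The likelihood of HTH is just the fraction of red dots in the circle.
def circle_likelihood(red, black):
    return red / (red + black)

black_circle = (4, 23)   # 4 red dots, 23 black dots
blue_circle = (4, 23)    # a differently drawn boundary, same composition
green_circle = (4, 25)

print(circle_likelihood(*black_circle) == circle_likelihood(*blue_circle))  # True
print(circle_likelihood(*green_circle) < circle_likelihood(*black_circle))  # True
```

The equal likelihoods of the Black and Blue Circles illustrate untraceability: no amount of data about the coin flips can distinguish the two boundaries, so only super-empirical virtues could.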
After all, we should be led by empirical evidence and scientific practice in considering what the laws are, and our metaphysical commitments to precision and exactness should not be given absolute priority. It is a surprising consequence that a fundamental law can be vague. This gives us reason to think that perhaps the final theory of the world will not be completely mathematically expressible, insofar as vagueness and higher-order vagueness defy classical logic and set-theoretic mathematics. This is a radical consequence of nomic vagueness that deserves more attention (Chen 2020e).

Nevertheless, the situation is different in the Wentaculus. It gets rid of nomic vagueness without introducing untraceable arbitrariness. According to IPH, both the initial macrostate and the initial microstate are represented by the quantum state of the universe, W_IPH(t). It enters directly into the fundamental micro-dynamics. Hence, W_IPH(t) will be traceable from the perspective of two realist interpretations of the quantum state (Chen 2019b):

1. W_IPH(t) is ontological: if the initial density matrix represents something in the fundamental material ontology, IPH is obviously traceable. Any change to the physical values of W_IPH(t) will leave a trace in every world compatible with IPH.

2. W_IPH(t) is nomological: if the initial density matrix is on a par with the fundamental laws, then W_IPH(t) plays the same role as the classical Hamiltonian function or a fundamental dynamical constant of nature. It is traceable in the Everettian version with a matter-density ontology, as the initial matter density is obtained from W_IPH(t). It is similarly traceable in the GRW version with a matter-density ontology. For the GRW version with a flash ontology, different choices of W_IPH(t) will, in general, lead to different probabilities of possible macro-histories.
In the Bohmian version, different choices of W_IPH(t) will lead to different velocity fields, such that typical initial particle configurations (and hence typical worlds compatible with the theory) will take on different trajectories.

The traceability of W_IPH(t) is due to the fact that we have connected the low-entropy macrostate (now represented by W_IPH(t)) to the micro-dynamics (where W_IPH(t) appears). Hence, W_IPH(t) plays a dual role at t₀ (and only at that time): it is both the microstate and the macrostate. In contrast, the untraceability of Γ in classical mechanics is due to the fact that the classical equations of motion directly involve only the microstate X, not Γ. Similarly, H_PH in the standard wave-function formulation is untraceable because the Schrödinger equation directly involves only the wave function, not H_PH. Many changes could be made to Γ and H_PH that would not trickle down whatsoever in typical worlds compatible with these postulates. The Mentaculus, but not the Wentaculus, faces a dilemma between nomic vagueness and untraceable arbitrariness.

I have argued that, in the Boltzmannian framework, PH is a candidate fundamental law of nature. Such a view is supported by the theoretical roles PH plays in the theory. In arguing for the Nomic Status and the Axiomatic Status of PH, we see that whether it is a law makes a difference to many other issues in the foundations of physics. Moreover, its nomic status calls for some re-thinking about the nature of physical laws. I suggest that, according to Humeanism and a minimal version of non-Humeanism, boundary conditions can be fundamental laws, SP and TP can be objective, and fundamental laws and chances can be vague. The conflicts with our concept of laws of nature are merely apparent, and in any case, they become much less worrisome if we adopt the Wentaculus framework. Hence, the view that PH is a candidate fundamental law should be more widely accepted than it is now.

Acknowledgement
My ideas in this paper have been influenced by discussions with many people over the years. I am especially grateful to David Albert, Craig Callender, Sheldon Goldstein, Barry Loewer, and Roderich Tumulka. I would also like to thank Eugene Chua, Saakshi Dulani, Veronica Gomez, Ned Hall, Tim Maudlin, Kerry McKenzie, Elizabeth Miller, Jill North, Charles Sebens, Ted Sider, Cristi Stoica, Anncy Thresher, David Wallace, Brad Weslake, Isaac Wilhelm, Eric Winsberg, and the participants in my graduate seminar on the arrows of time at UCSD in spring 2020.

References
Albert, David Z. 1996. Elementary Quantum Metaphysics. Pages 277–84 of: Cushing, J. T., Fine, A., & Goldstein, S. (eds), Bohmian Mechanics and Quantum Theory: An Appraisal. Dordrecht: Kluwer Academic Publishers.

Albert, David Z. 2000. Time and Chance. Cambridge: Harvard University Press.

Albert, David Z. 2015. After Physics. Cambridge: Harvard University Press.

Allori, Valia, Goldstein, Sheldon, Tumulka, Roderich, & Zanghì, Nino. 2013. Predictions and primitive ontology in quantum foundations: a study of examples. The British Journal for the Philosophy of Science, (2), 323–352.

Ashtekar, Abhay, & Gupt, Brajesh. 2016. Initial conditions for cosmological perturbations. Classical and Quantum Gravity, (3), 035004.

Barrett, Jeffrey A. 1996. Empirical adequacy and the availability of reliable records in quantum mechanics. Philosophy of Science, (1), 49–64.

Bell, John S. 1980. De Broglie-Bohm, delayed-choice, double-slit experiment, and density matrix. International Journal of Quantum Chemistry, (S14), 155–159.

Belot, Gordon. 2012. Quantum states for primitive ontologists. European Journal for Philosophy of Science, (1), 67–83.

Boltzmann, Ludwig. 1964. Lectures on Gas Theory. Berkeley: University of California Press.

Callender, Craig. 2004. Measures, explanations and the past: Should 'special' initial conditions be explained? The British Journal for the Philosophy of Science, (2), 195–217.

Callender, Craig. 2011. Thermodynamic Asymmetry in Time. In: Zalta, Edward N. (ed), The Stanford Encyclopedia of Philosophy, fall 2011 edn. Metaphysics Research Lab, Stanford University.

Carroll, John W. 1994. Laws of Nature. Cambridge University Press.

Carroll, Sean. 2010. From Eternity to Here: The Quest for the Ultimate Theory of Time. Penguin.

Carroll, Sean M., & Chen, Jennifer. 2004. Spontaneous Inflation and the Origin of the Arrow of Time. arXiv preprint hep-th/.

Chen, Eddy Keming. 2017. Our Fundamental Physical Space: An Essay on the Metaphysics of the Wave Function. Journal of Philosophy.

Chen, Eddy Keming. 2018. Quantum Mechanics in a Time-Asymmetric Universe: On the Nature of the Initial Quantum State. The British Journal for the Philosophy of Science, forthcoming.

Chen, Eddy Keming. 2019a. Quantum States of a Time-Asymmetric Universe: Wave Function, Density Matrix, and Empirical Equivalence. arXiv:1901.08053.

Chen, Eddy Keming. 2019b. Realism about the wave function. Philosophy Compass, (7).

Chen, Eddy Keming. 2020a. From Time Asymmetry to Quantum Entanglement: A Humean Unification. Noûs, forthcoming.

Chen, Eddy Keming. 2020b. Nomic Vagueness. arXiv preprint arXiv:2006.05298.

Chen, Eddy Keming. 2020c. Time's Arrow and De Se Probabilities. arXiv preprint arXiv:2001.09972.

Chen, Eddy Keming. 2020d. Time's Arrow in a Quantum Universe: On the Status of Statistical Mechanical Probabilities. In: Allori, Valia (ed), Statistical Mechanics and Scientific Explanation: Determinism, Indeterminism and Laws of Nature. Singapore: World Scientific.

Chen, Eddy Keming. 2020e. Welcome to the Fuzzy-Verse. New Scientist, (3298), 36–40.

Chen, Eddy Keming, & Goldstein, Sheldon. 2020. Minimal Primitivism about Laws of Nature. In preparation.

Chen, Eddy Keming, & Tumulka, Roderich. 2020. Uniform Probability Distribution Over All Density Matrices. arXiv preprint arXiv:2003.13087.

Demarest, Heather. 2019. Mentaculus Laws and Metaphysics. Principia: An International Journal of Epistemology, (3), 387–399.

Dürr, Detlef, Goldstein, Sheldon, & Zanghì, Nino. 1996. Bohmian Mechanics and the Meaning of the Wave Function. In: Experimental Metaphysics: Quantum Mechanical Studies in Honor of Abner Shimony.

Dürr, Detlef, Goldstein, Sheldon, Tumulka, Roderich, & Zanghì, Nino. 2005. On the role of density matrices in Bohmian mechanics. Foundations of Physics, (3), 449–467.

Earman, John. 2006. The "past hypothesis": Not even false. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, (3), 399–430.

Eddington, Arthur Stanley. 1928. The Nature of the Physical World. New York: Macmillan.

Feynman, Richard. 2017. The Character of Physical Law. Cambridge: MIT Press.

Forrest, Peter. 1988. Quantum Metaphysics. Blackwell Publishers.

Frigg, Roman. 2007. A field guide to recent work on the foundations of thermodynamics and statistical mechanics. The Ashgate Companion to the New Philosophy of Physics, 99–196.

Frisch, Mathias. 2005. Counterfactuals and the past hypothesis. Philosophy of Science, (5), 739–750.

Frisch, Mathias. 2007. Causation, counterfactuals, and entropy. In: Price, Huw, & Corry, Richard (eds), Causation, Physics, and the Constitution of Reality: Russell's Republic Revisited. Oxford University Press.

Goldstein, Sheldon. 2001. Boltzmann's approach to statistical mechanics. Pages 39–54 of: Bricmont, J., Dürr, D., Galavotti, M. C., Ghirardi, G., Petruccione, F., & Zanghì, N. (eds), Chance in Physics. Berlin: Springer.

Goldstein, Sheldon. 2012. Typicality and notions of probability in physics. Pages 59–71 of: Probability in Physics. Springer.

Goldstein, Sheldon, & Teufel, Stefan. 2001. Quantum spacetime without observers: ontological clarity and the conceptual foundations of quantum gravity. Physics Meets Philosophy at the Planck Scale, 275–289.

Goldstein, Sheldon, & Tumulka, Roderich. 2011. Approach to thermal equilibrium of macroscopic quantum systems. Pages 155–163 of: Non-Equilibrium Statistical Physics Today: Proceedings of the 11th Granada Seminar on Computational and Statistical Physics, AIP Conference Proceedings, vol. 1332. American Institute of Physics, New York.

Goldstein, Sheldon, & Zanghì, Nino. 2013. Reality and the role of the wave function in quantum theory. The Wave Function: Essays on the Metaphysics of Quantum Mechanics, 91–109.

Goldstein, Sheldon, Lebowitz, Joel L., Mastrodonato, Christian, Tumulka, Roderich, & Zanghì, Nino. 2010a. Approach to thermal equilibrium of macroscopic quantum systems. Physical Review E, (1), 011109.

Goldstein, Sheldon, Lebowitz, Joel L., Tumulka, Roderich, & Zanghì, Nino. 2010b. Long-time behavior of macroscopic quantum systems: Commentary accompanying the English translation of John von Neumann's 1929 article on the quantum ergodic theorem. The European Physical Journal H, (2), 173–200.

Goldstein, Sheldon, Lebowitz, Joel L., Tumulka, Roderich, & Zanghì, Nino. 2020. Gibbs and Boltzmann entropy in classical and quantum mechanics. In:
Allori, Valid (ed),
Statistical Mechanics and Scientific Explanation: Determinism, Indeterminism and Laws ofNature . Singapore: World Scientific.Hicks, Michael Townsen, & van Elswyk, Peter. 2015. Humean laws and circularexplanation.
Philosophical Studies , (2), 433–443.Hildebrand, Tyler. 2013. Can primitive laws explain? Philosophers’ Imprint .Horwich, Paul. 1987.
Asymmetries in Time: Problems in the Philosophy of Sciences . MITPress.Hubert, Mario, & Romano, Davide. 2018. The wave-function as a multi-field.
EuropeanJournal for Philosophy of Science , (3), 521–537.Lanford, Oscar E. 1975. Time evolution of large classical systems. Pages 1–111 of:
Moser,J (ed),
Dynamical systems, theory and applications . Springer.37azarovici, Dustin, & Reichert, Paula. 2015. Typicality, irreversibility and the status ofmacroscopic laws.
Erkenntnis , (4), 689–716.Lebowitz, Joel L. 2008. Time’s arrow and Boltzmann’s entropy. Scholarpedia , (4), 3448.Lewis, David. 1979. Counterfactual Dependence and Time’s Arrow. Noûs , , 455–76.Lewis, David. 1983. New Work for a Theory of Universals. Australasian Journal ofPhilosophy , , 343–77.Loewer, Barry. 2001. Determinism and chance. Studies in History and Philosophy of SciencePart B: Studies in History and Philosophy of Modern Physics , (4), 609–620.Loewer, Barry. 2007. Counterfactuals and the Second Law. In:
Price, Huw, & Corry,Richard (eds),
Causation, Physics, and the Constitution of Reality: Russell’s RepublicRevisited . Oxford University Press.Loewer, Barry. 2012. Two accounts of laws and time.
Philosophical Studies , (1),115–137.Loewer, Barry. 2020. The Mentaculus. In:
Barry Loewer, Brad Weslake, Eric Winsberg(ed),
Time’s Arrows and the Probability Structure of the world . Harvard University Press,forthcoming.Maudlin, Tim. 2007.
The Metaphysics Within Physics . New York: Oxford UniversityPress.North, Jill. 2011. Time in thermodynamics.
The oxford handbook of philosophy of time ,312–350.Penrose, Roger. 1979. Singularities and Time-Asymmetry.
Pages 581–638 of:
Hawking,SW, & Israel, W (eds),
General relativity . Cambridge: Cambridge University Press.Penrose, Roger. 1989.
The Emperor’s New Mind: Concerning Computers, Minds, and theLaws of physics . Oxford: Oxford University Press.Price, Huw. 2004. On the origins of the arrow of time: Why there is still a puzzle aboutthe low-entropy past.
Contemporary debates in philosophy of science , 219–239.Reichenbach, Hans. 1956.
The direction of time . Vol. 65. Univ of California Press.Rovelli, Carlo. 2020. Memory and entropy. arXiv preprint arXiv:2003.06687 .Von Neumann, John. 1955.
Mathematical foundations of quantum mechanics . PrincetonUniversity Press.Wallace, David. 2012.
The Emergent Multiverse: Quantum theory according to the Everettinterpretation . Oxford: Oxford University Press.Wallace, David, & Timpson, Christopher G. 2010. Quantum mechanics on spacetime I:Spacetime state realism.
The British Journal for the Philosophy of Science , (4), 697–727.38ilhelm, Isaac. 2019. Typical: A Theory of Typicality and Typicality Explanations. TheBritish Journal for the Philosophy of Science .Winsberg, Eric. 2004. Can Conditioning on the “Past Hypothesis” Militate Against theReversibility Objections?
Philosophy of Science ,71