MCMP – Philosophy of Mathematics – Ludwig-Maximilians-Universität München

 Philosophy
Mathematical philosophy – the application of logical and mathematical methods in philosophy – is about to experience a tremendous boom in various areas of philosophy. At the new Munich Center for Mathematical Philosophy, which is funded mostly by the German Alexander von Humboldt Foundation, philosophical research will be carried out mathematically, that is, by means of methods very close to those used by scientists.
The purpose of doing philosophy in this way is not to reduce philosophy to mathematics or to natural science in any sense; rather mathematics is applied in order to derive philosophical conclusions from philosophical assumptions, just as in physics mathematical methods are used to derive physical predictions from physical laws.
Nor is the idea of mathematical philosophy to dismiss any of the ancient questions of philosophy as irrelevant or senseless: although modern mathematical philosophy owes a lot to the heritage of the Vienna and Berlin Circles of Logical Empiricism, unlike the Logical Empiricists, most mathematical philosophers today are driven by the same traditional questions about truth, knowledge, rationality, the nature of objects, morality, and the like that drove the classical philosophers, and no area of traditional philosophy is taken to be intrinsically misguided or confused anymore. It is just that some of the traditional questions of philosophy can be made much clearer and more precise in logical-mathematical terms; for some of these questions, answers can be given by means of mathematical proofs or models, and on this basis new and more concrete philosophical questions emerge. This may then lead to philosophical progress, and ultimately that is the goal of the Center.

 video
Recent metamathematical wonders and the question of arithmetical realism
Andrey Bovykin (Bristol) gives a talk at the MCMP Colloquium (16 January, 2013) titled "Recent metamathematical wonders and the question of arithmetical realism". Abstract: Metamathematics is the study of what is possible or impossible in mathematics: the study of unprovability, the limitations of methods, algorithmic undecidability, and "truth". I would like to make my talk very historical and educational, and will start with pre-Gödelean metamathematics and the first few metamathematical scenarios that emerged in the 19th century ("Parallel Worlds", "Insufficient Instruments", "Absence of a Uniform Solution", and "Needed Objects Are Yet Absent"). I will also mention some metamathematical premonitions from the era before Gauss that are the early historical hints that a mathematical question may lead to a metamathematical answer, rather than a "yes" or "no" answer. (These historical quotations were recently sent to me by the historian Davide Crippa.) Then I plan to sketch the history and evolution of post-Gödelean metamathematical ideas, or rather, generations of ideas. I split the post-Gödelean history of ideas into five generations and explain what the people of each generation believed in (what are the right objects of study? what are the right questions? what are the right methods?). The most spectacular, and so far the most important, completed period in the history of metamathematics started in 1976 with the introduction of the Indicator Method by Jeff Paris and the working formulation of the Reverse Mathematics Programme by Harvey Friedman. I will describe this period (what people believed in that era) in some detail. We now live in a new period in metamathematics, quite distinct from the ideas of the 1980s.
I will speak about the question of the universality of Friedman's machinery, the theory of templates, meaninglessnessisation (with an entertaining example of meaninglessnessisation), Weiermann's threshold theory, and some new thoughts about the space of all possible strong theories. I will concentrate in most detail on the question of the Search for Arithmetical Splitting (the search for "equally good" arithmetical theories that contradict each other), and sketch some possible ways that may lead to the discovery of Arithmetical Splitting. There are people who strongly deny the possibility of Arithmetical Splitting, most often under the auspices of arithmetical realism: "there exists the true standard model of arithmetic". They usually cite Gödel's theorem as establishing the existence of *true* but unprovable assertions, and concede only some epistemological difficulties in reaching the "arithmetical truth". I will mention seven common arguments usually quoted by anti-Arithmetical-Splitting people and explain why it seems that none of these arguments stands. Historically, and very tentatively, I would like to suggest that anti-Arithmetical-Splitting views should perhaps be considered not a viable anti-pluralist philosophy of mathematics, but rather a conservative resistance to the inevitable new generation of metamathematical wonders. Anti-Arithmetical-Splitting views are thus more in line with such historical movements as the rejection of complex numbers, the resistance to non-Euclidean geometries in the 1830s–1850s, the refusal to accept the concepts of abstract mathematics in the second half of the 19th century, and other conservative movements of the past. Arithmetical Splitting has not been reached yet, but we are well equipped to discuss it before its arrival.
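The realist appeal to Gödel mentioned in the abstract rests on the first incompleteness theorem, which in its modern (Rosser-strengthened) form can be stated as follows; this rendering is added here for orientation and is not part of the abstract:

```latex
% First incompleteness theorem: for any consistent, recursively
% axiomatizable theory T containing basic arithmetic, there is an
% arithmetical sentence G_T that T neither proves nor refutes:
T \nvdash G_T \qquad\text{and}\qquad T \nvdash \neg G_T .
% Arithmetical realists read G_T as true (in "the" standard model)
% but unprovable; the talk questions whether this settles the
% issue against Arithmetical Splitting.
```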

 video
The Univalence Axiom
Steve Awodey (CMU) gives a talk at the MCMP Colloquium (16 July, 2014) titled "The Univalence Axiom". Abstract: In homotopy type theory, the Univalence Axiom is a new principle of reasoning which implies that isomorphic structures can be identified. I will explain this axiom and consider its background and consequences, both mathematical and philosophical.
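For orientation, a standard formulation of the axiom from the homotopy type theory literature (not part of the abstract) runs as follows:

```latex
% For types A, B in a universe U there is a canonical map taking
% identities to equivalences:
%   idtoeqv : (A =_U B) -> (A ~ B).
% The Univalence Axiom asserts that this map is itself an
% equivalence, so in particular equivalent (e.g. isomorphic)
% types can be identified:
\mathsf{ua} : \prod_{A,\,B \,:\, \mathcal{U}}
  \mathsf{isequiv}\big(\mathsf{idtoeqv}_{A,B}\big),
\qquad
\mathsf{idtoeqv}_{A,B} : (A =_{\mathcal{U}} B) \to (A \simeq B).
```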

 video
In Good Company? On Hume's Principle and the assignment of numbers to infinite concepts
Paolo Mancosu (UC Berkeley) gives a talk at the MCMP Colloquium (8 May, 2014) titled "In Good Company? On Hume's Principle and the assignment of numbers to infinite concepts". Abstract: In a recent article (Review of Symbolic Logic, 2009), I have explored the historical, mathematical, and philosophical issues related to the new theory of numerosities. The theory of numerosities provides a context in which to assign numerosities to infinite sets of natural numbers in such a way as to preserve the part-whole principle, namely: if a set A is properly included in B, then the numerosity of A is strictly less than the numerosity of B. Numerosity assignments differ from the standard assignment of size provided by Cantor's cardinality assignments. In this talk, I generalize some specific worries emerging from the theory of numerosities to a line of thought resulting in what I call a ‘good company’ objection to Hume’s Principle. The talk has four main parts. The first takes a historical look at nineteenth-century attributions of equality of numbers in terms of one-one correlations and argues that there was no agreement as to how to extend such determinations to infinite sets of objects. This leads to the second part, where I show that there are countably infinitely many abstraction principles that are ‘good’, in the sense that they share the same virtues as HP and from them we can derive the axioms of second-order arithmetic. The third part connects this material to a debate on the Finite Hume Principle between Heck and MacBride and states the ‘good company’ objection. Finally, the last part gives a tentative taxonomy of possible neologicist responses to the ‘good company’ objection and makes a foray into the relevance of this material for the issue of cross-sortal identifications for abstractions.
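The two principles in play can be set side by side; this standard rendering is added for orientation and is not part of the abstract:

```latex
% Hume's Principle (HP): the number of F equals the number of G
% iff the Fs and the Gs stand in one-one correspondence:
\#F = \#G \;\leftrightarrow\; F \approx G .

% Part-whole principle for numerosities:
A \subsetneq B \;\rightarrow\; \mathrm{num}(A) < \mathrm{num}(B).

% The two pull apart on infinite sets: the even numbers E stand in
% one-one correspondence with the naturals N, yet E is a proper
% subset of N, so HP yields #E = #N while the part-whole principle
% demands num(E) < num(N).
```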

 video
Learning Experiences, Expected Inaccuracy, and the Value of Knowledge
Simon Huttegger (UC Irvine) gives a talk at the MCMP Colloquium (8 May, 2014) titled "Learning Experiences, Expected Inaccuracy, and the Value of Knowledge". Abstract: I argue that van Fraassen's reflection principle is a principle of rational learning. First, I show that it follows if one wants to minimize expected inaccuracy. Second, the reflection principle is a consequence of a postulate describing genuine learning situations, which is related to the value of knowledge theorem in decision theory. Roughly speaking, this postulate says that a genuine learning experience cannot lead one to foreseeably make worse decisions after the learning experience than one could already have made before learning.
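In its usual formulation (not spelled out in the abstract), van Fraassen's reflection principle says that one's current credence in a proposition, conditional on one's future credence in it taking a particular value, should equal that value:

```latex
% Reflection: for credence functions P_t (now) and P_{t'} (later),
% with t' > t and any proposition A,
P_t\big(A \mid P_{t'}(A) = r\big) = r .
```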

 video
Anti-Mathematicism and Formal Philosophy
Eric Schliesser (Ghent) gives a talk at the MCMP Colloquium (25 June, 2014) titled "Anti-Mathematicism and Formal Philosophy". Abstract: Hannes Leitgeb rightly claims that "contemporary critics of mathematization of (parts of) philosophy do not so much put forward arguments as really express a feeling of uneasiness or insecurity vis-à-vis mathematical philosophy" (Leitgeb 2013: 271). This paper is designed to articulate arguments in the place of that feeling of uneasiness. The hope is that this will facilitate more informed discussion between partisans and critics of formal philosophy. In his (2013) paper Leitgeb articulates and refutes one argument from Kant against formal philosophy. This paper will show, first, that Kant's argument was part of a much wider 18th-century debate over formal methods in philosophy (prompted by the success of Newton and anxiety over Spinoza). Since 'philosophy' obviously has a broader scope in the period, I will confine my discussion to arguments about formal methods in areas we still consider 'philosophical' (metaphysics, epistemology, moral philosophy, etc.). In order to facilitate discussion I offer the following taxonomy of arguments: (i) the global strategy, by which the epistemic authority and security of mathematical applications as such are challenged and deprivileged; (ii) the containment strategy, by which the successful application of mathematical technique is restricted to some limited domain; (iii) non-epistemic theories, by which the apparent popularity of mathematics within some domain of application is explained away in virtue of some non-truth-tracking features. The non-epistemic theory is generally part of a debunking strategy. I will offer examples from Hume and Mandeville of all three strategies. The second main aim is to articulate arguments that call attention to potential limitations of formal philosophy. By 'limitation' I do not have in mind formal, intrinsic limits (e.g., Gödel).
I explore two: (A) First, formal philosophers often assume a kind of topic neutrality or generality when justifying their methods (even if this idea has been challenged from within; Dutilh Novaes 2011). But this means that formal methods are not self-justifying (outside logic and mathematics, perhaps) and are unable to ground their own worth; it follows straightforwardly that for such grounding, formal approaches require substantive (often normative) non-formal premises. (B) Second, Leitgeb (2013: 274-5) insightfully discusses the ways in which formal approaches, like any other method, may be abused. But because abuses involving esoteric techniques may be hard for bystanders (philosophical and otherwise) to detect absent other means of control and containment, there is a heavy responsibility on practitioners of formal philosophy to develop the institutional and moral safeguards against such abuses that are common in, say, engineering and the medical sciences. Absent such safeguards, formal philosophers require a strong collective self-policing ethos; it is unlikely that current incentives promote such safeguards.

 video
Geometrical Roots of Model Theory: Duality and Relative Consistency
Georg Schiemer (Vienna/MCMP) gives a talk at the MCMP Colloquium (9 July, 2015) titled "Geometrical Roots of Model Theory: Duality and Relative Consistency". Abstract: Axiomatic geometry in Hilbert's Grundlagen der Geometrie (1899) is usually described as model-theoretic in character: theories are understood as theory schemata that implicitly define a number of primitive terms and that can be interpreted in different models. Moreover, starting with Hilbert's work, metatheoretic results concerning the relative consistency of axiom systems and the independence of particular axioms have come into the focus of geometric research. These results are also established in a model-theoretic way, i.e. by the construction of structures with the relevant geometrical properties. The present talk investigates the conceptual roots of this metatheoretic approach in modern axiomatics by looking at an important methodological development in projective geometry between 1810 and 1900: the systematic use of the "principle of duality", i.e. the fact that all theorems of projective geometry can be dualized. The aim here will be twofold: first, to assess whether the early contributions to duality (by Gergonne, Poncelet, Chasles, and Pasch, among others) can already be described as model-theoretic in character. The discussion of this will be based on a closer examination of two existing justifications of the general principle, namely a transformation-based account and a (proto-)proof-theoretic account based on the axiomatic presentation of projective space. The second aim will be to see in what ways Hilbert's metatheoretic results in the Grundlagen, in particular his relative consistency proofs, were influenced by the previous uses of duality in projective geometry.
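As a quick illustration of the principle of duality (a textbook example, not taken from the abstract):

```latex
% Plane projective duality: interchanging "point" and "line"
% (and "lies on" with "passes through") in any theorem of plane
% projective geometry yields another theorem. The two basic
% incidence statements are mutually dual:
\text{Any two distinct points lie on exactly one line.}
\;\Longleftrightarrow_{\text{dual}}\;
\text{Any two distinct lines meet in exactly one point.}
```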