MCMP – Philosophy of Science – Ludwig-Maximilians-Universität München

 Philosophy
Mathematical Philosophy – the application of logical and mathematical methods in philosophy – is about to experience a tremendous boom in various areas of philosophy. At the new Munich Center for Mathematical Philosophy, which is funded mostly by the German Alexander von Humboldt Foundation, philosophical research will be carried out mathematically, that is, by means of methods that are very close to those used by scientists.
The purpose of doing philosophy in this way is not to reduce philosophy to mathematics or to natural science in any sense; rather mathematics is applied in order to derive philosophical conclusions from philosophical assumptions, just as in physics mathematical methods are used to derive physical predictions from physical laws.
Nor is the idea of mathematical philosophy to dismiss any of the ancient questions of philosophy as irrelevant or senseless: although modern mathematical philosophy owes a lot to the heritage of the Vienna and Berlin Circles of Logical Empiricism, unlike the Logical Empiricists most mathematical philosophers today are driven by the same traditional questions about truth, knowledge, rationality, the nature of objects, morality, and the like that drove the classical philosophers, and no area of traditional philosophy is taken to be intrinsically misguided or confused anymore. It is just that some of the traditional questions of philosophy can be made much clearer and much more precise in logical-mathematical terms, answers to some of these questions can be given by means of mathematical proofs or models, and on this basis new and more concrete philosophical questions emerge. This may then lead to philosophical progress, and ultimately that is the goal of the Center.

 video
How Almost Everything in Spacetime Theory Is Illuminated by Simple Particle Physics: The Neglected Case of Massive Scalar Gravity
J. Brian Pitts (Cambridge) gives a talk at the MCMP Colloquium (6 February, 2013) titled "How Almost Everything in Spacetime Theory Is Illuminated by Simple Particle Physics: The Neglected Case of Massive Scalar Gravity". Abstract: Both particle physics from the 1920s–30s and the 1890s Seeliger-Neumann modification of Newtonian gravity suggest considering a "mass term," an additional algebraic term in the gravitational potential. The "graviton mass" gives gravity a finite range. The smooth massless limit implies underdetermination. In 1914 Nordström generalized Newtonian gravity to fit Special Relativity. Why not do to Nordström what Seeliger and Neumann did to Newton? Einstein made a start in setting up a (faulty!) analogy for his cosmological constant Λ. Scalar gravities, though not empirically viable since the 1919 bending-of-light observations, provide a useful test bed for tensor theories like General Relativity. Massive scalar gravity, though not completed in a timely way, sheds philosophical light on most issues in contemporary and 20th-century spacetime theory. A mass term shrinks the symmetry group to that of Special Relativity and violates Einstein's principles (general covariance, general relativity, equivalence and Mach) in empirically small but conceptually large ways. Geometry is a poor guide to massive scalar gravities in comparison to detailed study of the field equation or Lagrangian. Matter sees a conformally flat metric because gravity distorts volumes while leaving the speed of light alone, but gravity sees the whole flat metric due to the mass term. Largely with Poincaré (pace Eddington), one can contemplate a "true" flat geometry differing from what material rods and clocks disclose. But questions about "true" geometry need no answer and tend to block inquiry. Presumably one should expect analogous results for the tensor (massive spin-2) case modifying Einstein's equations.
A case to the contrary was made only in 1970–72: an apparently fatal dilemma involving either instability or empirical falsification appeared. But dark-energy measurements since 1999 cast some doubt on General Relativity (massless spin-2) at long distances. Recent calculations (2000s, some from 2010) show that instability can be avoided and that empirical falsification likely can be as well, making massive spin-2 gravity a serious rival for GR. Particle physics can let philosophers proportion belief to evidence over time, rather than suffering from unconceived alternatives.
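The "mass term" at the heart of the abstract can be illustrated with the standard textbook modification of Newtonian gravity (this is general background, not material from the talk itself): adding an algebraic term to the field equation turns the 1/r potential into a Yukawa potential of finite range.

```latex
% Massless vs. massive (Seeliger-Neumann type) field equations:
\nabla^2 \phi = 4\pi G \rho
\qquad\longrightarrow\qquad
\nabla^2 \phi - m^2 \phi = 4\pi G \rho
% Point-mass solutions: the Newtonian potential acquires an
% exponential cutoff, giving gravity the finite range 1/m:
\phi(r) = -\frac{GM}{r}
\qquad\longrightarrow\qquad
\phi(r) = -\frac{GM}{r}\, e^{-mr}
% As m -> 0 the massive potential smoothly approaches the massless
% one, which is the source of the underdetermination mentioned in
% the abstract.
```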

 video
Evaluating Risky Prospects: The Distribution View
Luc Bovens (LSE) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10–12 April, 2013) titled "Evaluating Risky Prospects: The Distribution View". Abstract: Policy analysts need to rank policies with risky outcomes. Such policies can be thought of as prospects. A prospect is a matrix of utilities. In the rows we list the people who are affected by the policy. In the columns we list alternative states of the world and specify a probability distribution over the states. I provide a taxonomy of various ex ante and ex post distributional concerns that enter into such policy evaluations and construct a general method that reflects these concerns, integrates the ex ante and ex post calculus, and generates orderings over policies. I show that Parfit's Priority View is a special case of the Distribution View.
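The abstract's definition of a prospect, people in the rows and states (with probabilities) in the columns, can be made concrete with a small made-up example (the numbers are illustrative, not from the talk):

```latex
% Two people, two equiprobable states, p(s_1) = p(s_2) = 1/2:
U \;=\;
\begin{pmatrix}
  4 & 0 \\  % person 1's utility in s_1, s_2
  0 & 4     % person 2's utility in s_1, s_2
\end{pmatrix}
% Ex ante, each person has expected utility 2, so the prospect looks
% perfectly equal; ex post, in every state one person gets 4 and the
% other 0. Distributional views differ in how they weigh such
% ex ante equality against ex post inequality.
```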

 video
On the Conception of Fundamentality of Time-Asymmetries in Physics
Daniel Wohlfarth (Bonn) gives a talk at the MCMP Colloquium (30 January, 2013) titled "On the Conception of Fundamentality of Time-Asymmetries in Physics". Abstract: The goal of my talk is to argue for two connected proposals. Firstly, I shall show that a new conceptual understanding of the term 'fundamentality', in the context of time-asymmetries, is applicable to cosmology and in fact shows that classical and semiclassical cosmology should be understood as time-asymmetric theories. Secondly, I will show that the proposed conceptual understanding of 'fundamentality', applied to cosmological models with a hyperbolically curved spacetime structure, provides a new understanding of the origin of the (quantum) thermodynamic time-asymmetry. On the proposed understanding, a 'quantum version' of the second law can be formulated. This version is explicitly time-asymmetric (decreasing entropy with decreasing time coordinates and vice versa). Moreover, the physical effectiveness of the time-asymmetry will be based on the crucial Einstein equations and additional calculations in QFT. Therefore, the physical effectiveness of the time-asymmetry will be independent of an ontic interpretation of 'entropy' itself. The whole account is located within semiclassical quantum cosmology (without an attempt to quantize gravity) and depends on the definability of cosmic time coordinates.
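The explicitly time-asymmetric 'quantum version' of the second law that the abstract describes can be written schematically as a monotonicity condition on entropy along a cosmic time coordinate (a schematic rendering of the abstract's wording, not a formula from the talk):

```latex
% For a cosmic time coordinate t and entropy S:
t_2 > t_1 \;\Longrightarrow\; S(t_2) \,\geq\, S(t_1)
% i.e. entropy decreases with decreasing time coordinates and
% increases with increasing ones, without presupposing any
% particular ontic interpretation of S itself.
```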

 video
Simplicity and Measurability in Science
Luigi Scorzato (Roskilde) gives a talk at the MCMP Colloquium (16 January, 2013) titled "Simplicity and Measurability in Science". Abstract: Simple assumptions represent a decisive reason to prefer one theory to another in everyday scientific praxis. But this praxis has little philosophical justification, since there exist many notions of simplicity, and those that can be defined precisely depend strongly on the language in which the theory is formulated. Moreover, according to a common general argument, the simplicity of a theory is always trivial in a suitably chosen language. However, this "trivialization argument" is always either applied to toy models of scientific theories or applied with little regard for the empirical content of the theory. In this paper I show that the trivialization argument fails when one considers realistic theories and requires their empirical content to be preserved. In fact, the concepts that enable a very simple formulation are not necessarily measurable, in general. Moreover, the inspection of a theory describing a chaotic billiard shows that precisely those concepts that naturally make the theory extremely simple are provably not measurable. This suggests that, whenever a theory possesses sufficiently complex consequences, the constraint of measurability prevents overly simple formulations in any language. In this paper I propose a way to introduce the constraint of measurability into the formulation of a scientific theory in such a way that the notion of simplicity acquires a general and sufficiently precise meaning. I argue that this explains why scientists often regard their assessments of simplicity as largely unambiguous.

 video
Descriptivism about Theoretical Concepts Implies Ramsification or (Poincarean) Conventionalism
Holger Andreas (MCMP/LMU) gives a talk at the conference on "The Analysis of Theoretical Terms" (3–5 April, 2013) titled "Descriptivism about Theoretical Concepts Implies Ramsification or (Poincarean) Conventionalism".

 video
Theoretical Terms and Induction
Hannes Leitgeb (LMU/MCMP) gives a talk at the conference on "The Analysis of Theoretical Terms" (3–5 April, 2013) titled "Theoretical Terms and Induction".