FSB research seminar

The Foundational Studies research seminar currently takes place on Thursdays at 5pm in room G2, Cotham House (see the University's Google precinct map).

Each week, an idea, thesis or result is presented in thirty minutes or less, leaving plenty of time for discussion. Only very limited background knowledge is assumed, and the aim is to develop new research directions within foundational studies (philosophical foundations of mathematics, logic, philosophical logic, formal theories of truth, ...).

If you are interested, email foundational-studies@bristol.ac.uk to ask to be added to the mailing list, or follow us on Facebook.

The seminars given in previous years can be seen here.

2016/2017 schedule, Teaching Block 2

Date | Speaker | Title
19 January 2017 | Oliver Tatton-Brown | Mathematical auxiliaries and quietist Platonism
2 February 2017 | Benedict Eastaugh | Mathematical equivalences and the road to formalisation
9 February 2017 | Leon Horsten | Prejudice as evidence
16 February 2017 | Pawel Pawlowski (Ghent) | Non-classical logic of informal provability
2 March 2017 | Nemo D'Qrill | Nozick Truth Tracking - Not Necessary
9 March 2017 | Carlo Nicolai (Munich) | What can we learn from reflecting on truth?
16 March 2017 | Øystein Linnebo (Oslo) | Generality explained: A truth-maker semantics
23 March 2017 | Irina Starikova (São Paulo) | Visual aspects of scientific models: the case of turbulence
30 March 2017 | Stuart Presnell | Martin-Löf's "Meaning Explanations"
27 April 2017 | Kentaro Fujimoto
4 May 2017 | Beau Madison Mount (Oxford)

Abstracts

Oliver Tatton-Brown (Bristol) - 'Mathematical auxiliaries and quietist Platonism'

Various authors, including Maddy, Tait and the Neo-Fregeans, have suggested that perhaps the problem of the existence of mathematical objects is not a deep problem at all, and that their existence can be read off our mathematical language or mathematical practice. I outline a counterargument to this based on the use of auxiliaries in mathematics. Roughly, whether certain objects exist can matter because we can use their existence to deduce facts about other objects. It cannot be read off our mathematical language or mathematical practice that we can reason as though certain objects exist (and obtain truths), so it also cannot be read off our mathematical language or mathematical practice that the objects in question do exist.

Benedict Eastaugh (Bristol) - 'Mathematical equivalences and the road to formalisation'

Talk of equivalences is commonplace in mathematical discourse, both for theorems (“The least upper bound principle is equivalent to the Dedekind cut principle”) and conjectures (“The following are equivalents of the Riemann hypothesis...”). Given two statements P and Q, such equivalences are demonstrated by proving both that P implies Q, and conversely that Q implies P. While the pragmatics of such equivalence statements are relatively clear, the semantics are harder to make sense of. We contend that equivalence statements in ordinary mathematical practice are best understood through the lens of reverse mathematics, where the equivalence of P and Q is proven in a weak base theory incapable of proving either statement. This view avoids a trivialising problem for a classical understanding of the biconditional, on which all true statements are logically equivalent. It also connects informal equivalence statements in mathematics to their formal counterparts in set theory, such as the many statements found to be equivalent over the base theory ZF to the Axiom of Choice. Finally, we consider objections arising from an argument of Paul Halmos that such equivalence statements are nonsensical.
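
For a concrete illustration of the kind of equivalence at issue (a standard example from reverse mathematics, not taken from the abstract): over the weak base theory RCA₀, which proves neither side, arithmetical comprehension is equivalent to the monotone convergence theorem,

\[
\mathsf{RCA}_0 \vdash \mathsf{ACA}_0 \leftrightarrow \text{“every bounded increasing sequence of real numbers converges”}.
\]

Because the base theory proves neither statement on its own, the biconditional is informative rather than trivially true.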

Leon Horsten (Bristol) - 'Prejudice as evidence'

In a 2012 article, Leitgeb outlines a way of building revision-based models for languages containing a self-referential subjective probability predicate. The revision-component of Leitgeb’s procedure is based on a relative frequency idea. 

In my talk I want to take a few steps in exploring whether in Leitgeb’s procedure the frequency-based revision step can be replaced by a form of Bayesian conditionalisation.
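
For reference, Bayesian conditionalisation (the proposed replacement for the frequency-based revision step) updates a probability function P on evidence E with P(E) > 0 by

\[
P_{\mathrm{new}}(A) \;=\; P(A \mid E) \;=\; \frac{P(A \wedge E)}{P(E)}.
\]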

Pawel Pawlowski (Ghent) - 'Non-classical logic of informal provability'

Mathematicians prove theorems. They don’t do that in any particular axiomatic system. Rather, they reason in a semi-formal setting, providing what we’ll call informal proofs. There are quite a few reasons not to reduce informal provability to formal provability within some appropriate axiomatic theory (Marfori, 2010; Leitgeb, 2009). The main worry about identifying informal provability with formal provability starts with the following observation. We have a strong intuition that whatever is informally provable is true. Thus, we are committed to all instances of the so-called reflection schema P(⌜φ⌝) → φ (where ⌜φ⌝ is the number coding formula φ and P is the informal provability predicate).

Yet not all such instances for formal provability (in standard Peano Arithmetic, henceforth PA) are provable in PA. Even worse, the theory T resulting from adding to PA (or any sufficiently strong arithmetic) all instances of the reflection schema for provability in T will be inconsistent (assuming the derivability conditions for provability in T are provable in T). Thus, something else has to be done.
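
For orientation (standard background, not part of the abstract): the derivability conditions in question are usually the Hilbert-Bernays-Löb conditions for a provability predicate Prov_T,

\[
\begin{aligned}
\text{(D1)}\quad & T \vdash \varphi \;\Rightarrow\; T \vdash \mathrm{Prov}_T(\ulcorner\varphi\urcorner)\\
\text{(D2)}\quad & T \vdash \mathrm{Prov}_T(\ulcorner\varphi \rightarrow \psi\urcorner) \rightarrow \bigl(\mathrm{Prov}_T(\ulcorner\varphi\urcorner) \rightarrow \mathrm{Prov}_T(\ulcorner\psi\urcorner)\bigr)\\
\text{(D3)}\quad & T \vdash \mathrm{Prov}_T(\ulcorner\varphi\urcorner) \rightarrow \mathrm{Prov}_T(\ulcorner\mathrm{Prov}_T(\ulcorner\varphi\urcorner)\urcorner)
\end{aligned}
\]

Given (D1)-(D3), Löb's theorem says that if T ⊢ Prov_T(⌜φ⌝) → φ then T ⊢ φ; so a theory proving every reflection instance for its own provability predicate proves every sentence, and is therefore inconsistent.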

The main idea behind most of the current approaches (Shapiro, 1985; Horsten, 1994, 1996) is to extend the language with a new informal provability predicate or operator, and include all instances of the reflection schema for it. Contradiction is avoided at the price of dropping one of the derivability conditions. Thus, various trade-offs between principles that all seem convincing have been studied. In order to overcome some of the resulting difficulties and arbitrariness, we investigate a strategy which changes the underlying logic and treats informal provability as a partial notion, just as Kripke’s theory of truth (Kripke, 1975) treats truth as a partial notion (one that clearly applies to some sentences, clearly doesn’t apply to others, but is undecided about the rest). The intuition is that at a given stage certain claims are clearly informally provable, some are clearly informally disprovable, and the status of the remaining ones is undecided.

In Kripke-style truth theories, strong Kleene three-valued logic is usually used, and it seems adequate for interpreting truth as a partial notion. Yet we will argue that no well-known three-valued logic can do a similar job for informal provability. The main reason is that in those logics the value of a complex formula is always a function of the values of its components. This fails to capture the fact that some informally provable disjunctions of mathematical claims have informally provable disjuncts while others don’t: for an open conjecture φ, classical reasoning informally proves φ ∨ ¬φ even though neither disjunct is informally provable, whereas a disjunction of two unrelated undecided claims need not be informally provable at all; no assignment of values to the components predicts both verdicts.

We develop a non-functional many-valued logic which avoids this problem and captures our intuitions about informal provability. We describe the semantics of our logic and some of its properties. We argue that it does a better job when it comes to reasoning with the informal provability predicate in formalized theories built over arithmetic.

Nemo D'Qrill (Bristol) - 'Nozick Truth Tracking - Not Necessary'

Kripke (1980s), Williams (2015), and others have attempted to show that Nozick's tracking theory of knowledge is insufficient. Murray and Adams (2003, 2005, 2015) have defended the theory against these objections. This paper demonstrates, with two independent counterexamples, that even if Nozick's theory were sufficient, it would be unnecessary.

Carlo Nicolai (Munich) - 'What can we learn from reflecting on truth?'

In recent work with Martin Fischer and Leon Horsten, we have studied the result of iterating a form of uniform reflection over a simple truth theory featuring only basic disquotational principles formulated in four-valued logic. The resulting picture can be read (i) as suggesting a conceptual analysis of the notion of truth based on disquotation, (ii) as a way to close the proof-theoretic gap between classical and non-classical theories of truth, and (iii) as pointing towards a foundational project based on weak combinatorial principles. In the talk I will mostly focus on options (ii) and (iii) and discuss some potential problems.
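
For orientation (the standard formulation of uniform reflection, assumed here rather than quoted from the paper): for a theory S, uniform reflection is the schema

\[
\forall x \,\bigl(\mathrm{Prov}_S(\ulcorner\varphi(\dot{x})\urcorner) \rightarrow \varphi(x)\bigr),
\]

where ⌜φ(ẋ)⌝ denotes the code of φ with the numeral for the value of x substituted for its free variable.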

Øystein Linnebo (Oslo) - 'Generality explained: A truth-maker semantics'

What explains a true universal generalization? This paper distinguishes two kinds of explanation. While an instance-based explanation proceeds via each instance of the generalization, a generic explanation is independent of each instance, relying instead on completely general facts about the properties or operations involved in the generalization. This distinction is illuminated by means of a truth-maker semantics, which is also used to show that instance-based explanations support classical logic, while generic explanations support only intuitionistic logic.
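
As a rough gloss (my sketch in the style of Fine's exact truth-maker semantics, not necessarily the clauses used in the paper): on an instance-based reading, a state verifies a universal generalization by fusing verifiers of its instances,

\[
s \Vdash \forall x\,\varphi(x) \quad\text{iff}\quad s = \bigsqcup_{a \in D} s_a, \text{ where } s_a \Vdash \varphi(a) \text{ for each } a \in D,
\]

whereas on a generic reading a single completely general fact about the property or operation involved verifies the generalization directly, independently of its instances.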

Irina Starikova (São Paulo) - 'Visual aspects of scientific models: the case of turbulence'

Recent discussions question the role of the sensory aspect not only in proofs but also in models and thought experiments. How important are images? Are they necessary? Philosophical opinion divides into camps. For example, Brown, Gendler and Nersessian argue that visualisations are essential, while Norton claims that visualisations are irrelevant in thought experiments. Meanwhile, Salis and Frigg (forthcoming) suggest that images are sometimes useful for thought experiments but never necessary. My position is that in some cases visualisations are necessary, e.g. when reasoning requires mental manipulation of images derived from diagrams (most recently Giaquinto & Starikova forthcoming; Starikova 2016; De Toffoli & Giardino 2014, 2016).

This paper moves the focus from pure to applied mathematics, and to the use of images in studying physical phenomena. Some phenomena still await a better mathematical grip; turbulence is an example. I will argue that a visual image (of a model of a physical phenomenon) can play an important role in guiding mathematicians’ research and in the choice of new mathematical resources. In particular, I will show that Richardson’s model of a cascading wave motivated both Kolmogorov’s statistical theory of turbulence and a more recent geometric interpretation of the shape dynamics of a fluid volume. This is how an application of Riemannian geometry (Ricci flows) to the mathematical description of turbulence became accessible.

The paper distinguishes "loose" geometry, which means simply visualising a phenomenon, from "strict" geometry, which means looking at the visual representation geometrically and applying geometry to the initial problem. On the basis of this distinction, the case study shows that loose geometry opens up possibilities for strict geometry. Visual representations can (even in very complex mathematics) guide research in a certain (geometric) direction where a merely linguistic/symbolic representation does not help.

Stuart Presnell (Bristol) - Martin-Löf's "Meaning Explanations"

In "On the Meanings of the Logical Constants and the Justifications of the Logical Laws" Per Martin-Löf presents an analysis of the notions of 'proposition' and 'judgement', from which he derives an account of intuitionistic/constructive logic.  I'll start by giving a summary of this account, which forms the conceptual core of Homotopy Type Theory.  I'll then make some connections to Øystein Linnebo's recent "truth-maker semantics" talk, and consider some ways that the idea can be broadened to new applications (as suggested by Ben Eva).


2016/2017 schedule, Teaching Block 1

Date | Speaker | Title/Subject
21 October 2016 | Johannes Stern | The Sky is the Limit: Reconsidering the Equivalence Scheme
28 October 2016 | Sam Roberts | Modal structuralism and the access problem
11 November 2016 | Catrin Campbell-Moore | Non-classical probabilities
18 November 2016 | Max Jones | Numerical Perception and the Access Problem
25 November 2016 | Dan Saattrup Nielsen | Determinacy of games
2 December 2016 | Alex Jones | What is deflationism about truth?
9 December 2016 | Yang Liu | A Simpler and More Realistic Subjective Decision Theory
16 December 2016 | Johannes Stern | The Mind Cannot be Mechanized

Abstracts

Johannes Stern (Bristol) - 'The Sky is the Limit: Reconsidering the Equivalence Scheme'

In this talk we reconsider the role of the Equivalence Scheme against the backdrop of the paradoxes. The most prominent reaction to the paradoxes within the boundaries of classical logic is to restrict the Equivalence Scheme to a class of permissible instances. We argue that this strategy is not without problems and that it might be preferable to give up the Equivalence Scheme altogether and seek weaker principles of truth. To this end we propose a criterion for when such weaker principles aptly characterize the notion of truth. The guiding idea will be that the non-naive truth predicate should be maximally truthlike.
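
For reference (standard background, not drawn from the abstract): the equivalence scheme for a truth predicate T consists of all instances of

\[
T(\ulcorner\varphi\urcorner) \leftrightarrow \varphi,
\]

and in classical logic it cannot hold unrestrictedly: diagonalisation yields a liar sentence λ with λ ↔ ¬T(⌜λ⌝), whose instance T(⌜λ⌝) ↔ λ is contradictory.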

Sam Roberts (Bristol) - 'Modal structuralism and the access problem'

In this talk, I will look at modal structuralism -- the view that mathematics is about possible non-abstract structures. The main consideration in its favour is that it might solve the access problem -- the problem of explaining how we come to know all of the mathematical facts that we do. I will try to get clear on the extent to which modal structuralism solves the access problem by investigating the presuppositions underlying it.

Catrin Campbell-Moore (Bristol) - 'Non-classical probabilities'

This talk is an introduction to non-classical probabilities. The probability axioms, as typically presented for sentences, embed assumptions of classical logic; for example, they require that the probability of P ∨ ¬P is 1. In this talk we present an overview of non-classical probabilities, which say what probabilities should look like when these classical assumptions are not in play and some non-classical logic operates in the background. We will particularly focus on the case of supervaluational probabilities and explain how they connect to a model of belief that has been very popular in recent years: imprecise probabilities, where an agent's belief state is modelled by a set of probability functions. Time permitting, we will also present some connections with epistemic utility arguments for rationality constraints on agents in a non-classical framework.
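
As a sketch of the kind of generalisation at issue (one standard axiomatisation, along the lines of work by Paris and by J. R. G. Williams; the talk may set things up differently): relative to the consequence relation ⊨_L of a logic L, a function p on sentences counts as a probability if

\[
\begin{aligned}
\text{(i)}\quad & p(\varphi) = 1 \text{ if } \models_L \varphi, \text{ and } p(\varphi) = 0 \text{ if } \varphi \text{ is } L\text{-inconsistent};\\
\text{(ii)}\quad & \text{if } \varphi \models_L \psi \text{ then } p(\varphi) \le p(\psi);\\
\text{(iii)}\quad & p(\varphi) + p(\psi) = p(\varphi \vee \psi) + p(\varphi \wedge \psi).
\end{aligned}
\]

With classical consequence these conditions recover the usual axioms; with a weaker logic, p(P ∨ ¬P) = 1 is no longer forced.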

Max Jones (Bristol) - 'Numerical Perception and the Access Problem'

Over the last twenty years a wealth of evidence from the cognitive sciences has emerged that suggests that humans (and a wide range of other species) possess the capacity to perceive numerical properties or numerosities. However, within the philosophy of mathematics the notion that we perceive number is relatively unpopular. Those who support the idea that perception provides us with access to mathematical content, such as (early) Maddy, Kitcher, and Resnik, take this to provide a response to Benacerraf's access problem, which supports some form of realism. I'll briefly present some of the evidence for numerical perception and discuss its impact on a more generalised version of Benacerraf's access problem, before arguing that, while numerical perception fails to support a realist solution, it may still have significant consequences for the metaphysics of number.

Dan Saattrup Nielsen (Bristol) - 'Determinacy of games'

This is going to be a brief glimpse of how game theory can affect logic and mathematics. A game is determined when one of the players has a winning strategy. I will present determinacy syntactically, as a special case of de Morgan's law in a certain infinitary logic, explore how much determinacy our axioms of set theory allow us to prove, and show a sample application.
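
To make the de Morgan reading concrete (a standard gloss of Gale-Stewart games, not taken from the abstract): in a game of length ω in which players I and II alternately choose natural numbers and player I wins just in case the resulting sequence lies in a payoff set A, determinacy of the game is the implication

\[
\neg\,\exists a_0\,\forall a_1\,\exists a_2 \cdots \bigl(\langle a_0, a_1, a_2, \ldots\rangle \in A\bigr) \;\rightarrow\; \forall a_0\,\exists a_1\,\forall a_2 \cdots \bigl(\langle a_0, a_1, a_2, \ldots\rangle \notin A\bigr),
\]

i.e. the result of pushing a negation through the infinite quantifier string, an infinitary form of de Morgan's law.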

Alex Jones (Bristol) - 'What is deflationism about truth?'

Deflationism about truth has seen many advocates in recent years: philosophers who argue that truth is insubstantial. These philosophers agree on this broad point but still hold different theories of truth. In my talk I shall look at the question of what these theories have in common: what makes a theory of truth deflationary or insubstantial. I shall examine some proposals that have been put forward and argue that they are deficient. I shall then introduce my own proposal, making use of the metaphysical notion of ground, which at least remedies these deficiencies and which, I will argue, is the correct way of deciding whether a theory of truth is deflationary. The talk will be heavy on philosophy but light on formal technical details, so it should be accessible to all!

Yang Liu (Cambridge) - 'A Simpler and More Realistic Subjective Decision Theory'

In his seminal work The Foundations of Statistics, Savage put forward a theory of subjective probabilities based on a well-developed axiomatic system of rational decision making. Establishing this system, however, requires additional problematic assumptions. First, there is a Boolean algebra of events on which subjective probabilities are defined. Savage's proof requires that this algebra be a σ-algebra. However, on Savage's view one should not require the probability to be σ-additive; he therefore finds the insistence on a σ-algebra peculiar and unsatisfactory, but sees no way of avoiding it. Second, the assignment of utilities requires the constant act assumption: for every given consequence there exists a constant act which has that consequence in every state. This assumption is known to be highly counterintuitive. The paper on which this talk is based includes two mathematical results. The first, and more difficult, one shows that the σ-algebra assumption can be dropped. The second states that, as long as utilities are assigned to finite gambles only, the constant act assumption can be replaced by the plausible, much weaker assumption that there are at least two non-equivalent constant acts.
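
For background (the usual statement of a Savage-style representation, recalled here rather than taken from the talk): the axioms deliver a subjective probability P on events and a utility u on consequences such that preferences between acts f, g : S → C go by expected utility,

\[
f \succsim g \quad\iff\quad \int_S u(f(s))\,\mathrm{d}P(s) \;\ge\; \int_S u(g(s))\,\mathrm{d}P(s).
\]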

In this talk, I will first provide an overview of Savage's theory of expected utility; I will then outline the new technique of tripartition trees developed in the paper, which leads to the definition of quantitative probabilities without the σ-algebra assumption. I will also discuss the notion of an "idealized agent" that underlies Savage's approach, and argue that our simplified system, which is adequate for all the actual purposes for which the system was designed, involves a more realistic notion of an idealized agent.

The talk is based on joint work with Professor Haim Gaifman.

Johannes Stern (Bristol) - 'The Mind Cannot be Mechanized'

Gödel's disjunction is the famous thesis that either the human mind cannot be mechanized or there exist absolutely undecidable statements. Authors such as Lucas and Penrose have put forward arguments purporting to establish the first disjunct, namely that the human mind cannot be mechanized. In this talk I shall focus on one particular argument by Penrose to this effect, for which Peter Koellner has recently proposed a reconstruction using a self-applicable truth predicate and a self-applicable absolute provability predicate. We investigate whether there are reasonable theories of truth and absolute provability in which Penrose's argument can be carried out, and whether the addition of the truth predicate has any interesting philosophical consequences.