Bristol-Leuven Workshop 3, 1 February 2016: Workshop on philosophy of science, logic and mathematics


The morning talks (9.00-13.00) will be taking place in G16, Cotham House.
After lunch we will reconvene in Portacabin 6 at the rear of the Maths building for the afternoon talks.


9.00-9.30: Welcome coffee

9.30-10.30: Tudor Baetu, Models as Local Frameworks for Integrating Knowledge

10.45-11.45: Karim Thébault, Confirmation via Analogue Simulation: A Bayesian Analysis

12.00-13.00: Sylvia Wenmackers, Neo-Leibnizian analysis of indeterminism in Newtonian physics

13.00-15.00: Lunch

15.00-16.00: Catrin Campbell-Moore, Revision theory of probability

16.15-17.15: Jan Heylen, Knowability and closure

17.45-18.45: Chris Kelp, A different knowledge first epistemology

19.00: Dinner and pub

Talks are 45 minutes, with 15 minutes for questions afterwards.


Tudor Baetu

Title: Models as Local Frameworks for Integrating Knowledge

Abstract: A comprehensive survey of models in immunology is conducted and distinct kinds are characterized based on whether models are material or conceptual, the distinctiveness of their epistemic purposes, and the criteria for evaluating the goodness of a model relative to its intended epistemic purposes. In light of the results of this survey, I propose that a model in biology is, under its minimal and most general understanding, a local standardized framework for integrating epistemically relevant information about biological phenomena or systems. Irrespective of their kind, models make possible the comparison, evaluation and synthesis of knowledge about biological phenomena/systems by providing a common frame of reference, and provide at least a preliminary identification of what is likely to be epistemically relevant to a phenomenon/system of interest.

Karim Thébault

Title: Confirmation via Analogue Simulation: A Bayesian Analysis

Abstract: Analogue simulation is a novel mode of scientific inference found increasingly within modern physics, and yet all but neglected in the philosophical literature. Experiments conducted upon a table-top 'source system' are taken to provide insight into features of an inaccessible 'target system', based upon a syntactic isomorphism between the relevant modelling frameworks. An important example is the use of acoustic 'dumb hole' systems to simulate gravitational black holes. In a recent paper it was argued that there exist circumstances in which confirmation via analogue simulation can obtain; in particular, when the robustness of the isomorphism is established via universality arguments. The current paper supports these claims via an analysis in terms of Bayesian confirmation theory.

Joint work with Radin Dardashti, Stephan Hartmann and Eric Winsberg. A pre-print can be found here.

Sylvia Wenmackers

Title: Neo-Leibnizian analysis of indeterminism in Newtonian physics

Abstract: Norton’s dome is an example of indeterminism in Newtonian physics, based on a differential equation involving a non-Lipschitz continuous function. We present an alternative model using non-standard analysis, which involves infinitesimals and is close to Leibniz's formulation of the calculus as well as to physical praxis. Our hyperfinite model for the dome is deterministic. Moreover, it allows us to assign probabilities to the variable in the indeterministic model. Since non-standard models are empirically indistinguishable from models based on standard reals, we have to conclude that (in-)determinism is a model-dependent property. (Joint work with Danny Vanpoucke.)
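For readers unfamiliar with the example: the abstract does not spell out the equation, but in Norton's original presentation the radial equation of motion for a mass at rest on the apex of the dome is (in suitable units)

```latex
\frac{d^2 r}{dt^2} = \sqrt{r},
```

whose right-hand side fails to be Lipschitz continuous at $r = 0$. Uniqueness of solutions therefore fails: besides the trivial solution $r(t) = 0$, for any "spontaneous excitation time" $T \geq 0$ there is a solution

```latex
r(t) =
\begin{cases}
0 & t \le T,\\[2pt]
\tfrac{1}{144}\,(t - T)^4 & t \ge T,
\end{cases}
```

which satisfies the same initial conditions $r(0) = 0$, $\dot r(0) = 0$. The indeterminism lies in the arbitrary choice of $T$.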

Catrin Campbell-Moore

Title: Revision theory of probability

Abstract: The revision theory of truth is an influential theory that is used to study sentences which talk about their own truth, like the liar sentence. I will consider how one might develop a revision theory of probability. This results in features such as the probability of the liar sentence being a half. The notion of probability being developed in this revision sequence is one of semantic probability, which measures the degree of truth of a sentence. When real numbers are in play instead of just the truth values true and false, one needs to reconsider how to 'sum up' facts and generalise the limit stages of the revision theory.
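As a toy illustration of the "probability of the liar is a half" idea (a sketch, not Campbell-Moore's actual construction): in a revision sequence the liar sentence's truth value flips at every stage, since the liar is true just in case it is false; its relative frequency of truth across stages then tends to 1/2.

```python
# Toy revision sequence for the liar sentence.
# Revision rule: the liar's value at stage n+1 is the negation of its value at stage n.
val = True  # arbitrary starting hypothesis about the liar
history = []
for _ in range(1000):
    history.append(val)
    val = not val  # revision step

# Relative frequency of 'true' across the first 1000 stages.
freq = sum(history) / len(history)
print(freq)  # -> 0.5
```

The same limit (1/2) is reached whichever starting hypothesis one picks, which is what makes the value a natural candidate for the liar's semantic probability.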

Jan Heylen

Title: Knowability and closure

Abstract: There are at least four concepts of knowability that have played a role in central debates surrounding the limits of knowledge. First, there is the concept of having the (counterfactual) possibility to know: in a perhaps counterfactual state of the world one knows. Second, there is the concept of being in a position to know, which is understood as knowledge one is capable of acquiring and which is such that nothing stands in the way of successfully acquiring it. Third, there is the concept of being able to know, which is understood by some in such a way that it allows for necessarily unexercised abilities. Fourth, there is the concept of having the potential to know, which is understood as knowledge one can reach in response to hypothetical evidence. The main focus of my talk is on the closure of these notions under logical implication. After having discussed closure, I will point out some consequences for the debates in which these notions figure.

Chris Kelp

Title: A different knowledge first epistemology

Abstract: This paper argues that there is reason to believe a key thesis of Williamson’s knowledge first epistemology—to wit, that knowledge is a mental state—is mistaken. Rather, knowledge is an epistemic state (only). I then offer a functional analysis of knowledge in terms of its place in inquiry. More specifically, knowledge is success in inquiry. This thesis, I argue, not only holds the key to answering the question concerning the nature of knowledge, but is also fit to serve as the foundation for a different kind of knowledge first epistemology.