Journal for General Philosophy of Science

Volume 42, Issue 1, pp 177–183

GAP.7: Reflections and Projections: Challenges to Philosophy

Authors

  • M. Kuhlmann
    • Institut für Philosophie, Universität Bremen
  • P. Näger
    • Institut für Philosophie, Universität Bremen
  • W. Stelzner
    • Institut für Philosophie, Universität Bremen
Report

DOI: 10.1007/s10838-011-9151-3

Cite this article as:
Kuhlmann, M., Näger, P. & Stelzner, W. J Gen Philos Sci (2011) 42: 177. doi:10.1007/s10838-011-9151-3

The international conference of the German Society for Analytic Philosophy (GAP) is held triennially; the 2009 conference took place in Bremen. The lectures, colloquia and talks covered the complete spectrum of analytic philosophy, divided into eight sections: Logic and Philosophy of Science; Epistemology; Philosophy of Language; Philosophy of Mind; Metaphysics and Ontology; Applied Ethics (including Political Philosophy, Philosophy of Law and Social Philosophy); Normative Ethics (Metaethics, Philosophy of Action and Decision-Making); and Aesthetics/Philosophy of Religion. This conference report covers the first section, Logic and Philosophy of Science, which is traditionally large and lively. Instead of giving a complete overview, we try to present a representative sample of the 61 talks given in this section. The complete programme can be found at http://www.gap7.de/en/programm.html.

1 Talks on Logic

Logic is usually considered to play an essential role in analytic philosophy, especially in the philosophy of science. At the same time, logic itself, as philosophical logic, is an intrinsic part of analytic philosophy. This was partly mirrored in the programme of GAP.7, where the section Logic and Philosophy of Science contained a remarkable number of contributions on logic. The papers presented focussed on non-classical entailment theory, epistemic logic and some other topics.

In her paper Bolzano’s Theory of Derivability (Ableitbarkeit) and Grounding (Abfolge) and His Notion of a ‘Rigorous Scientific Presentation’ (streng wissenschaftliche Darstellung), Stefania Centrone compared the treatment of Bolzano’s logic in the Frege-Russell-Hilbert style (FRH-style) with its treatment in the Gentzen style (G-style). Unfortunately, neither in the paper nor in the ensuing discussion was there room to discuss the important non-classical consequences of Bolzano’s theory of Ableitbarkeit and Abfolge, in which it differs significantly from the Tarskian conception of entailment.

Stefan Roski in his paper Epistemological Questions Concerning Concepts of Logical Consequence (Erkenntnistheoretische Fragen an Konzeptionen logischer Konsequenz) raised the question about the epistemological relevance of model-theoretic and proof-theoretic explications of the notion of consequence. In particular he considers the question in what sense, if at all, the proof-theoretic conception can claim to be epistemologically superior to the model-theoretic one. Roski refers to central critiques by Prawitz and Etchemendy against the model-theoretic conception. He argues that the proof-theoretic conception has an advantage over the model-theoretic conception only if some questionable presuppositions concerning the philosophy of language are accepted. Roski concludes that the claims raised by Prawitz and Etchemendy in favour of the proof-theoretic conception are not fulfilled.

Departing from the Tarskian conception of entailment, Elia Zardini in his Following-from and Transitivity challenged the transitivity of the entailment relation. To make room for non-transitivity, Zardini introduces two different kinds of reasons for accepting a sentence, “non-deductive” and “deductive”: one’s reasons are non-deductive when the sentence is accepted, but not because it is the conclusion of a deductively valid argument all of whose premises one accepts or has reason to accept. Of course, this admits a wide range of non-deductive reasons, since they are characterized merely as reasons that are not deductive. Non-transitivity can then arise because consequence only commits one to accepting the conclusion of a valid argument all of whose premises one has non-deductive reason to accept.

In his Assertion and Denial in Proof-Theoretic Semantics, Peter Schroeder-Heister considered proof-theoretic semantics as an attempt to define logical consequence and, more generally, analytic reasoning in terms of proof rather than truth. He explains that proof-theoretic semantics, since it emphasizes proof rather than refutation, is by its nature assertion-driven. Schroeder-Heister focuses on the fact that there is an asymmetry between proofs and refutations, or between assertions and denials, and he argues that this asymmetry should be removed. Using duality arguments he shows that there is no proper advantage of assertion over denial, and he concludes that in proof-theoretic semantics, as in truth-conditional semantics, no fundamental semantic principle is available which favours assertion.

Based on the distinction between the set of designated truth-values D+ and the set of antidesignated truth-values D−, which need not be complements of each other within the set of truth-values V, Yaroslav Shramko in his Entailment Relations in Many-Valued Logics gave the following four basic definitions, which determine various conceptions of an entailment relation:
$$ A \sim_{\text{t}} B \quad {\text{iff}}\;\forall v:\ v(A) \in {\text{D}}^{+} \Rightarrow v(B) \in {\text{D}}^{+} \qquad (1) $$
$$ A \sim_{\text{f}} B \quad {\text{iff}}\;\forall v:\ v(A) \notin {\text{D}}^{-} \Rightarrow v(B) \notin {\text{D}}^{-} \qquad (2) $$
$$ A \sim_{\text{q}} B \quad {\text{iff}}\;\forall v:\ v(A) \notin {\text{D}}^{-} \Rightarrow v(B) \in {\text{D}}^{+} \qquad (3) $$
$$ A \sim_{\text{p}} B \quad {\text{iff}}\;\forall v:\ v(A) \in {\text{D}}^{+} \Rightarrow v(B) \notin {\text{D}}^{-} \qquad (4) $$
On the basis of these definitions, Shramko examines Belnap’s four-valued matrix {T, B, N, F} with D+ = {T, B} and D− = {N, F}, as well as the Kleene–Priest account with the matrix {T, I, F}, D+ = {T} and D− = {F}, within this framework. In the Belnap case he obtains \( \sim_{\text{t}} = \sim_{\text{f}} \) and \( \sim_{\text{q}} = \sim_{\text{p}} \), while \( \sim_{\text{t}} \) (alias \( \sim_{\text{f}} \)) \( \ne \sim_{\text{p}} \) (alias \( \sim_{\text{q}} \)). In the Kleene case \( \sim_{\text{t}} \) is the entailment relation, in the Priest case it is \( \sim_{\text{f}} \). For the Kleene–Priest matrix the following relations hold: (1) \( \sim_{\text{q}} \subseteq \sim_{\text{t}} \); (2) \( \sim_{\text{q}} \subseteq \sim_{\text{f}} \); (3) \( \sim_{\text{t}} \subseteq \sim_{\text{p}} \); (4) \( \sim_{\text{f}} \subseteq \sim_{\text{p}} \); (5) \( \sim_{\text{q}} \subset \sim_{\text{t}} \cap \sim_{\text{f}} \).
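To make these definitions concrete, here is a minimal Python sketch (not part of Shramko’s talk) that checks the four entailment conditions by brute force over all valuations in Belnap’s four-valued matrix; it assumes the standard four-valued truth tables, on which negation swaps T and F while fixing B and N, and conjunction and disjunction are meet and join in the truth order.

```python
from itertools import product

# Belnap's four values encoded as (told_true, told_false) pairs:
#   T = (1, 0), B = (1, 1), N = (0, 0), F = (0, 1)
VALS = {"T": (1, 0), "B": (1, 1), "N": (0, 0), "F": (0, 1)}
NAMES = {v: k for k, v in VALS.items()}
D_PLUS = {"T", "B"}   # designated values
D_MINUS = {"N", "F"}  # antidesignated values

def neg(a):
    return (a[1], a[0])

def conj(a, b):
    # "told true" by both, "told false" by either
    return (a[0] and b[0], a[1] or b[1])

def disj(a, b):
    return (a[0] or b[0], a[1] and b[1])

def atoms(formula):
    """Collect the atomic sentence letters of a formula."""
    if isinstance(formula, str):
        return {formula}
    return set().union(*(atoms(sub) for sub in formula[1:]))

def evaluate(formula, valuation):
    """Formulas are atoms (strings) or tuples ('not', A), ('and', A, B), ('or', A, B)."""
    if isinstance(formula, str):
        return valuation[formula]
    op, args = formula[0], [evaluate(sub, valuation) for sub in formula[1:]]
    return {"not": neg, "and": conj, "or": disj}[op](*args)

def entails(A, B, kind):
    """Check A ~_kind B (kind in 't', 'f', 'q', 'p') over all four-valued valuations."""
    letters = sorted(atoms(A) | atoms(B))
    for combo in product(VALS.values(), repeat=len(letters)):
        v = dict(zip(letters, combo))
        a, b = NAMES[evaluate(A, v)], NAMES[evaluate(B, v)]
        holds = {
            "t": a not in D_PLUS or b in D_PLUS,       # definition (1)
            "f": a in D_MINUS or b not in D_MINUS,     # definition (2)
            "q": a in D_MINUS or b in D_PLUS,          # definition (3)
            "p": a not in D_PLUS or b not in D_MINUS,  # definition (4)
        }[kind]
        if not holds:
            return False
    return True

print(entails(("and", "p", "q"), "p", "t"))           # True: conjunction elimination holds
print(entails(("and", "p", ("not", "p")), "q", "t"))  # False: no explosion in Belnap's logic
```

Since D+ and D− are complements of each other in the Belnap setting, the defining conditions of \( \sim_{\text{t}} \) and \( \sim_{\text{f}} \) become equivalent, which is the coincidence noted above; in the Kleene–Priest matrix, where the value I is neither designated nor antidesignated, the relations need no longer coincide.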

Shramko then defines a “truth order” between entailment relations and obtains another representation of a four-valued logic whose values are formed by the entailment relations defined on the basis of a three-valued quasi-matrix.

Heinrich Wansing in his technically sophisticated talk on Sequent Calculi for Some Trilattice Logics concentrated on the trilattice SIXTEEN3, introduced by Shramko and Wansing in 2005, which is a natural generalisation of the famous bilattice FOUR2. As a further source, Wansing referred to Hilbert-style proof systems for trilattice logics related to SIXTEEN3, one of which was presented by Odintsov in 2008.

Werner Stelzner in his Foundations of a Logic of Consent (Grundzüge einer Zustimmungslogik) developed an analysis of the logical relations between assents, based on the distinction between two different kinds of possible worlds: reality-worlds and (epistemic) set-up-worlds. While the entailment relations in reality-worlds are classically determined, the picture changes radically for set-up-worlds. Not only can there be empty and contradictory set-ups; there is also no general entailment relation between set-ups. Such entailment relations can be found only on the supposition of special types of epistemic subjects who construct their set-ups according to particular entailment relations, limited by the logical abilities and resources of the subjects performing the assents.

In his paper On the Incompatibility of Negative Introspection and Knowledge as True Belief, Manuel Bremer gave a formal treatment of the question of how much, and how reliable, access a cognitive agent has to her non-epistemic beliefs. An agent with ideal self-access or ideal introspective capacities fulfils
$$ {\text{B}}\alpha \supset {\text{BB}}\alpha \quad ({\text{positive}}\,{\text{introspection}}) $$
$$ \neg {\text{B}}\alpha \supset {\text{B}}\neg {\text{B}}\alpha \quad ({\text{negative}}\,{\text{introspection}}) $$
An ideal epistemic subject (which Bremer presupposes) also fulfils logical omniscience and deductive closure:
$$ \delta \alpha \;\Rightarrow\; \delta {\text{B}}\alpha \qquad ({\text{logical}}\,{\text{omniscience}}) $$
$$ \delta (\alpha \supset \gamma ) \;\Rightarrow\; \delta ({\text{B}}\alpha \supset {\text{B}}\gamma ) \qquad ({\text{deductive}}\,{\text{closure}}) $$
Bremer admits that for human agents the last two principles seem unrealistic: neither do we believe or know all logical truths, nor are our beliefs closed under logical consequence. He showed that negative introspection cannot be consistently adopted for a strong notion of knowledge, one that treats knowledge as true conviction.
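As a purely illustrative aside (this is not part of Bremer’s argument), the content of the two introspection principles can be pictured in standard possible-worlds semantics for belief, where Bα holds at a world just in case α holds at every doxastically accessible world; on a finite frame the principles can be checked mechanically by running through all possible extensions of α. A minimal Python sketch with a made-up three-world frame:

```python
from itertools import chain, combinations

def believes(worlds, R, extension):
    """Worlds at which the agent believes a proposition: all accessible worlds satisfy it."""
    return {w for w in worlds if all(v in extension for v in R.get(w, set()))}

def validates(worlds, R, schema):
    """Check a one-variable schema against every possible extension of alpha."""
    subsets = chain.from_iterable(combinations(worlds, n) for n in range(len(worlds) + 1))
    return all(schema(worlds, R, set(s)) for s in subsets)

def positive_introspection(worlds, R, alpha):
    # B(alpha) ⊃ BB(alpha): wherever alpha is believed, it is believed that it is believed
    b = believes(worlds, R, alpha)
    return b <= believes(worlds, R, b)

def negative_introspection(worlds, R, alpha):
    # ¬B(alpha) ⊃ B¬B(alpha)
    b = believes(worlds, R, alpha)
    not_b = set(worlds) - b
    return not_b <= believes(worlds, R, not_b)

worlds = {1, 2, 3}
R = {1: {2, 3}, 2: {3}, 3: {3}}  # transitive but not euclidean accessibility relation
print(validates(worlds, R, positive_introspection))  # True
print(validates(worlds, R, negative_introspection))  # False
```

Positive introspection is validated by every transitive accessibility relation and negative introspection by every euclidean one; the sample frame is transitive but not euclidean, so it validates the first principle and refutes the second.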

2 Colloquium Science and Philosophy for a Complex World

The philosophy of science is traditionally dominated by the analysis of fundamental theories about simple objects. While this has led to many interesting and important insights, it misrepresents the significance of research on far more complicated composite systems in today’s science. This neglect is particularly momentous since complex systems research has its own repertoire of analytical methods, concepts and explanatory strategies, which cannot be reduced to those of fundamental science. The colloquium Science and Philosophy for a Complex World was devoted to a typical kind of such complex systems research.

The physicist Stefan Bornholdt introduced the new discipline of econophysics in his talk The Dynamics of Complex Systems: Models and Approaches by Physicists (Dynamik komplexer Systeme: Modelle und Sichtweisen der Physik). Econophysicists view financial markets as complex systems that can be analysed using techniques and models from physics. The most important indication that financial markets exhibit complex behaviour is the occurrence of unexpected collective behaviour resulting from the repeated non-linear interactions among the market participants. The punch line is the following: financial markets experience far more extreme events, like crashes and bubbles, than one would traditionally expect for random processes. This indicates that the interaction between market participants is of crucial significance. To put it another way, the best explanation for the high probability of extreme events in financial markets involves the assumption that financial markets are complex systems, just like many other composite systems that show a similar tendency towards extreme events in the absence of any dramatic external causes. And statistical physics offers some of the best analyses of systems whose coherent large-scale collective behaviour results from the random interactions of a large number of constituents.
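A rough numerical illustration of this point (this is not Bornholdt’s model; it merely contrasts a Gaussian with a heavy-tailed distribution, using a Student-t with three degrees of freedom as a simple stand-in for empirical return data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Simulated "daily returns": a normal distribution vs. a heavy-tailed Student-t,
# both rescaled to unit standard deviation.
gauss = rng.standard_normal(n)
heavy = rng.standard_t(df=3, size=n)
heavy /= heavy.std()

# Count "extreme" days: moves larger than five standard deviations.
print("Gaussian   |r| > 5 sd:", np.mean(np.abs(gauss) > 5))  # practically zero
print("Heavy tail |r| > 5 sd:", np.mean(np.abs(heavy) > 5))  # orders of magnitude more frequent
```

On the heavy-tailed distribution, five-sigma events are not freak outliers but occur with appreciable frequency, which is the pattern actually observed in financial time series.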

However, the classification of financial markets as complex systems, the very foundation of econophysics, is not a self-evident truth. The economist Thomas Lux showed in his presentation What can Economists Learn from Physics? (Was können Ökonomen aus der Physik lernen?) in which sense economics usually, albeit erroneously, builds on a completely different assessment. According to standard neo-classical equilibrium theory, the economy is to be understood in terms of a representative rational agent who strives to maximise his expected utility. Price formation is a matter of equilibrium between supply and demand, and extreme events, e.g. bubbles and crashes in financial markets, hardly ever occur; if they do, there is a discernible reason, usually an external cause. As Lux points out, econophysics, a field to which he has himself contributed substantially, rests on a very different assessment. According to this approach, extreme events are to be expected with non-negligible probability, and they can arise endogenously, i.e. purely through the normal interaction of market participants without any drastic external disturbance. Lux emphasizes that the ruling ‘conceptual reductionism’ to a representative rational agent forecloses an appropriate understanding of economic problems such as the recent financial crisis. In contrast to standard economics, Lux advocates a model with heterogeneous agents whose uncoordinated interactions can reproduce the statistical characteristics one actually observes in financial markets.

The two philosophers of science in the colloquium, Stephan Hartmann and Meinard Kuhlmann, focussed on the more general philosophical lessons that can be learnt from complex systems research. In his talk Complex Social Networks: The Philosophical Approach (Komplexe Soziale Netzwerke: Der philosophische Zugang) Stephan Hartmann introduced another example of complex systems research, to which he has himself contributed. This research studies how to resolve disagreement on the basis of a particular normative model of consensual decision-making, due to the philosopher K. Lehrer and the mathematician C. Wagner. Besides a critical assessment of this model, Hartmann discusses how philosophers can profit from investigating such complex network models of deliberation processes in social groups. Concerning the general perspective on interdisciplinary complex systems research, be it econophysics or social network models, Hartmann points out that science lives on openness to new methods, and that scientific breakthroughs have often been achieved by importing methods from one scientific field into another. However, Hartmann cautions against drawing premature conclusions from disciplines that are at a very early, i.e. pre-paradigmatic, stage. Moreover, he emphasizes that it is not a stable matter of fact which scientific explanations are rated as satisfactory, or even which phenomena call for an explanation in the first place. Against this background, it is particularly difficult to evaluate new scientific fields such as complex systems research.
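The core of the Lehrer–Wagner model that Hartmann discussed can be stated compactly: each group member assigns a weight to every member (including herself), these weights form the rows of a stochastic matrix W, and the vector of individual opinions is repeatedly averaged, x(t+1) = W·x(t); under mild conditions on W the iteration converges to a single consensus value. A minimal Python sketch with made-up weights and opinions:

```python
import numpy as np

# Row i holds the weights agent i assigns to the agents; each row sums to 1.
W = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.2, 0.2, 0.6],
])

# Initial opinions, e.g. probability estimates for some hypothesis.
x = np.array([0.9, 0.4, 0.1])

for _ in range(50):
    x = W @ x  # each agent replaces her opinion by her weighted average of all opinions

print(x)  # the entries are (approximately) equal: the group has reached consensus
```

Because all weights in this example are positive, the iterated averaging provably converges, and the consensus value is itself a weighted average of the initial opinions.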

As Meinard Kuhlmann highlighted in his talk Structural Mechanisms: Philosophical Reflections (Strukturelle Mechanismen: Wissenschaftsphilosophische Überlegungen), econophysics and similar disciplines deal with a type of complex systems that is radically different from the ones that are often investigated in (the philosophy of) biology. Instead of being compositionally complex, these systems rather exhibit highly non-trivial dynamical patterns on the basis of compositionally simple arrangements of large numbers of non-linearly interacting constituents. Kuhlmann calls this kind ‘dynamically complex systems’. The characteristic dynamical patterns in dynamically complex systems arise endogenously from the local interaction of the system’s parts, largely irrespective of their detailed properties. Kuhlmann’s main objective is to show that dynamically complex systems are not sufficiently covered by the available conceptions of mechanistic explanations and that one needs to employ a more general structural notion of mechanisms.

3 Talks on Philosophy of Science

Among the talks on the philosophy of science there was no prevailing theme but rather a wide range of topics. One noticeable development was that, besides the traditional themes of general philosophy of science, philosophers have started to think deeply about specific philosophical problems concerning the individual sciences, which no longer include only the paradigmatic case of physics but now economics and biology as well.

Carsten Held (Theory Success and Laws of Nature) asked how the success of a scientific theory (the true predictions it makes) could be explained. He rejects the scientific realist’s answer that approximate truth is the best explanation (Putnam), because of the well-known criticism that there have been successful but false theories (Laudan). Van Fraassen’s anti-realist answer that scientific theories are successful because they are empirically adequate is not informative. Stanford’s account, according to which a successful theory is predictively similar to the true theory, does not help either, because it cannot explain the success of the true theory itself. Held argues that satisfactory explanations of a theory’s success have to fulfil two requirements. First, the theory has to involve a deterministic mechanism, because theories involving indeterministic mechanisms can only be successful by chance, and success by chance cannot be explained. Second, the space-time regions of the input and output variables of a theory have to be connected by a law, for if they were not, the regular pattern would be accidental and the success of the theory could not be explained. Hence, Held concludes, there have to be laws of nature connecting regions of space-time, since otherwise the success of our theories could not be explained. Furthermore, these laws cannot be mere regularities (as Mill, Ramsey and Lewis hold in the Humean tradition), because then, again, the success could not be explained: to say that a theory is successful if its predictions are derived from the theorems of the best true theory (Lewis) does not explain the success of the best true theory itself.

Simon Deichsel (Against Realism in the Philosophy of Economics—Wider den Realismus in der Wissenschaftstheorie der Ökonomie) examined the recent revival of epistemic realism in the philosophy of economics (Mäki, Lawson). Against this alleged realism, Deichsel argues that neither Mäki nor Lawson in fact supports strong epistemic realism, because both concede the fallibility of economic theories. Strong epistemic realism combined with the fallibility of theories is no longer epistemic realism in the strong sense; it collapses into a position that antirealists can accept as well, namely that economic theories can be true (weak epistemic realism). Hence the quarrel between realists and antirealists in the philosophy of economics seems to rest on a misunderstanding, since the alleged realists do not in fact hold the position that the anti-realists are attacking. Furthermore, neither Mäki nor Lawson solves the fundamental problem for strong epistemic realism, namely the problem of finding a “correspondence criterion” that would indicate whether an empirically adequate theory is actually true. Given this analysis, even realism as a normative position becomes questionable, because all the criteria employed by realists can be employed by anti-realists as well: if it is not an ontological claim, the slogan “more realism” would only amount to requiring theories to fit our present belief system better. Deichsel eventually argues that even this “methodological” realism is unwise to adopt, because it may prevent the generation of useful abstract theories.

Marie I. Kaiser (A Mechanistic Understanding of Reduction—Ein mechanistisches Verständnis von Reduktion) considered the question of what it means for one explanation to be reduced to another (explanatory reduction, in contrast to Nagel’s programme of theory reduction). She distinguishes two kinds of explanatory reduction: first, a single explanation can be reductionist if it explains the behaviour of a system by referring to the behaviour of its parts (part-whole explanation); second, one explanation can reduce another explanation of the same phenomenon if the first belongs to a different “level”. Kaiser argues for a new model of explanatory reduction based on the concepts of “mechanism” in recent publications (Machamer, Darden, and Craver; Glennan). This mechanistic understanding of explanatory reduction opens new possibilities for solving the problem of reduction in biology, because mechanists do not focus merely on the parts of a system and their interactions but can also take into account the complex organization of the parts as a prerequisite for productive interactions. Furthermore, mechanists recognize that mechanisms are organized in multilevel hierarchies. This allows the high importance of context and of higher levels in mechanistic explanations to be acknowledged and suggests a multi-level reductionism (in contrast to a single-level reductionism on which all biological phenomena are reduced to one level). Kaiser concludes that these features of a mechanistic model of explanation are adequate to biological practice and to the complex relations between biological phenomena, and therefore show good prospects for solving the problem of reduction in biology.

Alexander Reutlinger (A Four-Dimensional Theory of Laws of Nature) explicated the non-universal character of ceteris paribus laws in the special sciences. His analysis distinguishes four dimensions of universality: (1) Universality of space and time: laws are universal if they hold for all space-time regions. (2) Universality of the domain of application: laws are universal if they hold for all (kinds of) objects. (3) Universality for external circumstances: laws are universal if they hold under all external circumstances. (4) Universality with respect to the values of variables: laws are universal if they hold for all possible values of the variables in the law statement. According to the four-dimensional theory of non-universal laws, a statement S is a law in the special sciences if S is a generalization that is (a) a system law, (b) comparative and (c) stable. In his talk Reutlinger focussed on the third dimension, which faces the following dilemma (Lange 1993): without ceteris paribus clauses non-universal laws are false (because in a strict sense not all Fs are Gs), while with ceteris paribus clauses they become trivially true (because the ceteris paribus clause could comprise any disturbing factors, so that the law would amount to “all Fs are Gs or ¬(all Fs are Gs)”). Reutlinger presented three strategies to avoid this dilemma. First, the method of quasi-Newtonian laws relies on the fact that, typically, laws are part of a theory or a model. A disturbing factor with respect to a law L (e.g. Newton’s First Law) can then be described by another law L* (e.g. Newton’s Second Law) in the same theory or model. The ceteris paribus law L avoids falsity because the occurrence of a disturbance does not render the law in question false; instead, L* describes its influence. And L avoids triviality because it is not committed to the crucial expression ‘if nothing interferes’. Second, according to the method of negligibility scientists only have to refer to relevant interfering factors, i.e. factors that arise sufficiently often and can cause relevant deviations from G-hood. This method avoids falsity because the occurrence of a negligible factor, e.g. a comet, does not render the law false; and it avoids triviality because non-negligible disturbances can disconfirm a law. Third, the method of intended application assumes that some interfering factors can be declared exogenous, because they do not belong to the law’s (and the discipline’s) intended purposes and applications. Triviality is avoided because taking some factors as exogenous does not preclude empirical testing; and falsity is avoided because ‘wiggling’ the exogenous variables does not produce a counterinstance to the law.

Andreas Hüttemann (Non-trivial Ceteris Paribus Laws (in Physics)—Nicht-triviale Ceteris-paribus-Gesetze (in der Physik)) addressed the same dilemma for ceteris paribus laws (first horn: falsity; second horn: triviality) as Reutlinger (see above). He argues that laws have to be understood as ascribing dispositions to the systems in question. On this reading of laws the horn of falsity is avoided, because laws are not meant to be universal: they apply only to the circumstances specified by the manifestation conditions of the disposition (namely the ceteris paribus clause). In order to avoid the second horn as well, Hüttemann claims that ceteris paribus laws have to be understood according to (what Reutlinger called) the method of quasi-Newtonian laws: disturbing factors that are covered by the ceteris paribus clause of a certain law L must be quantitatively described by another law L* of the theory. The paradigm case is Newton’s First Law, according to which every body remains in its state of motion unless forces act upon it (the cp clause); the influence of the forces, however, is quantitatively described by Newton’s Second Law. Ceteris paribus laws thus become scientifically respectable if they are integrated into experimental and theoretical methodologies in a way that allows the influence of disturbing factors to be quantified.
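To spell the paradigm case out in formulas (a standard textbook rendering, not taken from the talk): the First Law, read as a ceteris paribus law or disposition ascription, and the Second Law, which quantifies the influence of exactly the factors excluded by the cp clause, can be written as
$$ L:\quad F_{\text{total}} = 0 \;\Rightarrow\; \frac{dv}{dt} = 0 \qquad (\text{cp clause: no forces act}) $$
$$ L^{*}:\quad \frac{dv}{dt} = \frac{F_{\text{total}}}{m} \qquad (\text{quantifies the disturbing factors}) $$
A non-zero force is thus not a counterinstance to L; its effect on the state of motion is described by L*, which is what makes the ceteris paribus law non-trivial and empirically testable.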

4 Conclusion

The congress title “Reflections and Projections: Challenges to Philosophy” invited participants to consider challenges to philosophy, and the colloquium Science and Philosophy for a Complex World suggested that complexity is one of these big challenges. In a complex world, traditional scientific models and results are not sufficient for making rational decisions about the complex problems that arise in all realms of modern societies (health care, law, technology, politics, economics,…). In the face of this situation, the expertise of philosophers in evaluating new scientific results concerning such complex systems becomes increasingly important.

Copyright information

© Springer Science+Business Media B.V. 2011