Philosophy, Mathematics, Linguistics: Aspects of Interaction 2014

International Interdisciplinary Conference held on April 21-25, 2014

All plenary and thematic sessions and the panel discussion take place in the EIMI building at:
10 Pesochnaya naberezhnaya, St. Petersburg. For more details, see the Venue page on the conference website.

Abstracts of the talks

Invited Talks

G. A. Chaitin (Federal University of Rio de Janeiro, Brazil)
Conceptual Complexity and Algorithmic Information
Abstract: In this essay we propose that the fundamental philosophical concept of conceptual complexity is captured mathematically by the notion of algorithmic information content, and we discuss the complexity of physical and mathematical theories, the complexity of biological mutations, and the most complex system in biology, the human brain.

E. Dragalina-Chernaya (Higher School of Economics, Moscow, Russia)
Logical hylomorphism revisited
Abstract: The aim of this paper is to systematize the variety of logical hylomorphism according to different types of formal relations. Various explications of substantial and dynamic formality will be sketched. My larger purpose is to discuss demarcation principles for the bounds of logic as formal ontology and formal deontology.

D. Grigoriev (CNRS - Université Lille 1, Lille, France)
Analog computations: past and future?
Abstract: Analog computation devices were in use long before modern digital computers. Nowadays interest is growing in theoretical models of analog computers: the most popular are quantum and adiabatic ones; there are also speculations about computers based on the laws of classical physics and their applications to cryptography (replacing the role of one-way functions). We propose to discuss some theoretical aspects arising in connection with analog computations, in particular Church's thesis and the P vs NP problem.

Yu. Gurevich (Microsoft Research, Redmond, WA, USA)
On Semantics-to-Syntax Analyses of Algorithms
Abstract: Alan Turing pioneered semantics-to-syntax analysis of algorithms. It is a kind of analysis where you start with a large semantically defined species of algorithms, and you finish up with a syntactic artifact, typically a computation model, that characterizes the species. The task of analyzing a large species of algorithms seems daunting if not impossible. As in quicksand, one needs a support, a rescue point, a fulcrum. In computation analysis, a fulcrum is a particular viewpoint on computation that clarifies and simplifies things to the point that analysis becomes possible. We review from that point of view Turing’s analysis of human-executable computation, Kolmogorov’s analysis of sequential bit-level computation, Gandy’s analysis of a species of machine computation, and our own analysis of sequential computation.

K. Mainzer (Graduate School of Computer Science, Technische Universität München, Carl von Linde Academy)
The Effectiveness of Complex Systems. A Mathematical, Computational, and Philosophical Approach
Abstract: "The unreasonable effectiveness of mathematics" in the natural sciences, noted by Wigner, is also manifested in the neural and cognitive sciences through complex systems. The local activity principle is mathematically applied to Hodgkin-Huxley neurons and nanoelectronic circuits to generate action potentials and pattern formation (“cell assemblies”) in the brain. Neural cell assemblies are correlated with cognitive and mental states. Therefore, in technology and philosophy, the question arises whether mind and brain are computable. Dynamical systems such as neural networks can be defined as digital models on natural or rational numbers with difference equations, or as analog models by differential equations on real and complex numbers. Therefore, the concepts of computability and complexity must be extended from digital to analog applications. The effectiveness and computability of mathematical models depend on different degrees of complexity. It can be proven that the computational complexity of different automata and machines in computer science is not only equivalent to the dynamical complexity of different dynamical systems in systems science, but also to the logical complexity of different languages and grammars in linguistics.
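The contrast drawn above between digital models defined by difference equations and analog models defined by differential equations can be illustrated with a toy example. The logistic growth law below is our illustrative choice, not an example taken from the talk:

```python
# Toy illustration (not from the talk): the same growth law as a
# digital model (difference equation) and as an analog model
# (differential equation, integrated numerically with Euler steps).

def digital_logistic(r, x0, steps):
    """Difference equation x_{n+1} = r * x_n * (1 - x_n)."""
    x = x0
    traj = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        traj.append(x)
    return traj

def analog_logistic(r, x0, t_end, dt=1e-3):
    """Differential equation dx/dt = r * x * (1 - x), Euler-integrated."""
    x = x0
    t = 0.0
    while t < t_end:
        x += dt * r * x * (1 - x)
        t += dt
    return x

# The digital model with r = 4 behaves chaotically, while the analog
# model converges monotonically to the fixed point x = 1.
print(digital_logistic(4.0, 0.2, 3))    # orbit wanders in [0, 1]
print(analog_logistic(1.0, 0.2, 20.0))  # approaches 1.0
```

The point of the sketch is that the two models share one law but have qualitatively different dynamics, which is why computability notions calibrated for the digital case need extension to the analog case.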

Yu. I. Manin (Max–Planck–Institut für Mathematik, Bonn, Germany)
Point, Atom, Letter
Abstract: In this essay, I am considering the emergence and evolution of the mathematical notion of point on the background of history, on the one hand, and psychology combined with neuroscience, on the other hand. More precisely, I survey the historical development of human collective cognitive processes, and I trace parallelisms between the origins of geometry, of atomism, and of alphabetic writing, respectively. I argue that they were correlated with the general dynamics of the “left brain – right brain” balance in various cultures.

G. Mints (Dept. of Philosophy, Stanford University, CA, USA)
Several Interesting Problems in Applied Logic
Abstract: Three problems arising in the context of program verification and safe programming are discussed here: sawing a big propositional proof, the existence of short proofs in the logic of quantified propositional Boolean formulas, and solving logical equations in first-order arithmetic.

A. N. Parshin (Steklov Institute of Mathematics RAS, Moscow, Russia)
A staircase of reflections: from gnoseology to anthropology
Abstract: In 1908, the Russian philosopher Pavel Florensky published a paper, “The Limits of Gnoseology”, dedicated to the analysis of the act of cognition, the acquisition of new knowledge. His approach is a development of the philosophical ideas of Fichte and Schelling. A characteristic feature of the work is an explicit introduction of infinity into the act of cognition. In addition, he made substantive use of the theory of infinite groups. We offer a development of Florensky's ideas, connecting them with his other, much better known works, such as “Iconostasis” and “Imaginaries in Geometry”, which describe the boundary between the intelligible and sensible spaces. Analogies with the act of cognition in quantum theory and with human anthropology will also be described.

O. B. Prosorov (St. Petersburg Department of V. A. Steklov Institute of Mathematics RAS, Russia)
Linguistic universals of topological nature
Abstract: We discuss the topologies and partial-order structures underlying the process of understanding a natural language text. We argue that these mathematical structures have properties which are shared by natural language texts intended for human understanding, and which may be thought of as linguistic universals of a topological nature.

A. Rodin (Institute of Philosophy RAS, Moscow - St. Petersburg State University, Russia)
Objectivity, Objecthood and Genetic Axiomatic Methods in Categorical Mathematics
Abstract: In 1934 Hilbert and Bernays distinguished between their novel notion of formal axiomatic method, which later became standard, and a more traditional notion of axiomatic method that they describe as "constructive" or "genetic". While the traditional genetic method requires building theoretical objects from given primitive objects according to certain construction rules (think of points and ruler-and-compass constructions in traditional geometry), the modern formal axiomatic method uses what Hilbert and Bernays describe as "existential form", i.e., an assumption that any given consistent theory has certain models which, generally, are simply posited rather than constructed. At the same time the formal method allows for constructive (genetic) procedures applied (not to theoretical objects proper but) to formulas expressing properties of (and relations between) the theoretical objects. Thus the formal method does not exclude the genetic method altogether but restricts its application to (informal) meta-mathematics.
      Although the Hilbert-style axiomatic setting still provides an "official" picture of what a well-founded mathematical theory should look like, its application in mathematical practice remains very limited. I argue that certain important developments in the axiomatic thinking of the 20th and 21st centuries, including Lawvere's axiomatic topos theory and Voevodsky's axiomatic homotopy theory (developed along with his project of building new Univalent Foundations of mathematics), can be adequately described as a revival of the traditional genetic approach in new contexts. I show how these developments help us to bridge the existing gap between the foundations of mathematics and the foundations of physics, and thus give us some sensible strategies for applying modern mathematics in the natural sciences. On the basis of my analysis of the new genetic methods in mathematics I propose an original account of mathematical objectivity and of mathematical objecthood.

A. Slissenko (LACL, Université Paris-Est Créteil, France)
In Quest of Information in Algorithmic Processes
Abstract: Intuitively, an algorithm computing the value of a function for a given input extracts information from the input and processes this information. During this work the information being processed ‘converges to the result’. We show by examples that one can measure this information convergence. No general theory is presented, though some questions that arise on the way towards such a theory are discussed. We start with an approach based on approximations of the graph of the computed function; we call this approach semantical. It works only for algorithms that mainly calculate but do not analyze the information being processed. We extend this approach by using syntactical considerations. The principle underlying the estimations of information convergence is that of ‘maximal uncertainty’, which we impose on the algorithm under analysis. The problem of information convergence brings up the conceptual questions “What is information?” and “What is uncertainty?”, related to similar questions studied more and more intensively in philosophy.

V. L. Vasyukov (Chair of History and Philosophy of Science, Institute of Philosophy RAS, Moscow)
Univalent Foundations of Mathematics and Logical Pluralism
Abstract: Vladimir Voevodsky writes in his Univalent Foundations Project that univalent foundations can be used both for constructive and for non-constructive mathematics. The latter point is of particular interest, since it suggests an opportunity to extend the univalent approach to non-classical mathematics. In general, the Univalent Foundations Project allows the exploitation of structures on homotopy types instead of structures on sets or structures on categories, as in set-level or category-level mathematics. Non-classical mathematics should accordingly be considered either as non-classical set-level mathematics or as non-classical category-level (topos-level) mathematics. Since it is possible to formalize the world of homotopy types directly, using in particular Martin-Löf type systems, the task is to pass to non-classical type systems, e.g. da Costa paraconsistent type systems, in order to formalize the world of non-classical homotopy types. Taking into account that the univalent model takes values in the homotopy category associated with a given set theory, and that to construct this model one usually first chooses a locally cartesian closed model category (in the sense of homotopy theory), in trying to extend this scheme to non-classical set theories (paraconsistent, quantum, relevant, etc.) we need to evaluate the respective non-classical homotopy types not in cartesian closed categories but in more suitable ones. In any case it seems that such a Non-Classical Univalent Foundations Project should be developed according to the Logical Pluralism paradigm, and it seems difficult to find a counter-argument of logical or mathematical character against such an opportunity, except for the globality and complexity of such an enterprise.

N. A. Vavilov (Dept. of Mathematics and Mechanics, St. Petersburg State University)
Complexity and reliability of proofs and computations
Abstract: In the talk I plan to discuss the following phenomenon: a short formal or computational proof that we do not understand may be much less reliable than a very long conceptual proof that we do understand. I will illustrate this thought with many real-life examples. In particular, we could not find a single formula longer than 2.5 inches in recent number theory books that would withstand computer verification. I will also discuss what it really means to verify an extremely long and complex proof, such as that of the Feit–Thompson theorem, by computer.

A. Vershik (St. Petersburg Department of V. A. Steklov Institute of Mathematics RAS, Russia)
Does freedom of choice exist in mathematics?
Abstract: I discuss the question of why mathematics differs from other sciences. The main reproach addressed to mathematicians by other scientists, first of all physicists, is: ”You can choose to study whatever you desire, whereas we are bound by strict rules: to look at and study only Nature.” Is this right? In a sense yes, but this is a rather superficial view of the mathematical universe. If we look carefully, we can also see strict rules (at least for some mathematicians) governing the choice of subjects, methods, etc., but these rules are completely opposite to the dictate of Nature in the natural sciences. I will try to explain why one of the basic principles of mathematics is the aesthetic one, and why the apparent arbitrariness in the choice of subjects and methods of mathematical research is nothing other than a manifestation of rigid aesthetic requirements.

Contributed Talks

H. Benis Sinaceur (IHPST, Paris, France)
Facets and Levels of Mathematical Abstraction
Abstract: Mathematical abstraction is the process of considering and manipulating operations, rules, methods and concepts divested of their reference to real-world phenomena and circumstances, and deprived of the content connected with particular applications. There is no single way of performing mathematical abstraction. The term “abstraction” does not name a unique procedure but a general process, which goes many ways that are mostly simultaneous and intertwined; in particular, the process does not amount only to logical subsumption. I will consider comparatively how philosophers consider abstraction and how mathematicians perform it, with the aim of bringing to light the fundamental thinking processes at play, and of illustrating by significant examples how intricate and multi-leveled the combination of typical mathematical techniques may be, including the axiomatic method, invariance principles, equivalence relations and functional correspondences.

H. Graves (Algos Associates, Fort Worth, TX, USA)
A Practical Doctrine for Mathematical Applications
Abstract: A doctrine, in the sense of Jon Beck, is outlined for representing and reasoning about mathematical applications. The doctrine is a 2-category whose objects are axiom sets and whose morphisms are functors. The use of this doctrine for developing and reasoning about axiom sets corresponds closely to informal practice, but differs from textbook development. An application axiom set is specified by a signature and formulae in the language of the signature. Each application axiom set uses a base language with term constructions from topos theory. The axioms are Horn rule axioms. These rule axiom sets generate a topos as their deductive closure. A first-order logic is used to express the axioms, but it extends standard presentations in that terms are allowed to have decidable preconditions for being well-formed. Constructions such as composition of maps are defined as function terms. The axiom sets are represented as tuples within the 2-category doctrine. The 2-category is a meta-logic for operating on axiom sets and maps between them. The doctrine is also a specification for a class of software tools for developing and analysing axiom sets that represent applications.

R. Kahle (CENTRIA and DM, FCT, Universidade Nova de Lisboa, Caparica, Portugal)
The logical cone. A new account of counterfactuals
Abstract: We give a new account of counterfactuals, based on the question: which facts, relevant to the consequent, may depend on the fact stated in the antecedent?

E. F. Karavaev (Dept. of Logic, St. Petersburg State University, Russia)
One way to determine the intervals in hybrid temporal logic
Abstract: This presentation discusses the possibility of improving the technical machinery of hybrid temporal logic through the introduction of time intervals. In constructing intervals, the author follows ideas expressed and developed by A. A. Markov in an article published in 1932. Thus the ‘Priorean paradigm’ of understanding logic (temporal qualification of judgements and the idea of hybrid logic) is complemented by the construction of a time metric based on the relation ‘earlier than’. It seems that the described improvement of the machinery of temporal logic makes it possible, in particular, to refine approaches to the modelling of planning and strategic management.

V. Kreinovich (University of Texas at El Paso, USA),
O. Kosheleva (University of Texas at El Paso, USA)
Logic of Scientific Discovery: How Physical Induction Affects What Is Computable
Abstract: Most of our knowledge about the physical world comes from physical induction: if a hypothesis is confirmed by a sufficient number of observations, we conclude that this hypothesis is universally true. We show that a natural formalization of this property affects what is computable when processing measurement and observation results, and we explain how this formalization is related to Kolmogorov complexity and randomness. We also consider computational consequences of an alternative idea, also coming from physics: that no physical law is absolutely true, and that every physical law will sooner or later need to be corrected. It turns out that this alternative approach enables us to use measurement results to go beyond what is usually computable.

A. Lecomte (Laboratoire SFL - CNRS - Université Paris 8, Saint-Denis, France)
An Interaction Framework for Dialogue
Abstract: Many considerations lead to the idea that interaction is at the heart of language. Conversations are at the basis of the development of language, yet only a few formal works attempt to show how preeminent this feature is. Most works in formal semantics, for instance, are oriented toward model-theoretic aspects: linguistic expressions are seen merely as describing situations. Even if this is, of course, not completely wrong, it seems that most expressions do not take their references directly from the environment, but from reduction steps in the interaction of two participants in a dialogue. Theoretical computer science has given a paradigm for such interaction, starting from the reduction of λ-terms, which corresponds to proof normalization. New approaches extend this view by bringing into the picture not only proofs but also counter-proofs, thus leading to Girard’s Ludics, which provides several tools to express the dynamics of dialogue.
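The reduction of λ-terms that corresponds to proof normalization can be made concrete with a minimal normalizer. The term encoding and names below are a generic sketch of β-reduction, not part of Lecomte's framework:

```python
# Minimal sketch of beta-reduction, the term-side image of proof
# normalization. Terms: ('var', name) | ('lam', name, body) |
# ('app', fun, arg). Substitution is capture-avoiding via renaming.

import itertools
_fresh = itertools.count()

def free_vars(t):
    tag = t[0]
    if tag == 'var':
        return {t[1]}
    if tag == 'lam':
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])

def subst(t, x, s):
    """Compute t[x := s], renaming bound variables to avoid capture."""
    tag = t[0]
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'app':
        return ('app', subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:
        return t
    if y in free_vars(s):                  # rename the bound variable
        z = f'{y}_{next(_fresh)}'
        body = subst(body, y, ('var', z))
        y = z
    return ('lam', y, subst(body, x, s))

def normalize(t):
    """Normal-order reduction: contract head redexes first."""
    tag = t[0]
    if tag == 'app':
        f = normalize(t[1])
        if f[0] == 'lam':
            return normalize(subst(f[2], f[1], t[2]))
        return ('app', f, normalize(t[2]))
    if tag == 'lam':
        return ('lam', t[1], normalize(t[2]))
    return t

# (\x. \y. x) a b  reduces to  a
K = ('lam', 'x', ('lam', 'y', ('var', 'x')))
term = ('app', ('app', K, ('var', 'a')), ('var', 'b'))
print(normalize(term))  # ('var', 'a')
```

Each call to `normalize` performs the reduction steps that, in the interactive reading sketched in the abstract, correspond to the moves of the two participants.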

L. I. Manevitch (Institute of Chemical Physics, Moscow, Russia)
Asymptotic thinking as a philosophical principle
Abstract: It is noted that asymptotic thinking, being an efficient mathematical tool, is, similarly to symmetry analysis, also a philosophical principle reflecting essential features of the cognition process. While the clearest manifestation of this principle is achieved in dynamical systems, its numerous applications also relate to physics, biology and the humanities.

S. I. Nikolenko (St. Petersburg Department of V. A. Steklov Institute of Mathematics RAS, Russia),
S. Koltsov (National Research University Higher School of Economics, St. Petersburg, Russia),
O. Koltsova (National Research University Higher School of Economics, St. Petersburg, Russia)
Measuring Topic Quality in Latent Dirichlet Allocation
Abstract: Topic modeling is an important direction of study in modern text mining; unsupervised mining of collections of topics is intended to produce understanding and capture the essence of the issues a dataset is devoted to. However, existing techniques for topic evaluation in topic models such as latent Dirichlet allocation (LDA) are still limited in their ability to capture human interpretability and usefulness for qualitative studies. In this work, we propose a novel topic quality metric that corresponds to human judgement more closely than existing ones. We support this claim with the results of an experimental study in which test subjects rate LDA topics on how interpretable they are.
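The abstract does not specify the proposed metric. As background, a common baseline for topic interpretability is co-occurrence-based coherence; the sketch below scores a topic by average pairwise PMI of its top words (a standard baseline, not the authors' metric, and the corpus shown is invented for illustration):

```python
# Sketch of a standard co-occurrence coherence score for a topic's
# top words. NOT the metric proposed in the talk: just the common
# baseline of average pairwise PMI over document co-occurrences.

import math
from itertools import combinations

def topic_coherence(top_words, documents, eps=1e-12):
    """Average pairwise PMI of top_words over a list of token lists."""
    n_docs = len(documents)
    doc_sets = [set(d) for d in documents]
    def p(*words):
        return sum(all(w in s for w in words) for s in doc_sets) / n_docs
    scores = []
    for w1, w2 in combinations(top_words, 2):
        joint, marg = p(w1, w2), p(w1) * p(w2)
        scores.append(math.log((joint + eps) / (marg + eps)))
    return sum(scores) / len(scores)

docs = [['apple', 'fruit', 'tree'],
        ['apple', 'fruit', 'pie'],
        ['car', 'road', 'fuel'],
        ['car', 'road', 'trip']]
print(topic_coherence(['apple', 'fruit'], docs))  # positive: co-occur
print(topic_coherence(['apple', 'road'], docs))   # negative: never co-occur
```

Coherent topics (words that co-occur more than chance predicts) score above zero; incoherent ones score below, which is the sense in which such scores are used as proxies for human interpretability.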

A. B. Patkul (Department of Ontology and Epistemology, St. Petersburg State University, Russia)
The Problem of the Logic’s Destruction and Grounding in Phenomenology
Abstract: The article is dedicated to the problem of the critical grounding of formal logic (traditional as well as contemporary symbolic logic) in phenomenological philosophy. The examples of such critical grounding of this discipline in E. Husserl and M. Heidegger are examined here in more detail. The following theses are established: Husserl believes that logic is grounded in the constitutive activity of transcendental consciousness, while Heidegger thinks that logic has its origin in the human understanding of being and therefore in the usage of the copular verb in the sense of “presence-at-hand”. Moreover, in Heidegger's opinion, logic is read off from nature as the domain whose being is presence, and the application of logic to other domains of being is an unjustified extrapolation. In conclusion, the article raises the question whether such a phenomenological “destruction” of logic is merely its negation or a possible way to ground it outside the traditional shape of logic.

V. Perminov (Faculty of Philosophy, M. V. Lomonosov Moscow State University, Russia)
Praxeological Substantiation of Logic
Abstract: Logical norms can be understood as a formal criterion of truth which makes it possible to eliminate untrue statements on the basis of their form. The background of logic is the insight into full (absolute) truth, as distinct from the relative truth of science. For a more adequate understanding of the nature of logic, we should rehabilitate the old apriorism, giving it a new, praxeological substantiation.

E. Rivello (Scuola Normale Superiore, Pisa, Italy)
Eliminating the Ordinals from Proofs. An Analysis of Transfinite Recursion
Abstract: Transfinite ordinal numbers enter mathematical practice mainly via the method of definition by transfinite recursion. Outside of axiomatic set theory, there is a significant mathematical tradition of works recasting proofs by transfinite recursion in other terms, mostly with the intention of eliminating the ordinals from the proofs. Leaving aside the different motivations behind each specific case, we investigate the mathematics of this activity of proof transformation and address the problem of formalising the philosophical notion of elimination which characterises this move.

V. A. Shaposhnikov (Faculty of Philosophy, Lomonosov Moscow State University, Russia)
The Applicability Problem and a Naturalistic Perspective on Mathematics
Abstract: The paper outlines a philosophical account of the interplay between pure and applied mathematics. This account is argued to harmonize well with the naturalistic philosophy of mathematics. The autonomy of mathematics is considered as a transitional form between theological and naturalistic views of mathematics. From the naturalistic standpoint, it is natural to understand pure mathematics through applied mathematics but not vice versa. The proposed approach to mathematics is interpreted as a revival of Aristotle’s philosophy of mathematics and owes a lot to James Franklin. Wigner’s puzzle of applicability is explained away as a survival of the positivist philosophy of mathematics.

S. Soloviev (IRIT, Université Toulouse 3)
Context-dependent invertibility, isomorphism and subtyping in type theory: possible linguistic applications
Abstract: The aim of this note is to attract attention to context-dependent invertibility of terms in type theory and to the related notions of context-dependent isomorphisms and coercions, a relatively understudied topic that may be of interest for applications, especially in linguistics, where meaning naturally depends on context.

A. Spaskov (Institute of Philosophy, Academy of Sciences of Belarus, Minsk, Belarus),
O. Kozina (Moscow Social-Psychological University, Russia)
Number and Time
Abstract: The paper deals with the nature of mathematical concepts, the genesis of natural numbers and the temporal structure of consciousness. We analyze the arithmetic model of time and propose a new geometrical model of three-dimensional time, based on the hypothesis of independent time dimensions corresponding to external linear and internal cyclic time.

Sh. Steinert-Threlkeld (Department of Philosophy, Stanford University, CA, USA)
On the Decidability of Iterated Languages
Abstract: A special kind of substitution on languages called iteration is presented and studied. We show that each of the classes of star-free, regular, and deterministic context-free languages is closed under iteration, and that it is decidable whether a given regular language or DCFL is an iteration of two languages. We also determine the state complexity of iteration of regular languages. Connections to the van Benthem / Keenan ‘Frege Boundary’ are discussed.

V. Stepanov (Dorodnicyn Computing Centre RAS, Moscow, Russia)
Truth Theory for Logic of Self-Reference Statements as a Quaternion Structure
Abstract: The article aims to give linguists a tool which they can use to study the mechanism by which statements refer to other statements, including to themselves. For this purpose a self-reference quantifier is introduced, and an approximation of the self-reference quantifier on sequences of statements of a language is given. A language model based on discrete dynamical systems is defined. In the dynamic model of self-reference statements developed by the author, it turns out that for a language with the propositional connectives of biconditional (<—>) and negation (˜), the truth table of the biconditional for formulae of the types True, Liar, TruthTeller and (TruthTeller<—>Liar) is the Cayley table of the Klein four-group V. This suggests that the truth space of values of self-reference statements is described by the quaternion algebra H.
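For reference, the Cayley table of the Klein four-group V mentioned in the abstract, with generic labels e, a, b, c (the identification of the four self-referential statement types with these elements is the author's and is not reproduced here):

```latex
% Cayley table of the Klein four-group V = {e, a, b, c}:
% every element is its own inverse, and the product of any two
% distinct non-identity elements is the third.
\[
\begin{array}{c|cccc}
\cdot & e & a & b & c \\ \hline
e & e & a & b & c \\
a & a & e & c & b \\
b & b & c & e & a \\
c & c & b & a & e
\end{array}
\]
```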

D. Tiskin (Dept. of Logic, St. Petersburg State University, Russia)
Transparent Evaluation and Pronouns in Attitude Reports
Abstract: The paper outlines an account of Simon Charlow’s data concerning de se and de re readings of pronouns and anaphors in attitude contexts. Using Arnim von Stechow’s binding technique, as well as insights about the internal structure of pronouns (due to Rose-Marie Déchaine and Martina Wiltschko) and about transparent readings of predicates (due to Yasutada Sudo), I treat de se readings as primitive and de re ones as derived. An additional assumption about the semantics of reflexives is used to explain why, as shown by Charlow, a de se anaphor cannot be bound by a de re subject. Next, I show another direction within the problem of anaphora in which one might proceed with the treatment of pronouns found in Déchaine and Wiltschko’s paper. Finally, comparing my proposal with its predecessors, I touch upon the issue of the extent to which a semantic theory should be philosophically laden.

J. A. Wislicki (Faculty of Polish Studies, University of Warsaw, Poland)
Semantics of quotation. Against the functional approach to quotation
Abstract: The aim of this paper is twofold. First, it argues against the functional approach to quotation, according to which enquotation is a syntactic map that delivers expressions of the metalanguage. Second, it defines a semantic operation that makes it possible to express the meaning of quotation without running into semantic inconsistencies. I discuss the semantic expressive power of the most influential functional theories of quotation and show the problems that arise in that kind of approach. Then I draw an important connection between Reichenbach's idea of the so-called 'arrow quotes' and the account of quotational context given by Pagin and Westerståhl. The core ideas of both proposals form the basis of a semantic account that allows the meaning of quotation to be calculated via composition principles without running into semantic inconsistencies.

Last Update: April 23, 2014