A note on the metaphilosophy, and metametaphysics, of information

Metaphilosophy is the philosophy of how to approach questions in a particular domain of philosophy (very roughly, and not quite correctly: philosophy about philosophy). Metametaphysics asks what the best way is, in methodological and conceptual terms (at minimum), to approach and pose questions in metaphysics.

I will not approach the issue here, but metaphysics itself has a long history of facing calls to be declared extinct. You can refer to almost any work by Rudolf Carnap or A. J. Ayer, or by almost any late 20th century expressivist, for evidence of this. This challenge to metaphysics is itself an example of both metametaphysics and metaphilosophy. The metametaphysics of most expressivists, as well as their error-theoretic metaphilosophy about metaphysics, says, essentially: don't bother with metaphysics. Certainly, not everyone agrees with expressivists about this. Many philosophers of science do not. The philosophy and metaphysics of information are an interesting case in point, and a proving ground for this anti-metaphysical challenge, if for no other reason than that basic questions about the nature of information simply do not appear to be approachable without using metaphysics. The same problem, oddly enough, exists in logic - especially around questions of the relationship between semantics and syntax (although, again, many pragmatist expressivists would dispute this). If you think the same problem does not exist in logic, then consider that logicians have been trying to figure out whether there is information flow in deduction, whether information is based upon logic, and whether logic is based upon information (of some kind, somehow) for - ooh, gee whiz - at least 40 years? (Reference Set 1)

One reason why the nature of information presents us with such difficult questions is that the philosophy of information, and the questions it embodies, sit at a kind of nexus of logic, semantics, the philosophy of mathematics, the philosophy of probability, and epistemology. The nature of information is thus a multifaceted and difficult question in metaphilosophical and metametaphysical terms. Add questions about the nature of semantic information, and the problems become worse.

The nature of information, and pluralism about information, are a long-running debate in the philosophy of science and the philosophy of information. The nature and definition of semantic information is one of the hardest problems in both metaphysics and classical and non-classical logic (Google Hintikka and the Scandal of Deduction). There is also significant overlap between questions of pluralism in logic, or logical pluralism, and pluralism about the nature of both physical and semantic information.

Leading philosopher of information Fred Dretske proposed that the veridicality thesis - the thesis that information must be true - holds. He did so in support of his informationist process-reliabilist epistemology (knowledge is sustained by causally reliable information processes). However, this is (1) by no means certain, nor decided, and (2) even if true in some contexts, the veridicality thesis may be 'vulnerable' to overriding pluralism about the nature of semantic information, as well as to reinterpretation in different problem domains at different levels of abstraction.

Information, information transmission, and semantic information have all been given competing conceptions and analyses by very talented philosophers of science and logicians. Luciano Floridi, Gregory Bateson, Fred Dretske, Pieter Adriaans, Jaakko Hintikka, Ruth Millikan, Gregory Chaitin, Jeremy Seligman, Jon Barwise, John Etchemendy, and John Perry are just a few famous philosophers offering competing accounts.

If anyone tells you that the nature of information is a decided matter, or that one person's view is definitely wrong, then proceed with great caution and exercise healthy scepticism. Shannon's theory is central in some way to most accounts. Yet some of the most mathematically astute recent accounts of the nature of information have serious problems. Agreement is far from within reach.

One of my favourite examples, and 'stalking horses', is the probabilistic difference maker theory (PDMT) of Andrea Scarantino. It is technically sophisticated, and it has a very sound basis in Shannon's mathematical theory of communication. However, the same is true of Dretske's theory, which PDMT opposes - especially on the critical issue of the veridicality thesis. (Reference Set 2.) According to PDMT, the information in a signal is determined probabilistically, without reference to any specific token information source from which the signal might have come. Handy, because one doesn't need the source, which might have expired, disappeared, or be too far away to access causally. However, it also follows that if we successfully fake a signal so that it looks exactly like it came from a vervet monkey, then PDMT says it came from a vervet monkey.
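The probabilistic flavour of this - and the fake-signal worry - can be caricatured in a toy calculation. This is my gloss, not Scarantino's actual formalism, and every number below is invented for illustration: the point is only that if the 'information carried' is fixed by type-level probabilities alone, a perfect fake is probabilistically indistinguishable from the real thing.

```python
import math

# Invented prior probabilities of each source type being present.
priors = {"vervet_monkey": 0.2, "no_monkey": 0.8}

# Invented likelihoods of the alarm-call signal TYPE given each source type.
likelihoods = {"vervet_monkey": 0.9, "no_monkey": 0.05}

def posterior(source: str) -> float:
    """P(source | signal) by Bayes' theorem, over signal and source types only."""
    total = sum(priors[s] * likelihoods[s] for s in priors)
    return priors[source] * likelihoods[source] / total

def info_gain(source: str) -> float:
    """Pointwise mutual information (in bits) the signal carries about the source."""
    return math.log2(posterior(source) / priors[source])

print(round(posterior("vervet_monkey"), 3))
print(round(info_gain("vervet_monkey"), 3))
```

Nothing in the calculation ever mentions a token source: a faked token signal of the same type yields exactly the same numbers, which is the vervet-monkey problem in miniature.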

PDMT has other challenges. It is both subjectivist and cognitivist about information. Subjectivism about information means that one cannot have information without a subjective consumer or receiver: a subject-receiver is a necessary condition for information to obtain. According to cognitivist subjectivism about information, that receiver or consumer must have a cognitive capacity: a mind. Dretske's theory does not require this, and is arguably objectivist (although there is a problem, too, with Dretske's objectivism about information: subjectivism is 'smuggled in' for semantic information).

Cognitivist subjectivism about information is very controversial. However, the bold and technically strong theory of PDMT has an even bigger problem: it is propositionalistic about information. Propositionalism is something most philosophers will find familiar only from research into intentional attitudes. (Esoteric analytical philosophical fare, if ever there was some!) What is propositionalism about the nature of information? It means that, according to PDMT, not only does one need a subject-receiver with a mind, but the information in that mind is somehow indelibly, or intrinsically, linked to propositions.

"No biggy", you might say (and I am sure some philosopher will say that.) To which I reply: "Actually: not definitely for certain, but - very much maybe a big biggy." The first reason that propositionalism about information might be a big biggy (i.e. big problem) is that philosophers and logicians have (and this is a well kept secret, mind you) litttle to no idea what a proposition is.

The second problem is that when they do seem to have SOME idea what a proposition is, and when there are pretty nice-looking definitions of what a proposition is, these usually depend on some specific technical conception of - cue ominous music - information. In that case we then have what philosophers of all ilks loathe and fear. No, not experimental and theoretical physicists (although that's a good guess, dear reader). A circularity, dear reader. A definitional, conceptual, and ontic circularity. Yucky poo! (Yucky pro-poo-sition?) You cannot define information in terms of propositions, and propositions in terms of information. That's circular. (Jerry Fodor will laugh you out of the seminar from the grave!) So, dear reader, you could be forgiven for thinking that, if philosophers of information do not even know whether information must be true, or whether it has anything to do with minds and propositions, then we should just declare "nuts to philosophers of information"! (Expressivists of different ilks might resoundingly agree with you!) However, if you cannot be sure which of the following options (see the tweet) is right, then perhaps you might develop some doubts about that metaphilosophy of information:

Those might be a little terminologically dense, so try these questions:

1. Do you need a receiver for information, or not? Is there information in DNA even if no one and nothing ever receives it?

2. Do you need a mind in order for there to be information? What happens, then, to the information in DNA and mRNA (and tRNA, and proteins, and chaperone molecules - and neural cascades and assemblies of neurons, for that matter)?

3. Is logic the basis of information, or is information the basis of logic? You (probably) cannot have both. Or can you?

4. Is it probability that is the basis of information, with logic having nothing to do with it? If so, which kind of probability: classical, frequentist, subjectivist, Bayesian? Which?

5. Do we need codes to have information? How do they arise in nature without any minds or constructed rules involved?

6. What if we get clever, ignore probability, and go straight to statistics? Is information grounded in statistics? Science likes statistics, whereas philosophers of probability (admittedly horrible beings - especially the ones in Canberra) it tends to shoot on sight. Now: is it Kolmogorov's axioms, Fisher information, or what?

7. What about entropy, Mr smarty-pants Long? Okay: which one? Boltzmann's? Gibbs'? Is that the same as information? Or, as Shannon was advised by von Neumann, does no one have a clue what entropy is, so it doesn't matter?

In conclusion: questions about the nature of information and semantic information are difficult. If someone tells you that some theory - PDMT, or Dretske's theory, or Pieter Adriaans' theory, or Floridi's informational structural realism (ISR) - is W-R-O-N-G wrong, then that might just be misinformation (a mixture of pseudo-information and information). It might be B-U-L-L-S-H-I-T, and we all know what that spells: disinformation. Just FYI, dear reader.
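A coda on the entropy question: part of what makes it so slippery is that Shannon's H and the Gibbs entropy share the same functional form, differing only in logarithm base and a constant scale factor (Boltzmann's constant). A minimal sketch, with my own example distribution, just to display the formal overlap - it decides nothing philosophically:

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon's H = -sum p_i log(p_i); in bits when base is 2."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in SI since 2019)

def gibbs_entropy(probs):
    """Gibbs' S = -k_B * sum p_i ln(p_i): same form, natural log, scaled by k_B."""
    return K_B * shannon_entropy(probs, base=math.e)

# A fair four-way choice maximises H for four outcomes.
uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(uniform))
print(gibbs_entropy(uniform))
```

Whether that shared form means entropy IS information, or merely that the two theories borrowed the same mathematics, is exactly what question 7 leaves open.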

Reference Set 1: Logic and Information Flow

Akman, V., & Surav, M. (1997). The use of situation theory in context modeling. Computational Intelligence.

Allo, P. (2007). Logical pluralism and semantic information. Journal of Philosophical Logic.

Allo, P. (2008). Formalising the “no information without data-representation” principle. Frontiers in Artificial Intelligence and Applications.

Allo, P. (2010). A Classical Prejudice? Knowledge, Technology & Policy.

Allo, P. (2011). The logic of “being informed” revisited and revised. Philosophical Studies.

Allo, P. (2017). A Constructionist Philosophy of Logic. Minds and Machines.

Allo, P. (2017). Hard and soft logical information. Journal of Logic and Computation.

Allo, P., & Mares, E. (2012). Informational Semantics as a Third Alternative? Erkenntnis.

Barwise, J., & Seligman, J. (1993). Imperfect information flow. Proceedings - Symposium on Logic in Computer Science.

Barwise, J., Gabbay, D., & Hartonas, C. (1995). On the Logic of Information Flow. Logic Journal of IGPL.

D’Agostino, M. (2013). Semantic information and the trivialization of logic: Floridi on the scandal of deduction. Information (Switzerland).

D’Agostino, M., & Floridi, L. (2009). The enduring scandal of deduction. Synthese.

D’Alfonso, S. (2014). The logic of knowledge and the flow of information. Minds and Machines.

de Rijke, M. (1999). Review of Barwise, J., & Seligman, J., Information Flow: The Logic of Distributed Systems. The Journal of Symbolic Logic.

Devlin, K. (2006). Situation theory and situation semantics. Handbook of the History of Logic.

Devlin, K. J. (1995). Logic and information. Cambridge University Press.

Elbourne, P. (2013). Situation Semantics. In Definite Descriptions.

Floridi, L. (2006). The logic of being informed. Logique et Analyse.

Floridi, L. (2014). Information closure and the sceptical objection. Synthese.

Floridi, L. (2017). The Logic of Design as a Conceptual Logic of Information. Minds and Machines.

Floridi, L. (2019). The Logic of Information. In The Logic of Information.

Girard, P., Seligman, J., & Liu, F. (2014). General dynamic dynamic logic. Advances in Modal Logic.

Gorsky, S. (2018). The dissolution of Bar-Hillel-Carnap Paradox by semantic information theory based on a paraconsistent logic. Principia.

Guo, M., & Seligman, J. (2013). The logic of Priori and a Posteriori rationality in strategic games. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics).

Howson, C. (2016). Does information inform confirmation? Synthese.

Huibers, T. W. C., Lalmas, M., & Van Rijsbergen, C. J. (1996). Information retrieval and situation theory. SIGIR Forum (ACM Special Interest Group on Information Retrieval).

Jaeger, M. (2006). A logic for inductive probabilistic reasoning. In Uncertainty, Rationality, and Agency.

Kun, W., & Brenner, J. E. (2013). The informational stance: Philosophy and logic. Part i the basic theories. Logic and Logical Philosophy.

Lalmas, M., & van Rijsbergen, C. J. (1994). Situation Theory and Dempster-Shafer’s Theory of Evidence for Information Retrieval.

Liang, Z., & Seligman, J. (2011). The dynamics of peer pressure. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics).

Liu, F., Seligman, J., & Girard, P. (2014). Logical dynamics of belief change in the community. Synthese.

Ma, M., & Seligman, J. (2015). Algebraic semantics for dynamic dynamic logic. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics).

Mares, E., Seligman, J., & Restall, G. (2011). Situations, constraints and channels (update of chapter 4). In Handbook of Logic and Language.

Mechkour, S. (2007). Overview of Situation Theory and its application in modeling context. Seminar Paper, University of Fribourg.

Nurse, P. (2008). Life, logic and information. In Nature.

Pacuit, E. (2013). Dynamic epistemic logic II: Logics of information change. Philosophy Compass.

Primiero, G. (2009). An epistemic logic for becoming informed. Synthese.

Restall, G. (1994). Information flow and relevant logics. Logic, Language and Computation: The 1994 Moraga Proceedings.

Restall, G. (1996). Notes on Situation Theory and Channel Theory. Politics.

Sagüillo, J. M. (2014). Hintikka on information and deduction. Teorema.

Scarrott, G. (1992). Logic and Information. The Computer Journal.

Seligman, J. (2009). Channels: From logic to probability. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics).

Seligman, J. (2014). Situation Theory Reconsidered. In Outstanding Contributions to Logic.

Seligman, J., & Moss, L. S. (2011). Situation Theory. In Handbook of Logic and Language.

Seligman, J., Liu, F., & Girard, P. (2011). Logic in the community. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics).

Sequoiah-Grayson, S. (2008). The scandal of deduction: Hintikka on the information yield of deductive inferences. Journal of Philosophical Logic.

Reference Set 2: Semantic Information and Veridicality

Adriaans, P. (2010). A Critical Analysis of Floridi’s Theory of Semantic Information. Knowledge, Technology & Policy.

Aliev, R. A., & Huseynov, O. H. (2014). Fuzzy Logic and Approximate Reasoning. In Decision Theory with Imperfect Information.

Bar-Hillel, Y., & Carnap, R. (1953). Semantic information. British Journal for the Philosophy of Science.

Bonnevie, E. (2001). Dretske’s semantic information theory and metatheories in library and information science. Journal of Documentation.

Carnap, R., & Bar-Hillel, Y. (1952). An Outline of a Theory of Semantic Information (Issue 247).

Coulter, N. S. (1976). Using semantic information measures to evaluate learning strategies. Proceedings of the 14th Annual Southeast Regional Conference, ACM-SE 1976.

Elias, P. (1954). Review of Rudolf Carnap and Yehoshua Bar-Hillel, An Outline of a Theory of Semantic Information (Technical Report No. 247, Research Laboratory of Electronics, Massachusetts Institute of Technology, 1952). Journal of Symbolic Logic.

Fetzer, J. H. (2004). Information: Does it Have To Be True? Minds and Machines.

Floridi, L. (2004). Outline of a Theory of Strongly Semantic Information. Minds and Machines.

Floridi, L. (2011). Semantic Information and the Correctness Theory of Truth. Erkenntnis.

Floridi, L. (2008). Understanding Epistemic Relevance. Erkenntnis.

Floridi, L. (2012). Semantic information and the network theory of account. Synthese.

Gorsky, S. (2018). The dissolution of Bar-Hillel-Carnap Paradox by semantic information theory based on a paraconsistent logic. Principia.

James, C. T. (1975). The role of semantic information in lexical decisions. Journal of Experimental Psychology: Human Perception and Performance.

Long, B. R. (2014). Information is intrinsically semantic but alethically neutral. Synthese.

Lundgren, B. (2019). Does semantic information need to be truthful? Synthese.

Smokler, H. (1966). Informational Content: A Problem of Definition. The Journal of Philosophy.


Allo, P. (2010). A Classical Prejudice? Knowledge, Technology & Policy.

Floridi, L. (2011). Semantic information and the veridicality thesis. In The Philosophy of Information.

Fresco, N., & Michael, M. (2016). Information and veridicality: Information processing and the bar-hillel/carnap paradox. Philosophy of Science.

Oakes, R. (2005). Transparent veridicality and phenomenological imposters: The telling issue. Faith and Philosophy.

Demir, H. (2014). Taking stock: Arguments for the veridicality thesis. Logique et Analyse.
