Philosophy of information

The philosophy of information (PI) is the area of research that studies conceptual issues arising at the intersection of computer science, information technology, and philosophy.

It includes: [Luciano Floridi, "What is the Philosophy of Information?", Metaphilosophy, 2002, 33 (1/2).]

# the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilisation, and sciences;
# the elaboration and application of information-theoretic and computational methodologies to philosophical problems.


The philosophy of information (PI) has evolved from the philosophy of artificial intelligence, the logic of information, cybernetics, social theory, ethics, and the study of language and information.

Logic of information

The logic of information, also known as the "logical theory of information", considers the information content of logical signs and expressions along the lines initially developed by Charles Sanders Peirce.


One source for the philosophy of information can be found in the technical work of Norbert Wiener, Alan Turing, William Ross Ashby, Claude Shannon, Warren Weaver, and many other scientists working on computing and information theory back in the early 1950s.

Some important work on information and communication was done by Gregory Bateson and his colleagues.

Study of language and information

Later contributions to the field were made by Fred Dretske, Jon Barwise, Brian Cantwell Smith, and others.

The Center for the Study of Language and Information (CSLI) was founded at Stanford University in 1983 by philosophers, computer scientists, linguists, and psychologists, under the direction of John Perry and Jon Barwise.


More recently this field has become known as the philosophy of information. The expression was coined in the 1990s by Luciano Floridi, who has published prolifically in this area with the intention of elaborating a unified and coherent conceptual frame for the whole subject.

Defining information

The word "information" has no single agreed definition; what it means depends on the theoretical framework in which it is used.


Peirce's concept of information serves to integrate the aspects of signs and expressions that are separately covered, on the one hand, by the concepts of denotation and extension, and on the other hand, by the concepts of connotation and comprehension.


Claude E. Shannon, for his part, was very cautious: “The word ‘information’ has been given different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field.” (Shannon 1993, p. 180). Thus, following Shannon, Weaver supported a tripartite analysis of information in terms of (1) technical problems concerning the quantification of information and dealt with by Shannon's theory; (2) semantic problems relating to meaning and truth; and (3) what he called “influential” problems concerning the impact and effectiveness of information on human behaviour, which he thought had to play an equally important role. And these are only two early examples of the problems raised by any analysis of information.
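Weaver's level (1), the technical quantification of information, is the part Shannon's theory makes precise: the entropy of a source measures the average information, in bits, carried by each symbol it emits. The following is a minimal illustrative sketch of that measure (the function name and example distributions are chosen here for illustration, not drawn from any particular source):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    `probs` is a probability distribution over a source's symbols;
    the result is the average information content per symbol.
    Terms with p == 0 contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))
```

Note how the measure tracks uncertainty rather than meaning: it assigns the same value to any two sources with the same statistics, which is precisely why Weaver's semantic level (2) calls for a separate analysis.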

A map of the main senses in which one may speak of information is provided by the Stanford Encyclopedia of Philosophy article on information, on which the previous paragraphs are based.


Gregory Bateson defined information as "a difference that makes a difference". [Extract from "Steps to an Ecology of Mind"]


According to Floridi, four kinds of mutually compatible phenomena are commonly referred to as "information":
* Information about something (e.g. a train timetable)
* Information as something (e.g. DNA, or fingerprints)
* Information for something (e.g. algorithms or instructions)
* Information in something (e.g. a pattern or a constraint).

The word "information" is commonly used so metaphorically or so abstractly that the meaning is unclear.

Philosophical directions

Computing and philosophy

Recent advances in computing, such as the semantic web, ontology engineering, knowledge engineering, and modern artificial intelligence, provide philosophy with fertile notions, new and evolving subject matters, methodologies, and models for philosophical inquiry. Computer science brings new opportunities and challenges to traditional philosophical studies and changes the ways philosophers understand foundational concepts; conversely, further major progress in computer science may be feasible only if philosophy provides sound foundations for areas such as bioinformatics, software engineering, knowledge engineering, and ontologies.

Classical topics in philosophy, such as mind, consciousness, experience, reasoning, knowledge, truth, morality, and creativity, are rapidly becoming common concerns and foci of investigation in computer science, e.g., in areas such as agent computing, software agents, and intelligent mobile agent technologies.

According to L. Floridi [Luciano Floridi, "Open Problems in the Philosophy of Information", Metaphilosophy 35.4, 554-582. Revised version of "The Herbert A. Simon Lecture on Computing and Philosophy" given at Carnegie Mellon University in 2001.] one can think of several ways for applying computational methods towards philosophical matters:
# Conceptual experiments in silico: As an innovative extension of an ancient tradition of thought experiment, a trend has begun in philosophy to apply computational modeling schemes to questions in logic, epistemology, philosophy of science, philosophy of biology, philosophy of mind, and so on.
# Pancomputationalism: On this view, computational and informational concepts are considered so powerful that, given the right level of abstraction, anything in the world could be modeled and represented as a computational system, and any process could be simulated computationally. Pancomputationalists then face the hard task of providing credible answers to two questions:
## How can one avoid blurring all differences among systems?
## What would it mean for the system under investigation not to be an informational system (or a computational system, if computation = information processing)?

Information and society

Philosophical studies of the social and cultural aspects of electronically mediated information have been carried out by numerous philosophers and other thinkers.

*Albert Borgmann, Holding onto Reality: The Nature of Information at the Turn of the Millennium (Chicago University Press, 1999)
*Mark Poster, The Mode of Information (Chicago Press, 1990)
*Luciano Floridi, Informational Nature of Reality, Key Talk selected at the E-CAP conference 2006 (Trondheim, 2006)


See also

* Barwise prize
* Complex system
* Digital physics
* Game theory
* Informatics
* Information
* Information art
* Information ethics
* Information theory
* International Association for Computing and Philosophy
* Logic of information
* Philosophy of artificial intelligence
* Philosophy of technology
* Philosophy of thermal and statistical physics
* Physical information
* Relational quantum mechanics
* Social informatics
* Statistical mechanics

External links

* IEG site, the Oxford University research group on the philosophy of information
* PI website
* Luciano Floridi, "Where are we in the philosophy of information?", podcast, University of Bergen, Norway, 21.06.06
* Luciano Floridi, "What is the Philosophy of Information?", Metaphilosophy, 33.1/2, 123-145. Reprinted in T.W. Bynum and J.H. Moor (eds.), CyberPhilosophy: The Intersection of Philosophy and Computing (Oxford - New York: Blackwell, 2003).
* G.M. Greco, G. Paronitti, M. Turilli, L. Floridi, "How to Do Philosophy Informationally", Lecture Notes on Artificial Intelligence, 3782, pp. 623–634, 2005.
* The Blackwell Guide to the Philosophy of Computing and Information, edited by Luciano Floridi (Oxford - New York: Blackwell, 2004).
* Philosophy of Statistical Mechanics
