Logic of information

MyWikiBiz, Author Your Legacy — Thursday October 10, 2024
Revision as of 03:24, 16 November 2015 by Jon Awbrey

This page belongs to resource collections on Logic and Inquiry.

The logic of information, or the logical theory of information, considers the information content of logical signs — everything from bits to books and beyond — along the lines initially developed by Charles Sanders Peirce.  In this line of development the concept of information serves to integrate the aspects of logical signs that are separately covered by the concepts of denotation and connotation, or, in roughly equivalent terms, by the concepts of extension and comprehension.

Peirce began to develop these ideas in his lectures “On the Logic of Science” at Harvard University (1865) and the Lowell Institute (1866).  Here is one of the starting points:

Let us now return to the information.  The information of a term is the measure of its superfluous comprehension.  That is to say that the proper office of the comprehension is to determine the extension of the term.  For instance, you and I are men because we possess those attributes — having two legs, being rational, &tc. — which make up the comprehension of man.  Every addition to the comprehension of a term lessens its extension up to a certain point, after that further additions increase the information instead.

Thus, let us commence with the term colour;  add to the comprehension of this term, that of red.  Red colour has considerably less extension than colour;  add to this the comprehension of dark;  dark red colour has still less [extension].  Add to this the comprehension of non-blue;  non-blue dark red colour has the same extension as dark red colour, so that the non-blue here performs a work of supererogation;  it tells us that no dark red colour is blue, but does none of the proper business of connotation, that of diminishing the extension at all.

Thus information measures the superfluous comprehension.  And, hence, whenever we make a symbol to express any thing or any attribute we cannot make it so empty that it shall have no superfluous comprehension.  I am going, next, to show that inference is symbolization and that the puzzle of the validity of scientific inference lies merely in this superfluous comprehension and is therefore entirely removed by a consideration of the laws of information.  (C.S. Peirce, “The Logic of Science, or, Induction and Hypothesis” (1866), CE 1, 467).
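Peirce's colour example lends itself to a small worked illustration. The sketch below is a toy model of my own devising, not Peirce's formalism: objects carry attribute sets, a term's comprehension is a set of marks, and its extension is the set of objects bearing every mark. Negative marks such as "non-blue" are modeled, as an assumption, by the simple absence of the corresponding positive mark.

```python
# Toy model of comprehension and extension (an illustration, not Peirce's own
# apparatus).  Each object is represented by the set of attributes it bears.

objects = {
    "o1": {"colour", "red", "dark"},
    "o2": {"colour", "red"},
    "o3": {"colour", "blue"},
    "o4": {"colour", "blue", "dark"},
}

def extension(comprehension, non=()):
    """Objects possessing every mark in `comprehension` and none in `non`."""
    return {name for name, attrs in objects.items()
            if comprehension <= attrs and not (set(non) & attrs)}

# Each added mark narrows the extension...
assert extension({"colour"}) == {"o1", "o2", "o3", "o4"}
assert extension({"colour", "red"}) == {"o1", "o2"}
assert extension({"colour", "red", "dark"}) == {"o1"}

# ...until a mark becomes superfluous: adding "non-blue" leaves the extension
# of "dark red colour" unchanged.  The mark does no work of connotation; what
# it yields instead is information, namely that no dark red colour is blue.
assert extension({"colour", "red", "dark"}, non={"blue"}) == {"o1"}
```

On this reading, the "superfluous comprehension" Peirce speaks of is any added mark that fails to shrink the extension further, and the information it carries is the fact that makes the mark redundant.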

References

  • De Tienne, André (2006), "Peirce's Logic of Information", Seminario del Grupo de Estudios Peirceanos, Universidad de Navarra, 28 Sep 2006. Online.
  • Peirce, C.S. (1867), "Upon Logical Comprehension and Extension", Online.

Resources

Syllabus

Focal nodes

Peer nodes

Logical operators


Related topics


Relational concepts


Information, Inquiry


Related articles


Document history

Portions of the above article were adapted from the following sources under the GNU Free Documentation License, under other applicable licenses, or by permission of the copyright holders.