Contents
1. Formulation of the subject matter
2. Information flow in a distributed system
3. Information channels
4. Information flow: the ideal case
5. Information flow: the practical case
6. Fallibility in the flow of information
7. Two versions of the theory
Channel Theory (also known as the Theory of Information Channels, the Theory of Information Flow or simply IF-Theory) is a logico-mathematical theory that models the flow of information among components of a so-called "distributed system". Barwise and Seligman (1997) is the standard source. There are previous versions of the theory that go by the same name; we deal with that issue in the last section.
1. Formulation of the subject matter
There is a fundamental question that channel theory tries to answer: "How is it possible for one thing to carry information about another?" (Barwise and Seligman 1997: xi). Since entities convey information about each other insofar as they are classified by abstract states, and since the information conveyed also depends on a certain background of connections (between things) and regularities (between abstract states), any answer to a particular instance of the previous question has to fit the following scheme (Barwise and Seligman 1997: 13).
Information report: The fact that a is in the abstract state F carries the information that b is in the abstract state G with respect to certain relationships that link a and b on the one hand, F and G on the other.
It does not matter what a, b, F, G are. It might be the case that a, b are objects and F, G are properties (as in monadic predicate logic); perhaps a, b are situations whereas F, G are situation types (as in situation theory); maybe a, b are different instants a system goes through, while F, G are system descriptions in the form of tuples of numbers (as in mathematical modelling). The point is that every part of a distributed system consists of a collection of tokens {a_{1}, a_{2},...} as well as a collection of types {F_{1}, F_{2},...}; both collections are related to each other by means of a classificatory relation, giving rise to items of the form "a is F".
This account of information reports goes back to Dretske (1981). It was partially developed in the situation theory of Barwise and Perry (1983), which Devlin (1991) updates. In situation theory, regularities between situation types are called constraints.
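This scheme of tokens, types and a classificatory relation can be sketched in code. The following Python fragment is a minimal illustration; the class name, the example situations and the types are all invented for the occasion:

```python
# A minimal sketch of a classification: tokens, types, and a
# classificatory relation giving items of the form "a is F".
# All names and data are illustrative, not from the theory's sources.

class Classification:
    def __init__(self, tokens, types, relation):
        self.tokens = set(tokens)       # e.g. {"a1", "a2", ...}
        self.types = set(types)         # e.g. {"F1", "F2", ...}
        self.relation = set(relation)   # pairs (token, type): "a is F"

    def is_of_type(self, token, typ):
        return (token, typ) in self.relation

# Example: two situations classified by situation types.
A = Classification(
    tokens={"s1", "s2"},
    types={"raining", "wet_ground"},
    relation={("s1", "raining"), ("s1", "wet_ground"), ("s2", "wet_ground")},
)
print(A.is_of_type("s1", "raining"))   # True
print(A.is_of_type("s2", "raining"))   # False
```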
2. Information flow in a distributed system
Even though information is not defined, it is assumed to be something that "flows" among the components of a system. Such components may be distant from one another in time and space; furthermore, they can be very different from one another. That is why the system is said to be "distributed" (in computer science this term has another meaning). Example: all the noticeboards, travel tickets and trains that make up a railway network together form a distributed system.
There are systematic correlations among components in every distributed system. They are "regularities" that support the system's information flow, which in turn can be modelled by different theoretical constructs we call "information channels".
There are four principles of information flow. They guide the mathematical development of the theory.
- Information flow results from regularities in a distributed system.
- Information flow crucially involves both types and their particulars.
- It is by virtue of regularities among connections that information about some components of a distributed system carries information about other components.
- The regularities of a given distributed system are relative to its analysis in terms of information channels.
Let us see how to formalize the concepts of distributed system and information channel in such a way that they match the above four principles.
3. Information channels
Parts of a distributed system are modelled as classifications. A classification A is a structure (A,T,R) where A and T are non-empty sets of tokens and types respectively, and R is a relation from A to T. There might be tokens classified by several types, as well as types that classify several tokens. If a is in A and t is in T, then aRt means that a is of type t. A classification provides the vocabulary (via its types) with which its tokens are described.
Classifications are linked by infomorphisms. An infomorphism f from a classification A = (A,T,R) to a classification B = (B,S,Q) is a pair of functions (f^{+},f^{–}), where f^{+} goes from the types of A to the types of B while f^{–} goes in the opposite direction, from the tokens of B to the tokens of A, in such a way that f^{–}(b)Rt if and only if bQf^{+}(t).
[Figure: two classifications linked by an infomorphism. Vertical lines represent classificatory relations; horizontal arrows are functions.]
Barwise and Seligman (1997: 34, 76) define an information channel as a collection of infomorphisms that share the same codomain. We can also say that a channel K consists of a set {f_{i} : A_{i} → C} of infomorphisms from classifications A_{i}, the parts of the channel, into a common classification C, the core of K. Tokens of the core are called connections: each connection links together the tokens of the parts it is mapped to. Every channel models those conditions that make information flow possible in a distributed system, which in turn can be modelled by different information channels. A distributed system
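The defining condition of an infomorphism, with its two functions running in opposite directions, can be checked mechanically. The sketch below is an illustration under invented names and toy data (a switch as a part, the whole flashlight as a core):

```python
# Hedged sketch of an infomorphism f = (f_up, f_down) between two
# classifications, checking the defining condition:
#   f_down(b) is of type t in A   iff   b is of type f_up(t) in B.
# All names and the toy data are assumptions made for this example.

def is_infomorphism(A_rel, B_rel, B_tokens, A_types, f_up, f_down):
    """A_rel, B_rel: sets of (token, type) pairs; f_up maps A-types
    to B-types; f_down maps B-tokens to A-tokens."""
    return all(
        ((f_down[b], t) in A_rel) == ((b, f_up[t]) in B_rel)
        for b in B_tokens
        for t in A_types
    )

# Part A: a switch; core B: the whole flashlight.
A_rel = {("sw", "ON")}
B_rel = {("fl", "switch_on")}
ok = is_infomorphism(
    A_rel, B_rel,
    B_tokens={"fl"}, A_types={"ON", "OFF"},
    f_up={"ON": "switch_on", "OFF": "switch_off"},
    f_down={"fl": "sw"},
)
print(ok)  # True
```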
D is a collection of elements informing about each other. Formally, D consists of an indexed class cla(D) of classifications together with a class inf(D) of infomorphisms whose domains and codomains are all in cla(D).
An information channel K covers a distributed system D if and only if the classifications of K are those of cla(D) and for every infomorphism f in inf(D) there are infomorphisms from both the domain and codomain of f to the core of K such that the diagram formed by these three infomorphisms commutes. The underlying idea is that all classifications in the distributed system are informational parts of the core of the channel that covers the system. In Barwise and Seligman (1997: 89–97) it is shown how to construct an information channel out of a distributed system.
An information channel with four components could be, e.g., a flashlight of which we consider the bulb (A_{1}), the switch (A_{2}), the battery (A_{3}) and the case (A_{4}), the core being the flashlight as a whole. Information flows across the channel: the switch being ON and the battery being charged carry the information that the bulb is lit unless the case is broken; the battery working properly carries the information that the bulb may be either lit or unlit, etc.
It is possible to simplify a channel so that it contains only two classifications and one infomorphism. In order to do that we put together its parts by means of the sum of classifications. Tokens of the sum A_{1} + A_{2} are pairs (a_{1},a_{2}) of tokens, one from each part; types of the sum are given by the disjoint union of the types of the parts, and a pair is classified by a type just in case the relevant component is. The infomorphisms of the channel then induce a single infomorphism from the sum of the parts into the core.
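The sum construction for putting two classifications together can be sketched as follows; the function and variable names are assumptions made for the example:

```python
# Sketch of the sum A1 + A2 of two classifications (illustrative names):
# tokens are pairs (a1, a2); types are tagged copies of the parts' types;
# a pair is of a tagged type just in case the relevant component is.

def sum_classification(tokens1, types1, rel1, tokens2, types2, rel2):
    tokens = {(a1, a2) for a1 in tokens1 for a2 in tokens2}
    types = {(1, t) for t in types1} | {(2, t) for t in types2}
    rel = (
        {((a1, a2), (1, t)) for (a1, a2) in tokens for t in types1 if (a1, t) in rel1}
        | {((a1, a2), (2, t)) for (a1, a2) in tokens for t in types2 if (a2, t) in rel2}
    )
    return tokens, types, rel

# A bulb classification summed with a switch classification.
tokens, types, rel = sum_classification(
    {"b1"}, {"lit"}, {("b1", "lit")},
    {"s1"}, {"ON"}, {("s1", "ON")},
)
print((("b1", "s1"), (1, "lit")) in rel)  # True
```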
4. Information flow: the ideal case
Every classification A is equipped with its own "theory", namely the class of regularities among types that are supported by the tokens. How can we formalize this idea of regularity, which depends on the idea of classification? If T_{1}, T_{2} are subsets of T, then a token a of A satisfies the pair (T_{1},T_{2}) if and only if, whenever aRt for all t in T_{1}, also aRt for some t in T_{2}. Every pair (T_{1},T_{2}) satisfied by every token is a regularity. The theory Th(
A) generated by A is a structure (T,=>) consisting of the set T of types in A together with a consequence relation => comprising all regularities in A. Given a theory, we write T_{1}=>T_{2} and say that T_{1} implies T_{2} whenever (T_{1},T_{2}) is a regularity of the theory. Relation => obeys the logical properties of identity, monotony and cut that characterize deductive inference.
Once we have the concepts of classification, theory, infomorphism and information channel, it is feasible to try out a first analysis of information flow. Suppose we are given a channel K with two infomorphisms f_{1} : A_{1} → C and f_{2} : A_{2} → C; classifications A_{1} and A_{2} inform one about another in virtue of their informational membership in the core C. The diagram looks like this:
[Figure: infomorphisms f_{1} and f_{2} from the parts A_{1} and A_{2} into the core C.]
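The regularity test behind Th(A) can be written out directly. The following Python is a minimal sketch over invented data:

```python
# Sketch of the regularity test behind Th(A): a token a satisfies the
# pair (T1, T2) iff "a is of every type in T1" implies "a is of some
# type in T2"; (T1, T2) is a regularity iff every token satisfies it.
# Data are illustrative.

def satisfies(token, T1, T2, rel):
    if all((token, t) in rel for t in T1):
        return any((token, t) in rel for t in T2)
    return True  # antecedent fails, so the pair is vacuously satisfied

def is_regularity(T1, T2, tokens, rel):
    return all(satisfies(a, T1, T2, rel) for a in tokens)

tokens = {"s1", "s2"}
rel = {("s1", "raining"), ("s1", "wet"), ("s2", "wet")}
print(is_regularity({"raining"}, {"wet"}, tokens, rel))  # True
print(is_regularity({"wet"}, {"raining"}, tokens, rel))  # False
```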
Initial proposal: Let a_{1} be of type t_{1} in A_{1} and a_{2} of type t_{2} in A_{2}. Then a_{1}'s being of type t_{1} in A_{1} carries the information that a_{2} is of type t_{2} in A_{2}, relative to the channel K, if and only if a_{1} and a_{2} are connected through some token in C and moreover f_{1}^{+}(t_{1}) implies f_{2}^{+}(t_{2}) in the theory Th(C) (Barwise and Seligman 1997: 35).
This first analysis takes into account regularities in C instead of regularities among the parts of the system. This is because we have adopted a viewpoint external to the system, assuming as well that we are given complete information about its regularities: we have identified that information with Th(C). But in practice it is unlikely, if not impossible, that we know all these regularities. That is why it is convenient to revise the previous analysis: we have to assume a viewpoint internal to the system, where we consider, so to speak, just a part of the system; from observation of that part, together with our incomplete and fallible knowledge of the system as a whole, we have to extract information about other parts of the system. How can this be done? By means of local logics.
5. Information flow: the practical case
Given an infomorphism f from a classification A to a classification B, regularities can be translated from one theory to the other by means of two rules. Writing f[T] for the set of types f^{+}(t) with t in T, rule Intro-f translates T_{1} => T_{2} in A into f[T_{1}] => f[T_{2}] in B, while rule Elim-f translates f[T_{1}] => f[T_{2}] in B into T_{1} => T_{2} in A. By means of Intro-f, validity is preserved while non-validity is not; by means of Elim-f, non-validity is preserved while validity is not. Closer analysis of rules Intro-f and Elim-f suggests that we should generalize the concept of theory in order to cover logical systems that are possibly unsound or incomplete. In the car example, as we apply Intro-f_{1} to the theory Th(A_{1}) we get a consistent theory that might not be complete, and as we apply Elim-f_{2} to that theory we get a third one (this time over A_{2}) that might be unsound or incomplete or both.
A local logic L = (A,=>,N) consists of a classification A, a binary relation => on sets of types from A satisfying identity, monotony and cut, as well as a subset N of "normal" tokens from A, each of which satisfies all pairs (T_{1},T_{2}) such that T_{1}=>T_{2}. Logic L is sound if every token is normal; it is complete if for every pair (T_{1},T_{2}) satisfied by every normal token it is true that T_{1}=>T_{2}. The sound and complete local logic of A is Log(A), which is but a generalization of Th(A).
Local logics make it possible to model information flow from an internal, fallible viewpoint. Let us suppose there is a channel equipped with two components, as in the former diagram, but this time we do not have Log(C) at our disposal; all we can do is move a possibly unsound or incomplete local logic across the channel by means of Intro and Elim. The result is the distributed logic of the channel.
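The soundness and completeness conditions on a local logic can be illustrated on a toy classification; all names and data below are invented for the example:

```python
# A minimal sketch of a local logic L = (A, =>, N) on a toy classification.
# Soundness: every token is normal; completeness: every pair satisfied by
# all normal tokens is already a consequence. Names/data are illustrative.

def satisfies(token, T1, T2, rel):
    # "token has every type in T1" implies "token has some type in T2"
    return (not all((token, t) in rel for t in T1)) or any(
        (token, t) in rel for t in T2
    )

tokens = {"a1", "a2"}
normal = {"a1"}                                  # a2 is an exceptional token
rel = {("a1", "F"), ("a1", "G"), ("a2", "F")}
consequences = {(("F",), ("G",))}                # the relation =>: F => G

sound = all(a in normal for a in tokens)         # False: a2 is not normal
supported = all(satisfies(a, {"F"}, {"G"}, rel) for a in normal)
complete_for_pair = (not supported) or ((("F",), ("G",)) in consequences)
print(sound, supported, complete_for_pair)  # False True True
```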
Given a channel with only one infomorphism, its distributed logic is the inverse image of the local logic associated with the core (Barwise and Seligman 1997: 183). This proposal is somewhat less explicit than the previous one in that it does not mention the "information report" of the first section. However, it is coherent with that scheme: to see it, one only has to work out the basic proposal taking into account the concepts involved in the sum of classifications.
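Taking the inverse image of a core theory along the type component of an infomorphism can be sketched as follows; the translation map and the data are illustrative assumptions:

```python
# Sketch of the inverse image of a core theory along the type component
# of an infomorphism: a sequent T1 => T2 holds for a part iff its
# translation f_up[T1] => f_up[T2] holds in the core. Data are invented.

def is_regularity(T1, T2, tokens, rel):
    return all(
        (not all((a, t) in rel for t in T1)) or any((a, t) in rel for t in T2)
        for a in tokens
    )

def inverse_image_holds(T1, T2, f_up, core_tokens, core_rel):
    return is_regularity(
        {f_up[t] for t in T1}, {f_up[t] for t in T2}, core_tokens, core_rel
    )

f_up = {"ON": "switch_on", "lit_bulb": "lit"}    # part types -> core types
core_tokens = {"c1"}
core_rel = {("c1", "switch_on"), ("c1", "lit")}
print(inverse_image_holds({"ON"}, {"lit_bulb"}, f_up, core_tokens, core_rel))  # True
```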
6. Fallibility in the flow of information
Whether a pair of types (T_{1},T_{2}) counts as a regularity is always relative to a channel. A way of restricting the number of regularities in a channel K is to refine it: a refinement of K is a channel K' that has the same parts as K yet a different core C', together with an infomorphism r from C' to the core C of K, so that C' lies between C and the parts in such a way that the following diagram commutes.
[Figure: the refinement infomorphism r from the core C' of K' to the core C of K, commuting with the infomorphisms from the parts.]
A straightforward case is that where the functions in r are identities and C' contains more tokens than C. From this case it should be obvious that the more refined a channel, the more reliable the information it supports, since the number of connections between tokens of different parts of the system increases. With respect to the types: by Intro-r, every regularity in K' is a regularity in K as well; by Elim-r, however, not every regularity in K is a regularity in K'. This means that whenever a regularity in K fails (because of exceptions) we do not have to seek alternative logics but alternative channels.
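The effect of refinement on regularities, where adding tokens to the core prunes regularities, can be illustrated with a toy core; the data are invented:

```python
# Sketch of how refining a channel (more tokens in the core, hence more
# connections) prunes regularities: a sequent valid over the coarse core
# may fail once an exceptional token is added. Data are illustrative.

def is_regularity(T1, T2, tokens, rel):
    return all(
        (not all((a, t) in rel for t in T1)) or any((a, t) in rel for t in T2)
        for a in tokens
    )

coarse = {"c1"}
refined = {"c1", "c2"}  # c2: a flashlight with a broken case
rel = {("c1", "switch_on"), ("c1", "lit"), ("c2", "switch_on")}
print(is_regularity({"switch_on"}, {"lit"}, coarse, rel))   # True in K
print(is_regularity({"switch_on"}, {"lit"}, refined, rel))  # False in K'
```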
7. Two versions of the theory
There are two versions of channel theory. The second one is a development of the first, which in turn stems from situation theory. Both versions originate in the collaborative work of Jon Barwise and Jerry Seligman during the 1990s.
*First version.* The first published paper is Barwise (1992). There it is suggested that situation theory cannot explain fallibility in the flow of information because it considers relationships between types of situations without paying attention to relationships between concrete situations. Such relations are introduced and the resulting model is analyzed. Barwise (1993) is a much more sophisticated exposition. Seligman (1990, 1991a, 1991b) had independently developed ideas very similar to those of Barwise. From the collaboration of these two authors arose the technical paper Barwise and Seligman (1993) and the more philosophical Barwise and Seligman (1994). This version of the theory was summarized in the survey paper Moss and Seligman (1994).
*Second version.* The first and still standard reference is Barwise and Seligman (1997), where the previous version of the theory is reformulated in the mathematical framework of category theory, in particular the theory of Chu spaces (Barr 1979; Pratt 1999). Algebraic constructions over Chu spaces provide the semantics of the theory. Barwise (1997) investigates linkages to modal logic, whereas Barwise (1999) is an application of the theory to the study of non-monotonic reasoning. Seligman (2009), in turn, is an attempt at merging the second version of channel theory with Shannon's statistical theory of signal transmission and coding (1948).
Pérez-Montoro (2000, 2007) takes the viewpoint of information content in his comprehensive survey of Shannon, Dretske, situation theory and the first version of channel theory. Restall (2005) deals with the first version of the theory from a logical perspective. Some recent surveys of information theories, like Devlin (2001) or Bremer and Cohnitz (2004), devote a separate chapter to the second version of channel theory.
References
- BARR, M. (1979).
*∗-Autonomous Categories, with an Appendix by Po Hsiang Chu*. Lecture Notes in Mathematics 752. Heidelberg: Springer-Verlag. - BARWISE, J. (1992). “Information links in domain theory”. In: BROOKES, S. et al. (eds.).
*Mathematical Foundations of Programming Semantics*, Lecture Notes in Computer Science 598, pp. 168–192. Berlin / Heidelberg / New York: Springer-Verlag. - BARWISE, J. (1993). “Constraints, channels and the flow of information”. In: ACZEL, P. et al. (eds.).
*Situation Theory and Its Applications. Volume 3*, CSLI Lecture Notes 37, pp. 3–27. Stanford: CSLI Publications. - BARWISE, J. (1997). “Information and Impossibilities”.
*Notre Dame Journal of Formal Logic*, Vol. 38(4), pp. 488–515. - BARWISE, J. (1999). “State-spaces, local logics and non-monotonicity”. In: MOSS, L. S. et al. (eds.).
*Logic, Language and Computation. Volume 2*, CSLI Lecture Notes 96, pp. 1–20. Stanford: CSLI Publications. - BARWISE, J. & PERRY, J. (1983).
*Situations and Attitudes*. Cambridge, MA: Bradford Books / The MIT Press. - BARWISE, J. & SELIGMAN, J. (1993). “Imperfect Information Flow”. In: VARDI, M. (ed.).
*Proceedings. Eighth Annual IEEE Symposium on Logic in Computer Science*, pp. 252–260. Montreal: IEEE Computer Society Press. - BARWISE, J. & SELIGMAN, J. (1994). “The rights and wrongs of natural regularity”. In: TOMBERLIN, J. E. (ed.).
*Philosophical Perspectives, 8, Logic and language*, pp. 331–364. Atascadero, CA: Ridgeview. - BARWISE, J. & SELIGMAN, J. (1997).
*Information Flow: The Logic of Distributed Systems*. Cambridge: Cambridge University Press. - BREMER, M. & COHNITZ, D. (2004).
*Information and Information Flow*. Frankfurt / Lancaster: Ontos Verlag. - DEVLIN, K. (1991).
*Logic and Information*. Cambridge: Cambridge University Press. - DEVLIN, K. (2001).
*The Mathematics of Information*. [Online] Helsinki (Finland): European School of Logic, Language and Information. <http://www.helsinki.fi/esslli/courses/Logicinfo.html> [Consulted: 18/12/2009] - DRETSKE, F. (1981).
*Knowledge and the Flow of Information*. Cambridge, MA: The MIT Press. - MOSS, L. S. & SELIGMAN, J. (1994). “Classification domains and information links: a brief survey”. In: VAN EIJCK, J. & VISSER, A. (eds.).
*Logic and Information Flow*, pp. 112–124. Cambridge, MA / London: The MIT Press. - PÉREZ-MONTORO, M. (2000).
*El fenómeno de la información. Una aproximación conceptual al flujo informativo*. Madrid: Trotta. - PÉREZ-MONTORO, M. (2007).
*The Phenomenon of Information. A Conceptual Approach to Information Flow*. Medford, NJ: The Scarecrow Press, Inc. [English translation of Pérez-Montoro (2000).] - PRATT, V. (1999).
*Chu Spaces*. [Online] Coimbra (Portugal): School on Category Theory and Applications. <http://boole.stanford.edu/pub/coimbra.pdf> [Consulted: 18/12/2009] - RESTALL, G. (2005). “Logics, situations and channels”.
*Journal of Cognitive Science*, Vol. 6, pp. 125–150. [Published in 2006. An earlier version entitled “Notes on Situation Theory and Channel Theory” was already available online in 1996 from the author's website.] - SELIGMAN, J. (1990).
*Perspectives: A Relativistic Approach to the Theory of Information*. PhD Thesis. Centre for Cognitive Studies. Edinburgh: University of Edinburgh. - SELIGMAN, J. (1991a). “Perspectives in Situation Theory”. In: COOPER, R. et al. (eds.).
*Situation Theory and Its Applications. Volume 1*, CSLI Lecture Notes 22, pp. 147–191. Stanford: CSLI Publications. - SELIGMAN, J. (1991b). “Physical situations and information flow”. In: BARWISE, J. et al. (eds.).
*Situation Theory and Its Applications. Volume 2*, CSLI Lecture Notes 26, pp. 257–292. Stanford: CSLI Publications. - SELIGMAN, J. (2009). “Channels: From Logic to Probability”. In: SOMMARUGA, G. (ed.).
*Formal Theories of Information*, pp. 193–233. Berlin / Heidelberg / New York: Springer-Verlag. - SHANNON, C. E. (1948). “A Mathematical Theory of Communication”.
*Bell System Technical Journal*, Vol. 27 (July, October), pp. 379–423, 623–656.
Julio Ostalé (January 2010)