City Research Online

Reconciling Shannon and Scott with a Lattice of Computable Information

Hunt, S. ORCID: 0000-0001-7255-4465, Sands, D. & Stucki, S. (2023). Reconciling Shannon and Scott with a Lattice of Computable Information. Proceedings of the ACM on Programming Languages, 7(POPL), pp. 1987-2016. doi: 10.1145/3571740

Abstract

This paper proposes a reconciliation of two different theories of information. The first, originally proposed in a lesser-known work by Claude Shannon (some five years after the publication of his celebrated quantitative theory of communication), shows how the information content of channels can be described qualitatively, but still abstractly, in terms of information elements, which can be viewed as equivalence relations over the data source domain. Shannon showed that these elements have a partial ordering, expressing when one information element is more informative than another, and that these partially ordered information elements form a complete lattice. In the context of security and information flow this structure has been independently rediscovered several times, and used as a foundation for understanding and reasoning about information flow.
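
To make the ordering concrete, the following is a standard textbook presentation of this lattice (often called the lattice of information in the security literature); the notation — a source domain D and equivalence relations P, Q over D — is illustrative and not drawn verbatim from the paper.

    % Q is at least as informative as P when Q's equivalence classes refine P's:
    \[ P \sqsubseteq Q \iff \forall x, y \in D.\; x \mathrel{Q} y \implies x \mathrel{P} y \]
    % Binary joins and meets (and, similarly, joins and meets of arbitrary
    % families) exist, so the ordered set is a complete lattice:
    \[ P \sqcup Q = P \cap Q \qquad P \sqcap Q = (P \cup Q)^{*} \]
    % where (-)^{*} is equivalence closure. The bottom element D x D reveals
    % nothing about the source; the top element, the identity relation, reveals
    % the source exactly.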

The second theory of information is Dana Scott’s domain theory, a mathematical framework for giving meaning to programs as continuous functions over a particular topology. Scott’s partial ordering also represents when one element is more informative than another, but in the sense of computational progress – i.e. when one element is a more defined or evolved version of another.
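
As a small illustration of this definedness ordering (a sketch in Haskell, chosen here only because its laziness makes partial values directly observable; the names bot, partial, total and observe are ours, not the paper's):

    -- Three approximations of the same pair of booleans, increasingly defined
    -- in Scott's ordering:  bot  ⊑  partial  ⊑  total
    bot, partial, total :: (Bool, Bool)
    bot     = undefined          -- no information produced yet
    partial = (undefined, True)  -- only the second component is defined
    total   = (False, True)      -- fully defined

    -- An observer that inspects only the second component:
    observe :: (Bool, Bool) -> Bool
    observe (_, y) = y

    -- observe bot      diverges: nothing can be learned from an undefined value
    -- observe partial  = True:   a partially defined value already conveys information
    -- observe total    = True:   greater definedness can only reveal more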

To give a satisfactory account of information flow in computer programs it is necessary to consider both theories together, in order to understand not only what information is conveyed by a program (viewed as a channel, à la Shannon) but also how the precision with which that information can be observed is determined by the definedness of its encoding (à la Scott). To this end we show how these theories can be fruitfully combined, by defining the Lattice of Computable Information (LoCI), a lattice of preorders rather than equivalence relations. LoCI retains the rich lattice structure of Shannon’s theory, filters out elements that do not make computational sense, and refines the remaining information elements to reflect how Scott’s ordering captures possible varieties in the way that information is presented.
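
One way to picture the extra expressiveness that preorders buy over equivalence relations is on the lifted booleans {⊥, tt, ff}; the following is an informal illustration of the idea, not the paper's actual construction.

    % A preorder on the lifted booleans, written x <= y for "y is at least as
    % defined an observation as x":
    \[ {\preceq} \;=\; \{(\bot,\bot),\ (\mathit{tt},\mathit{tt}),\ (\mathit{ff},\mathit{ff}),\ (\bot,\mathit{tt}),\ (\bot,\mathit{ff})\} \]
    % tt and ff remain mutually distinguishable once produced, but ⊥ lies below
    % both: an observer cannot positively distinguish a diverging computation
    % from one that has simply not yet produced its result. Symmetrising the
    % relation would collapse it to an equivalence relation and discard exactly
    % the definedness distinctions that Scott's ordering records.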

We show how the new theory facilitates the first general definition of termination-insensitive information flow properties, a weakened form of information flow property commonly targeted by static program analyses.
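
For concreteness, the one-bit termination leak that motivates such properties can be written as follows (the standard example from the information-flow literature, rendered here as a Haskell sketch; it is not taken from the paper).

    -- 'secret' is confidential; the observer sees only the output (if any).
    leak :: Bool -> Int
    leak secret =
      if secret
        then leak secret   -- diverge when the secret is True
        else 42            -- otherwise terminate with a fixed public value

    -- Seeing the output 42 reveals that the secret was False, yet an observer
    -- who sees nothing cannot tell divergence from slowness.
    -- Termination-insensitive noninterference accepts this program, attributing
    -- the leak to (non)termination rather than to the value produced.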

Publication Type: Article
Additional Information: © 2023 Copyright held by the owner/author(s). This work is licensed under a Creative Commons Attribution International License.
Publisher Keywords: Semantics, Information Flow
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Z Bibliography. Library Science. Information Resources > ZA Information resources
Departments: School of Science & Technology > Computer Science
Full text: 3571740.pdf (Published Version, 517kB), available under a Creative Commons Attribution License.
