Quantifying Information Flow with Constraints

Zhu, P. (2010). Quantifying Information Flow with Constraints. (Unpublished Doctoral thesis, City University London)



Quantifying the flow of information in a program involves calculating how much information (e.g. about secret inputs) can be leaked by observing the program's public outputs. Recently this field has attracted a lot of research interest, most of which makes use of Shannon's information theory, e.g. mutual information, conditional entropy, etc. Computability considerations entail that any automated analysis of information flow is necessarily incomplete; quantitative flow analyses therefore aim to compute upper bounds on the sizes of the flows in a program. Virtually all current quantitative analyses treat program variables independently, which significantly limits the potential for deriving tight upper bounds. Our work is motivated by the intuition that knowledge of the dependencies between program variables should allow the derivation of more precise upper bounds on the size of flows, and that classical abstract interpretation provides an effective mechanism for determining such dependencies in the form of linear constraints. Our approach is then to view the problem as one of constrained optimization (maximum entropy), allowing us to apply the standard Lagrange multiplier technique. Applying this technique turns out to require the development of some novel methods, due to the essential use of non-linear (entropy) constraints in conjunction with the linear dependency constraints. Using these methods we obtain more precise upper bounds on the size of information flows than is possible with existing analysis techniques.
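The constrained maximum-entropy idea at the heart of the abstract can be illustrated with a small self-contained sketch (not the thesis's actual analysis, and the function names here are invented for illustration): maximizing Shannon entropy over a finite distribution subject to a single linear constraint E[X] = m. The Lagrange multiplier method gives the well-known exponential-family solution p_i ∝ exp(-λ·x_i), where the multiplier λ can be found numerically by bisection.

```python
import math

def max_entropy_dist(values, mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `values` subject to the linear
    constraint E[X] = mean. The Lagrange condition yields
    p_i proportional to exp(-lam * x_i); we solve for lam by bisection,
    using the fact that E[X] is monotonically decreasing in lam."""
    def moment(lam):
        w = [math.exp(-lam * v) for v in values]
        z = sum(w)  # partition function (normalizing constant)
        return sum(v * wi for v, wi in zip(values, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if moment(mid) > mean:
            lo = mid  # expectation too large: increase lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

def entropy_bits(p):
    """Shannon entropy in bits, the quantity bounded in flow analyses."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)
```

For example, over the support {0, 1, 2, 3} the unconstrained maximum is the uniform distribution with entropy 2 bits; adding the constraint E[X] = 0.5 forces a skewed distribution whose entropy, and hence the corresponding leakage bound, is strictly smaller. The thesis's contribution concerns the harder setting where entropy terms themselves appear inside the constraints, which this closed-form sketch does not capture.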

Item Type: Thesis (Doctoral)
Subjects: T Technology > T Technology (General)
Divisions: School of Informatics > Department of Computing
URI: http://openaccess.city.ac.uk/id/eprint/12101


