Journal Article

In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour?—Bridging the Gap Between Dynamical Systems Theory and Communication Theory

Roman Frigg

in The British Journal for the Philosophy of Science

Published on behalf of the British Society for the Philosophy of Science

Volume 55, issue 3, pages 411-434
Published in print September 2004 | ISSN: 0007-0882
Published online September 2004 | e-ISSN: 1464-3537 | DOI: http://dx.doi.org/10.1093/bjps/55.3.411

Abstract

On an influential account, chaos is explained in terms of random behaviour; and random behaviour in turn is explained in terms of having positive Kolmogorov-Sinai entropy (KSE). Though intuitively plausible, the association of the KSE with random behaviour needs justification since the definition of the KSE does not make reference to any notion that is connected to randomness. I provide this justification for the case of Hamiltonian systems by proving that the KSE is equivalent to a generalized version of Shannon's communication-theoretic entropy under certain plausible assumptions. I then discuss consequences of this equivalence for randomness in chaotic dynamical systems.
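For context, the two notions being compared can be stated in their standard textbook forms (a sketch in conventional notation, which need not match the notation used in the paper). Shannon's entropy of a discrete source emitting symbols with probabilities $p_i$ is

$$H = -\sum_i p_i \log p_i,$$

while the Kolmogorov-Sinai entropy of a measure-preserving transformation $T$ is defined via finite partitions $\xi$ of the phase space:

$$h_{\mathrm{KS}}(T) = \sup_{\xi} \lim_{n \to \infty} \frac{1}{n}\, H\!\left(\bigvee_{k=0}^{n-1} T^{-k}\xi\right),$$

where $H(\cdot)$ is the Shannon entropy of the cell probabilities of the refined partition. The abstract's equivalence claim concerns the relation between $h_{\mathrm{KS}}$ and a generalized version of the Shannon entropy for Hamiltonian systems.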

Introduction

Elements of dynamical systems theory

Entropy in communication theory

Entropy in dynamical systems theory

Comparison with other accounts

Product versus process randomness


Subjects: Philosophy of Science; Science and Mathematics
