TY  - JOUR
KW  - Electronic Warfare
KW  - Entropy
KW  - Information
KW  - Mutual Information
KW  - Shannon-Hartley Theorem
KW  - Situation Assessment
AU  - Andrew Borden
AB  - The Shannon-Hartley Theorem for the information-carrying capacity of a noisy communication channel is an elegant way to unify Attack and Protect concepts in Electronic Warfare (EW). Using this principle, all EW measures can be seen as attempts to increase (reduce) bandwidth or Signal to Noise Ratio. Shannon’s Formula for Mutual Information is an extension of this principle to the Information Dialectic in which all Attack and Protect measures are attempts to increase (reduce) Information Bandwidth or Entropy (ambiguity). This characterization of Information Warfare is domain independent and very widely applicable.

BT  - Information & Security: An International Journal
DA  - 2000
DO  - http://dx.doi.org/10.11610/isij.0402
LA  - eng
N2  - The Shannon-Hartley Theorem for the information-carrying capacity of a noisy communication channel is an elegant way to unify Attack and Protect concepts in Electronic Warfare (EW). Using this principle, all EW measures can be seen as attempts to increase (reduce) bandwidth or Signal to Noise Ratio. Shannon’s Formula for Mutual Information is an extension of this principle to the Information Dialectic in which all Attack and Protect measures are attempts to increase (reduce) Information Bandwidth or Entropy (ambiguity). This characterization of Information Warfare is domain independent and very widely applicable.

PY  - 2000
SP  - 33
EP  - 40
T2  - Information & Security: An International Journal
TI  - The Dialectics of Information – A Framework
VL  - 4
ER  - 
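
The abstract names two formulas: the Shannon-Hartley Theorem and Shannon's Formula for Mutual Information. Their standard textbook statements are given below as a reference sketch; the notation is the conventional one, not drawn from the article itself.

\[
C = B \log_2\!\left(1 + \frac{S}{N}\right)
\]

\[
I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)
\]

Here \(C\) is channel capacity in bits per second, \(B\) is bandwidth in Hz, \(S/N\) is the linear signal-to-noise ratio, \(H(X)\) is the entropy of the source, and \(H(X \mid Y)\) is the residual entropy (ambiguity) after observation, which is the quantity the abstract's Attack and Protect measures seek to raise or lower.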