• Information is a fundamental aspect of systems separate from both matter and energy

  • Information is news of a difference that makes a difference. It concerns facts about the system that are relevant to the specific analysis being conducted.

    • Because it is a difference between two things, it is inherently comparative.
  • Information flow can be both internal (facilitating the mechanisms of the system) and external (allowing the system to respond to its environment).

  • Remember that information is the degree of surprise a system experiences upon receipt.

    • It is also a measure of the reduction from a priori uncertainty toward certainty.
    • Information does not add something unknown but rather removes an uncertainty.
    • The surprise (deviation from expected behavior) facilitates either a change in the system’s state or the system’s behavior.
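
The view above, information as removed uncertainty, can be sketched with Shannon entropy over an illustrative set of equally likely states (the state counts are invented for the example):

```python
import math

def entropy(dist):
    """A priori uncertainty, in bits, over which state the system is in."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Eight equally likely states before any message arrives: 3 bits of uncertainty.
prior = [1 / 8] * 8

# A message rules out all but two states; nothing unknown was added,
# uncertainty was removed.
posterior = [1 / 2, 1 / 2]

info = entropy(prior) - entropy(posterior)  # 2.0 bits of information received
```
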
  • Communication is the act of a sender inserting a message into a communication channel and a receiver accepting that message and acting on its information content.

    • A message is a sequence of states of the communication channel that may or may not contain information depending on the state of the receiver.
    • The sender is any system that can routinely encode a message for insertion into a channel for conveyance.
    • A receiver is any system that can accept a message through a channel and for which the message potentially conveys some amount of information.
    • An observer is, generally, a purposeful receiver: it can interpret the messages it receives and use that information to modify its own internal organization.
    • A channel is any physical medium through which an information flow can be sent.
    • A signal is a disturbance in the channel as a result of sending the message.
    • Noise is any disturbance that can mask or corrupt the genuine message state.
    • Codes are methods used to frame a message to increase the likelihood of proper transmission and receipt.
    • A protocol is an a priori agreed-upon, matched process for both sender and receiver that ensures both understand the nature of the message and the modulations of the signals.
    • A datum is a value, usually taken at some point in space and time, that represents a measure on some suitable scale and provides context for interpretation.
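
The roles above can be sketched with a toy even-parity code (the framing scheme and bit values are illustrative, not from the text): the sender encodes a message into channel states, noise flips one state, and the shared protocol lets the receiver detect the corruption:

```python
def encode(bits):
    """Sender: frame the message with an even-parity bit (a simple code)."""
    return bits + [sum(bits) % 2]

def is_valid(frame):
    """Receiver: the agreed protocol says total parity must be even."""
    return sum(frame) % 2 == 0

message = [1, 0, 1, 1]
signal = encode(message)   # sequence of channel states sent as the signal
noisy = signal.copy()
noisy[2] ^= 1              # noise: a disturbance flips one channel state

is_valid(signal)  # True  -- the frame passes the protocol check
is_valid(noisy)   # False -- corruption detected (though not corrected)
```

A single parity bit only detects an odd number of flips; richer codes trade more redundancy for correction as well as detection.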
  • Information and knowledge are not subject to a conservation law.

    • Information creates more knowledge which creates more information.
    • Information can modify both the receiver’s state and the receiver’s behavior.
  • Information is analogous to entropy: information is maximized when entropy is maximized. Organization constrains the states of the system in the same way that information reduces uncertainty.
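
The analogy can be made concrete: among distributions over a fixed number of states, the uniform one has maximal entropy, while an "organized" system whose states are constrained has low entropy (the distributions are illustrative):

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

uniform = [0.25] * 4                   # unconstrained: every state equally likely
organized = [0.97, 0.01, 0.01, 0.01]   # organization: one state dominates

entropy(uniform)    # 2.0 bits, the maximum for four states
entropy(organized)  # ~0.24 bits
```
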

  • One quantity of interest for communication is the information per unit of power. That is, the efficiency of a signal.

    • Transduction occurs when an information channel admits one kind of energy flow in (generally, but not always, at higher power) and generates signals in another kind of energy flow (generally, but not always, at lower power).
    • Sensing (via sensors) allows a large energy flow to drive a smaller energy flow through a different route, producing an information flow.
      • Measurement can be thought of in two ways: (1) as inherent properties of the object; (2) as properties that emerge from the process of measurement (as we have defined it).
    • If another force modulates the source energy, the modulation will propagate through both the energy flow channel and the sensor channel.
    • Amplification occurs when a lower-power signal is used to control or modulate a higher-power flow.
      • An actuator is any device that can modulate a higher-power energy flow using a lower-power control signal. Actuators do work under the control of a small input signal.
  • The amount of information in a message is actually a property of the receiver (observer) and not of the sender (or the observed).

    • Another way to say this: the same message (data) can be interpreted differently by different systems, and thus contain different amounts of information, based on what each system expects.
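
This receiver-dependence can be sketched with surprisal: the same message carries different amounts of information depending on the receiver's prior expectation (the probabilities are invented for illustration):

```python
import math

def surprisal(p):
    """Information, in bits, conveyed by an event the receiver expected with probability p."""
    return -math.log2(p)

# The same message, "it rained today", received by two systems with different priors:
p_rain_coastal = 0.5   # this receiver half-expects rain
p_rain_desert = 0.01   # this receiver is very surprised by rain

surprisal(p_rain_coastal)  # 1.0 bit
surprisal(p_rain_desert)   # ~6.6 bits from the identical message
```
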
  • Knowledge can be defined as the cumulative expectations with which a system moves into the future. It is the internal structure that matches the capacity of the system to dissipate flows in a steady-state equilibrium.

    • Another way to put it: It is the fit between the system’s expectation and the actual state of the environment.
    • Having knowledge reduces the informational value received from stimuli; in fact, we may quantify knowledge as the inverse of information, because the more you know, the less you will be surprised.
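
One way to sketch "knowledge as the inverse of information" is with cross-entropy: the better the system's expectations fit the environment, the less average surprise each observation brings (the distributions are illustrative):

```python
import math

def expected_surprise(env, model):
    """Cross-entropy, in bits: average surprisal felt by a system whose
    expectations are `model` while the environment actually behaves as `env`."""
    return -sum(p * math.log2(q) for p, q in zip(env, model))

env = [0.7, 0.2, 0.1]          # actual frequencies of environmental states
naive = [1 / 3, 1 / 3, 1 / 3]  # no knowledge: all states equally expected
fitted = [0.7, 0.2, 0.1]       # knowledge: expectations match the environment

expected_surprise(env, naive)   # ~1.585 bits per observation
expected_surprise(env, fitted)  # ~1.157 bits: more knowledge, less surprise
```
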
  • Information can be contextual, that is, dependent on previously observed states.

    • Context can boost a signal and reduce noise.
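
A small illustration of context boosting a signal: in English text, the letter "u" is unconditionally rare, but after observing "q" it is nearly certain, so context collapses the surprisal (the probabilities are rough illustrative figures, not measured values):

```python
import math

p_u = 0.03           # rough unconditional probability of the letter 'u'
p_u_given_q = 0.98   # after seeing 'q', 'u' is almost certain

-math.log2(p_u)          # ~5.1 bits of surprise without context
-math.log2(p_u_given_q)  # ~0.03 bits once context is taken into account
```
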

Links