• More information is not necessarily good. Sometimes information can cause harm.
  • Information hazards are risks that arise from the dissemination or the potential dissemination of true information that may cause harm or enable some agent to cause harm.

Typology by Data Transfer Mode

  • Data Hazard - Specific data that, if disseminated, creates risk.
    • Example: The genetic sequence of a deadly pathogen. Nuclear codes.
  • Idea Hazard - A general idea that, if disseminated, creates a risk, even without a data-rich detailed specification.
    • Example: The idea of using a fission reaction to create a bomb, or the mere demonstration thereof.
  • Attention Hazard - The mere drawing of attention to some particularly potent or relevant ideas or data increases risk, even when these ideas are already known.
    • Example: Drawing the adversary’s attention to a subset of especially potent avenues can greatly facilitate the search.
    • Consequently, even well-intentioned efforts to map out a risk area may do more harm than good by concentrating attention on its most dangerous parts.
  • Template Hazard - The presentation of a template enables distinctive modes of information transfer and thereby creates risk.
    • Example: Bad role models that inspire unwanted behavior; individuals may emulate the personality of the model.
  • Signaling Hazard - Verbal and non-verbal actions can indirectly transmit information about some hidden quality of the sender, and such social signaling creates risk.
    • Example: Merely studying conspiracy theories can be hazardous to academics, because the association can damage their reputations.
  • Evocation Hazard - There can be risk that the particular mode of presentation used to convey some content can activate undesirable mental states and processes.
    • Example: A vivid description of some event may activate psychological processes that lie dormant when the same event is recounted in dry prose.

Typology by Effect

Adversarial Risks

Competitive Hazard

  • By obtaining information, some competitor will become stronger, thereby weakening our position.
  • Example: The rival job applicant knew more and got the job.

Enemy Hazard

  • By obtaining information, our enemies or potential enemies become stronger, increasing the threat they pose to us.

Intellectual Property Hazard

  • There is a risk that some other competitor will obtain our intellectual property, thereby weakening our competitive position.

Commitment Hazard

  • Obtaining some information can weaken one’s ability to commit credibly to some course of action.
  • Example: Blackmail works only because the target knows that the blackmailer holds incriminating information.

Knowing-too-much Hazard

  • Possessing some information can make us a potential target or an object of dislike.
  • Example: Accusations of witchcraft during the European witch hunts.
  • Example: Being a witness to a crime.

Risks to Social Organization and Markets

Norm Hazard

  • Some social norms depend on a coordination of beliefs or expectations among many subjects; and a risk is posed by information that could disrupt these expectations for the worse.
  • Example: In a money-dependent society, counterfeiting or knowledge thereof is a hazard.

Information Asymmetry Hazard

  • Information Asymmetry is a scenario wherein one party in a transaction has more information than the other.
  • The hazard arises when one party to a transaction can gain information that the others lack, giving it an advantage over them.

Unveiling Hazard

  • Information Symmetry - a scenario wherein both parties have equal information.
  • The functioning of some markets and the support for some social policies depend on a shared veil of ignorance, the lifting of which can undermine those markets and policies.
  • Example: Unveiling patented technologies may enable the development of negatively disruptive technologies.
  • Example: Support for free speech and minority rights might weaken if individuals could be sure they would never find themselves in a persecuted minority and that censorship would never apply to them.

Recognition Hazard

  • Some social fiction depends on some shared knowledge not becoming common knowledge, or not being publicly acknowledged. Public release of information could ruin the pretense.
  • Example: An incriminating open secret.

Risks of Irrationality and Error

Ideological Hazard

  • An idea might, by entering into an ecology populated by other ideas, interact in ways which, in the context of extant institutional and social structures, produce a harmful outcome even in the absence of any intention to harm.
  • It refers to the possibility that somebody will be misled in a bad direction because of the way some information interacts with false beliefs or incomplete knowledge.
  • Example: Extremism.

Distraction and Temptation Hazards

  • Information can harm us by distracting us or presenting us with temptation.
  • Example: Re-exposure of a rehab patient to alcohol.
  • Example: Hyperstimulation.

Role Model Hazards

  • An individual can be corrupted and deformed by exposure to bad role models.
  • Example: Copycat murderers and suicides.

Biasing Hazard

  • Information can lead us further astray when it amplifies or triggers our biases.
  • Example: Falling into more extremist viewpoints as a result of information that seemingly supports those viewpoints.
  • Example: Knowledge of human biases that reduces the ability to learn.

De-biasing Hazard

  • Beneficial societal and moral biases could be eroded as a result of information.
  • Example: If society benefits from risk-takers, who rely on a bias toward achievement, then removing this bias could hinder society.

Neuropsychological Hazard

  • Information might have negative effects on our psyches because of the particular ways our brains are structured. These effects would not arise in more idealized cognitive architectures.
  • Example: Mental illnesses and disorders.

Information Burying Hazard

  • Irrelevant information can make relevant information harder to find, thereby increasing search costs for agents with limited computational resources.
  • Example: Cryptography.
  • Example: Legalese and doublespeak.

Risks to Valuable States and Activities

Psychological Reaction Hazard

  • Information can reduce well-being by causing sadness, disappointment, or some other psychological effect in the viewer.

Disappointment Hazard

  • Our emotional well-being can be adversely affected by the receipt of bad news.

Spoiler Hazard

  • Fun that depends on ignorance and suspense is at risk of being destroyed by premature disclosure of truth.

Mindset Hazard

  • Our basic attitude or mindset might change in undesirable ways as a consequence of exposure to information of certain kinds.
  • Example: Learning about history may change our ideologies.
  • Example: The reduction of the wonder of life due to explaining everything via science.
  • Example: Learning about the past makes us more nihilistic and cynical.

Belief-constituted Value Hazard

  • If some component of well-being depends constitutively on epistemic or attentional states, then information that alters those states might thereby directly impact well-being.
  • Example: Learning more about others influences how we think about and perceive them.

Mixed type

Embarrassment Hazard

  • We may suffer psychological distress or reputational damage as a result of embarrassing facts about ourselves being disclosed.

Risks from Information Technology Systems

Information System Hazard

  • The behavior of some (non-human) information system can be adversely affected by informational inputs or system interactions.

Information Infrastructure Failure Hazard

  • There is a risk that some information system will malfunction, either accidentally or as a result of cyber attack; as a consequence, the owners or users of the system may be inconvenienced, third parties whose welfare depends on the system may be harmed, or the malfunction might propagate through some dependent network, causing a wider disturbance.

Information Infrastructure Misuse Hazard

  • There is a risk that some information system, while functioning according to specifications, will serve some harmful purpose and will facilitate its achievement by providing useful information infrastructure.

Artificial Intelligence Hazard

  • There could be computer-related risks in which the threat derives primarily from the cognitive sophistication of the program rather than from the specific properties of any actuators to which the system initially has access.

Robot Hazard

  • There are risks that derive substantially from the physical capabilities of a robotic system.
  • Example: Malfunction of an autonomous vehicle leading to fatalities.

Risks from Development

Development Hazard

  • Progress in some field of knowledge can lead to enhanced technological, organizational, or economic capabilities, which can produce negative consequences (independently of any particular extant competitive context).
  • Example: The nuclear bomb.
  • Example: Genetic engineering.
  • Example: Even the foundational building blocks of these fields can be hazardous.

Responses

  • Sometimes the best response is no response (e.g., for attention hazards).
  • Sometimes the best response is to not invest time in discovering such hazards (e.g., for spoiler hazards).
  • Sometimes the danger lies in partial information, and so the remedy is to provide full information.
  • Sometimes the danger lies in a lack of policy, in which case policies should be developed.
    • Remember that policies centered on information restriction tend to be captured by special interests.
    • A policy of openness tends to have better results: if knowledge is power, then secret knowledge breeds unaccountable power, and such power breeds corruption.
    • At the same time, the pursuit of knowledge can itself lead to greater hazards.
  • It is said that a little knowledge is a dangerous thing. Whether more knowledge is safer remains an open question.
