Many readers may not be intimately familiar with Bruce Schneier; for those who aren't, here are some facts about him (via Bruce Schneier Facts):
-Bruce Schneier can watch a Blu-ray encrypted movie, just by looking at the disc with his naked eye.
-For Bruce Schneier, all zeros of the Riemann zeta function are trivial.
-Heisenberg’s Uncertainty Principle doesn’t protect your qubits from Bruce Schneier. Bruce knows with certainty.
-Bruce Schneier’s fists violate the anti-circumvention clause of the Digital Millennium Copyright Act.
-Bruce Schneier can tap fiber optic cable just by smelling it.
-Cryptanalysis doesn’t break cryptosystems. Bruce Schneier breaks cryptosystems.
-Bruce Schneier’s cryptographic know-how can’t even fit in a ZFS filesystem.
And perhaps the most telling description of him is contained in this piece of Zen:
Science is defined as mankind’s futile attempt at learning Bruce Schneier’s private key.
Now, more seriously: the above are jokes, in the memetic vein of “Chuck Norris Facts”… only cooler, owing to being about Bruce Schneier, not “Chuck Norris”, whoever that may be.
Security theatre refers to security measures that make people feel more secure without doing anything actually to improve their security. An example: the photo ID checks that have sprung up in office buildings. No-one has ever explained why verifying that someone has a photo ID provides any actual security, but it looks like security to have a uniformed guard-for-hire looking at ID cards. Airport-security examples include the National Guard troops stationed at US airports in the months after 9/11 – their guns had no bullets. The US colour-coded system of threat levels, the pervasive harassment of photographers, and the metal detectors that are increasingly common in hotels and office buildings since the Mumbai terrorist attacks, are additional examples.
This selection from Schneier’s essay titled “Virginia Tech Lessons: Rare Risks and Overreaction” speaks to a very interesting social-psychology phenomenon:
“In other words, proximity of relationship affects our risk assessment. And who is everyone’s major storyteller these days? Television. (Nassim Nicholas Taleb’s great book, The Black Swan: The Impact of the Highly Improbable, discusses this.)
Consider the reaction to another event from last month: professional baseball player Josh Hancock got drunk and died in a car crash. As a result, several baseball teams are banning alcohol in their clubhouses after games. Aside from this being a ridiculous reaction to an incredibly rare event (2,430 baseball games per season, 35 people per clubhouse, two clubhouses per game. And how often has this happened?), it makes no sense as a solution. Hancock didn’t get drunk in the clubhouse; he got drunk at a bar. But Major League Baseball needs to be seen as doing something, even if that something doesn’t make sense — even if that something actually increases risk by forcing players to drink at bars instead of at the clubhouse, where there’s more control over the practice.
I tell people that if it’s in the news, don’t worry about it. The very definition of “news” is “something that hardly ever happens.” It’s when something isn’t in the news, when it’s so common that it’s no longer news — car crashes, domestic violence — that you should start worrying.
But that’s not the way we think. Psychologist Scott Plous said it well in The Psychology of Judgment and Decision Making: “In very general terms: (1) The more available an event is, the more frequent or probable it will seem; (2) the more vivid a piece of information is, the more easily recalled and convincing it will be; and (3) the more salient something is, the more likely it will be to appear causal.”
So, when faced with a very available and highly vivid event like 9/11 or the Virginia Tech shootings, we overreact. And when faced with all the salient related events, we assume causality. We pass the Patriot Act. We think if we give guns out to students, or maybe make it harder for students to get guns, we’ll have solved the problem. We don’t let our children go to playgrounds unsupervised. We stay out of the ocean because we read about a shark attack somewhere.
It’s our brains again. We need to “do something,” even if that something doesn’t make sense; even if it is ineffective. And we need to do something directly related to the details of the actual event. So instead of implementing effective, but more general, security measures to reduce the risk of terrorism, we ban box cutters on airplanes. And we look back on the Virginia Tech massacre with 20-20 hindsight and recriminate ourselves about the things we should have done. In fact, the incident has been used as evidence both for and against gun control.”
For interest’s sake, here are some of his most recent essays, op-ed pieces, and commentary:
March/April 2011 • IEEE Security & Privacy
January 28, 2011 • CNN
Why Terror Alert Codes Never Made Sense
January 2011 • Information Security
Schneier-Ranum Face-Off on Whitelisting and Blacklisting
December 2, 2010 • Financial Times
It Will Soon Be Too Late to Stop the Cyberwars
December 2, 2010 • The Atlantic
Why the TSA Can’t Back Down
December 2, 2010 • The New York Daily News
Close the Washington Monument
November 2010 • Information Security Magazine
The Dangers of a Software Monoculture
November 23, 2010 • New York Times Room for Debate Blog
A Waste of Money and Time
November 11, 2010 • Forbes
The Plan to Quarantine Infected Computers
November 10, 2010 • Dark Reading
When to Change Passwords