With all of the news from Japan, and some additional information that’s been published, I thought it might be a nice time to revisit something I wrote last year about the “naked scanners” in use at airports.
“Cissna said she is not very sensitive about her mastectomy and did not have breast reconstruction surgery…. Yet she is acutely sensitive to being aggressively touched in private areas because of a traumatic touching experience she underwent as a child, she said.”
The TSA screener “put her full hand on my breast and said, ‘What is this?’” Bossi told the station. “And I said, ‘It’s my prosthesis because I’ve had breast cancer.’ And she said, ‘Well, you’ll need to show me that.’”
There have not been many public issues with transfolk traveling, but it is a big topic of discussion.
Also? Now whenever those people are Googled, what’s going to come up? That they were molested as a child, carry their urine in a bag, or have a fake boob. That’s information that people would rather be allowed to share on their own.
Some legislators have suggested charging TSA agents with “sexual assault.” That’s not going to solve the problem. (And, frankly, I don’t think using “rape” in this context is productive at all; it trivializes a very violent and traumatic crime.) TSA employees are mostly low-wage Joes and Janes who are carrying out their orders. (And adding to the security theater by flying in lingerie isn’t helping.)
The solution is better training, and more options. Many of the people who’ve had issues with TSA patdowns did go through the advanced scanners of one type or another–but were then pulled aside for additional searches because they were physically different.
Until we learn to deal with difference sensitively–whether it’s a Muslim woman in a headscarf or a non-standard body part–this will remain a problem.
But what about the second item discussed in that post, the actual scanners themselves? An article I linked to had the title “Felt up or Blown Up.” That pretty much sums up how a lot of people perceive the dilemma they face when flying.
People generally are more afraid of technology-induced hazards than naturally occurring ones. We also tend to fear risks we can’t control more than risks we choose for ourselves (flying in an airplane vs. driving, for example).
The National Safety Council produced this excellent visualization of the comparative risks of different activities. Of course, the ultimate probability of your death is 100%.
The way in which a risk is communicated (“Radiation = bad!”, for example) can make a big difference in how we perceive it. As another recent post on Skepchick makes clear, doctors seem to be telling people not to have “too many” X-rays and CT scans. That’s probably due to a recent paper, Elements of Danger: Medical Imaging, which discussed the way patients were ending up with large cumulative exposures to ionizing radiation. It suggested that doctors seek to reduce the use of some types of medical scanning.
So, doctors pass that information on to their patients–as they should. But, how does that relate to planes and scanners?
Doctors and laypeople (including ME!) are not discriminating between different types of radiation, and so are coming to the incorrect conclusion that the airport scanners carry a meaningful risk.
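To see why lumping all “radiation” together is misleading, here’s a quick back-of-the-envelope comparison. The dose figures below are approximate, commonly cited order-of-magnitude values that I’m supplying for illustration; they are not from this post, so treat them as a sketch rather than authoritative numbers.

```python
# Rough, order-of-magnitude effective doses in millisieverts (mSv).
# Approximate figures from commonly cited references; illustrative only.
doses_msv = {
    "airport backscatter scan": 0.0001,  # roughly 0.1 microsievert
    "cross-country flight":     0.04,    # cosmic-ray exposure at altitude
    "chest X-ray":              0.1,
    "chest CT scan":            7.0,
}

# Compare everything to a single pass through the scanner.
baseline = doses_msv["airport backscatter scan"]
for source, dose in sorted(doses_msv.items(), key=lambda kv: kv[1]):
    print(f"{source:26s} {dose:8.4f} mSv  (~{dose / baseline:.0f}x a scanner pass)")
```

Even with generous error bars on those numbers, the flight itself exposes you to far more radiation than the scanner at the gate, and a single CT scan dwarfs both.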
So, learn from my fail: let’s explore why that mistake happens. Humans use heuristics to infer what will happen in a new situation. (“Heuristic” is a fancy way of saying you use past experience to predict an outcome when you don’t have much time to reason it through.)
One thing that affects how people assess risk of injury is availability. In other words, how easily can you think of an instance that is similar? Even though injuries from radiation are rare, they are highly publicized (and horrible), and so may seem disproportionately common.
Distrust of the government and medical authorities to keep us “safe” from radiation probably stems, in part, from very public mistakes. That isn’t, though, a problem with the science; that is human error. And unfortunately, there is no way to completely eliminate human error in any system.
Another type of heuristic is, ironically, trust in authority. I assumed my doctor knew what she was talking about, mostly because I don’t have time to check everything she tells me. Heuristics exist so we can make decisions quickly.
So those are a couple of forms of heuristic bias. There is also some interesting evidence that the way an issue is reported can be just as influential as the facts that are reported:
“news stories of public meetings filled with distrust and controversy led to ratings indicating greater perceived risk than news stories reporting no distrust or controversy, even though the risk information was held constant.” (emphasis mine)
This is a rather depressing conclusion. What it implies is that any time there is a controversy, the hype will trump the facts. It predicts dismal prospects for skeptics trying to get good information across in the face of active opposition.
In my case, knowing that a small group of scientists had questions about the scanners, combined with my doctor’s advice, played a big role in my musings over flying. What I forgot was that a “thinking out loud” exploratory post about my thought process before stepping on a plane isn’t just a conversation I have with myself; other people read it. And they rightly call me out when I get details wrong.
Check out this brand-new paper from the American Medical Association about the risks of airport full-body scanning. Later this week, I’ll post an interview with a health physicist to discuss the different types of radiation and how they affect humans.
And now I’m going to go watch one of my favorite radiation disaster movies, Them!