Paul Krugman points out the obvious about the attempted airline bomber slipping through the cracks: there's too much information for anyone to sort through. That's the problem with this data mining they think is such a wonderful idea. Sweep everyone in, open up possibilities for abuse if the information gets into the wrong hands, and miss the stuff you actually ought to be worried about.
KRUGMAN: I think we do want -- I mean, someone's head ought to roll over this. Something needs to be looked at. But if you read your military history, every major military surprise that ever happened, there were ample warnings. You go back to the record; you find out there was information.
The trouble is, there is so much information. You know, there's 500,000 people on this list we're talking about. Stuff is going to fall through the cracks. Ultimately, you do what you can, but someone who is prepared to die while killing a bunch of civilians, that's going to happen now and then. In fact, we're quite lucky it didn't happen now.
But, you know, I think -- I think we are using a lot of 20/20 hindsight, which is the kind of thing that always happens whenever anything goes wrong.
DOWD: Well, to me, OK, so the situation now is, what do we do in the aftermath of this?
So what it looks like we usually do is we profile an article of clothing, not the person.
And so we're reluctant, because of political correctness, to profile a person. But the shoe bomber happens, and now we all have to put our shoes on the conveyor belt to go through, and we're still not going to profile a person.
This guy's underwear is on fire...
I'm afraid what we next have to profile...
(UNKNOWN): Everyone's going to have to wear their underwear on the outside.
Sadly, they may not be that far off the mark about how much worse airline security is going to get now.