As someone who has a history of taking liberties with poorly-protected systems (everyone's young, once), I recognize the value of locking down technology. Because I know there are bored people out there and because I know there are truly malicious people out there, I make efforts to protect things against them. I understand the value of "systems security".
That said, I have to deal with others' interpretations of what it means to make a system "secure". In general, security is at odds with usability and functionality. The key to good security is finding "balance". Sadly, so much of systems security is left to people who've never broken a system and who've only ever read papers, articles and/or books on security. So, when someone writes a security recommendation, the typical security person takes that recommendation as gospel or comes to the wrong interpretation of it (or fails to consider the impact of actually following it).
This kind of blind approach to security always leaves me scratching my head. Invariably, the people implementing these policies in a context of ignorance leave gaping holes in systems. They'll lock down one avenue to a given piece of information. But, because they don't really understand the systems they're securing, they don't realize that there are a dozen other ways to get the same data (or that some data are critical to overall system usability and maintenance). In the end, it leaves you, as a system user, wondering "what the hell were they thinking" or "what the hell was the point of doing X". Today, what I found myself wondering was, "who the fuck removes `whereis` from a standardized UNIX deployment??"
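Purely as illustration (with "sshd" standing in as a hypothetical example of whatever binary you're hunting), here are a few of the obvious ways to get the same answer on a box where `whereis` has been stripped out, assuming nothing more exotic than a stock bash and the usual userland:

    command -v sshd                          # POSIX way to ask the shell where it would find the command
    type -a sshd                             # bash builtin; lists every match on $PATH
    which sshd                               # almost always still installed
    locate bin/sshd                          # works if the locate database wasn't purged too
    find / -type f -name sshd 2>/dev/null    # slow and noisy, but you can't "remove" it

Deleting one convenience utility doesn't make the information go away; it just slows down the people doing legitimate work.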