How to Lie with Maps: When Open Source and National Security Collide
On Friday, I received an article, published by C|NET and reprinted on CNN, entitled "California lawmaker wants to blur Google Earth." I spent the weekend driving around my county with a set of maps and a GPS device, plotting and ground-truthing a variety of sites where we can put operators for an upcoming drill. When I finally got around to reading the article, it left me wondering whether Assemblyman Anderson has taken leave of his senses.
This is not the first time the world of Open Source has collided with the demands of national security. The most notable example was the release of Phil Zimmermann's PGP software in the early 1990s. But is this really an issue of national security?
I have always loved maps. So much so, that I wanted to be a cartographer when I grew up. I even picked the universities I wanted to attend based on their offerings of cartographic programs, minors and majors. In the 1980s, when I was entering university, the science of the art of cartography was changing from a pure hand-based art to one that was more and more based on computer systems, and databases, long held on paper, were being moved to silicon and steel. This was a big deal. It was not enough to have a steady hand and a keen eye; you now had to be a computer technician as well.
According to the bill introduced by Assemblyman Anderson, commercial web sites "shall not provide aerial or satellite photographs or imagery of a building or facility in this state that is identified on the Internet Web site by the operator as a school or place of worship, or a government or medical building or facility" and "shall not provide street view photographs or images of the buildings and facilities". His argument is based on the assumption that the planners of the attacks in Mumbai used tools such as Google Earth, and that such information, freely available today, could be used to coordinate similar attacks in the United States. For most people, this argument rings true. We see examples of these sorts of maps and images every day on television, and in some cases, these representations even come close to what the state of the art can provide. As a trained geographer, however, I would suggest that Assemblyman Anderson needs to actually use some of these tools, and then get out of his office and go for a walk.
To test the theory that you can get accurate planning documents from the Internet without ever visiting the site, I picked a site I know well, one you can follow along with, and one that is heavily protected, well known and easily identifiable: the White House, located at 1600 Pennsylvania Ave. NW, Washington, DC. It is hardly a state secret that this is one of the most fortified buildings in the United States, and it was even before the unfortunate events of September 11, 2001. There is a heavy police presence around the building at all times, and it is identified not only on Google Maps, but also on printed maps published by the United States Government itself.
Herein lies the problem with the Assemblyman's requirement. Paper maps showing key locations have been available for most places in the world for decades, and even a simple topographic map of Washington, DC would give a would-be bad guy enough planning information to make a visit to Google unnecessary. The 7.5-minute Washington West quadrangle, which contains the White House and most of the major tourist sites in Washington, DC, is a product of the United States Government. There are literally thousands of these maps in circulation today, in books, software and other forms. The government makes the underlying data available free of charge to anyone who wants it, and users of Open Source tools like GRASS are very happy for this.
Let's do an experiment. As I write this, I am looking at an image of the White House in Google Maps on-line. If you do the same thing and zoom in, you will note that the roof and its supporting structures are not blurred. Further, if you look at the copyright date, you will see that the image appears to be current, which would seem to justify the Assemblyman's concern that we are giving away sensitive information.
Zoom in and adjust your view until Pennsylvania Avenue almost fills your screen, then move east (right) until you see a series of round objects clustered in a group of four (they are the easiest to spot). These are planters. Just to the left, there are two more in a row, a white object laid across the street north-south, and more planters in the shadows on the south side of the street. If you are not a resident of DC, you would assume these are actually in place; after all, the image is current. What you are looking at is the first-generation security measure put in place to close Pennsylvania Avenue to traffic following the bombing of another American city, Oklahoma City, in 1995! I can assure you, as I walk that way every day, that there are no planters in the middle of the street anymore. In fact, if you switch to a street-level view from 15th Street looking back toward the White House, you will get a better idea of the current state of security, but you can get no closer than the middle of 15th Street.
My point here is two-fold. First, Google is already working to blur the images: it is using out-of-date images for bird's-eye views, and it is restricting access to major monuments at street level. As a further experiment, I picked two schools, one in my neighbourhood, a suburb of DC, and one in DC that I happen to know well. Neither was visible in any useful fashion. Further, many of the entrances to the Metro system (subway) are conveniently blocked by strategically placed buses and other vehicles. Second, most of this information is valueless without ground-truthing, the act of visiting the site and verifying that a road, path or obstacle present or absent on a map is in fact present or absent on the ground.
Could a skilled bad guy glean a lot of information from Google Earth or a similar system? Sure, just as you or I can plot a route to maximize our walking time or minimize our commute. I would argue that I could get better information from a 7.5-minute quadrangle, because it would also let me generate points I could enter into my GPS for turn-by-turn directions. But until we actually go through the motions, it is just static data. And I would suggest that the attackers in Mumbai, once they had planned their route on paper, walked, rode and photographed any number of things that were not available publicly.
Access to information is just that. It is not evil or nefarious, nor a prayer answered from above. It is data to be consumed and manipulated. It exists in any number of forms, and it will continue to exist long after access to it has been cut off by those with small minds, bore-sighted on one small piece of the puzzle. The data exists. The Internet, that original act of Open Source, just makes it a little easier to access. What we do with it is our responsibility.
Shameless plugs: if you are interested in the history of lying with maps, both intentional and unintentional, I would encourage you to read How to Lie with Maps. For more on Open Source GIS tools, O'Reilly has several books in its Mapping topic area.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can locate a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as one that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality means UNIX system administrators always seem to have the right tool for the job.
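The find-plus-grep combination described above can be sketched in a few lines. The /tmp/logdemo sandbox and its sample files are invented here purely so the example is self-contained and safe to run; the text's version would point find at /home instead:

```shell
# Build a small sandbox with two .log files (illustrative paths only).
mkdir -p /tmp/logdemo/app /tmp/logdemo/db
printf 'INFO start\nERROR disk full\n' > /tmp/logdemo/app/server.log
printf 'INFO ok\n' > /tmp/logdemo/db/query.log

# Find every .log file under the tree and search each for "ERROR".
# grep -l prints only the names of files that contain a match.
find /tmp/logdemo -name '*.log' -exec grep -l 'ERROR' {} +
# → /tmp/logdemo/app/server.log
```

The `-exec … {} +` form hands find's results to a single grep invocation rather than spawning one process per file, which is the idiomatic way to chain the two tools.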
Cron has traditionally been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to upgrade your job-scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide