How Edward Snowden Used a Web Crawler to Download NSA Data

The former NSA security contractor downloaded more than 200,000 top-secret documents from NSA servers using a very simple automated process, according to a report from The New York Times.

An NSA official told The New York Times that Snowden used widely available software known as a “web crawler”, which is typically used to download an offline copy of a website. The official added that it was clearly an automated process, not a case of Snowden grabbing all the data in one sitting.

Snowden downloaded the data from the agency’s internal “wiki”, a shared resource for NSA security analysts around the world.

A well-known example of a web crawler is Googlebot, which scans and automatically downloads content from billions of websites so that Google can serve better search results later. Snowden’s crawler was not as advanced as Google’s, but it worked in a similar way, the official said.
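To make the mechanism concrete, here is a minimal sketch of what such a crawler does, written in Python. This is an illustration of the general technique, not the tool Snowden used; the seed URL, output directory, and page limit are hypothetical placeholders.

```python
# Minimal web-crawler sketch: fetch a page, save a local copy, and
# follow same-host links breadth-first -- the same basic loop that
# site-mirroring tools and search-engine bots perform at scale.
# Seed URL, output directory, and page limit are illustrative only.

import os
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, out_dir="mirror", limit=50):
    """Breadth-first crawl: fetch, save, enqueue unseen same-host links."""
    os.makedirs(out_dir, exist_ok=True)
    host = urlparse(seed).netloc
    seen, queue = {seed}, deque([seed])
    count = 0

    while queue and count < limit:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to download

        # Save a local copy, named after the crawl order.
        path = os.path.join(out_dir, f"page_{count}.html")
        with open(path, "w", encoding="utf-8") as f:
            f.write(html)
        count += 1

        # Extract links and enqueue unseen ones on the same host.
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)


if __name__ == "__main__":
    crawl("https://example.com/")  # placeholder seed URL
```

The key point this illustrates is how little sophistication is required: a short loop that downloads pages and follows the links it finds can mechanically sweep up an entire site, which is why the report describes the process as simple and automated.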

Using such software would normally alert administrators, but because Snowden was posted at a remote outpost in Hawaii, where security measures had not yet been upgraded, his activity went largely unnoticed. He was challenged a few times about it, but he managed to satisfy the questioners each time.

Snowden is currently a fugitive from U.S. investigators and is living in Russia under asylum.
