As people spend more time with computers, their reliance on websites and Internet service providers grows. The government's ability to monitor these technologies has kept pace.
In early June, Edward Snowden leaked details to The Washington Post and The Guardian about a secret NSA program called "PRISM." Government agencies in the U.S. and the U.K. worked with nine of the largest Internet and tech companies to secretly obtain all kinds of data: messages, personal videos, emails, pictures, blog posts and Internet searches.
Moving forward, the greatest concern lies not in what information is entrusted to the government, but rather in how that information is used and who is delegated access to it.
If information is collected for a specific purpose and is necessitated by a clear, compelling government interest, then the government has a legitimate reason to monitor it. Most often, that purpose is counterterrorism, but monitoring can also combat identity theft, among other crimes.
Fishing expeditions are what separate reasonable government searches from unreasonable ones. Citizens ought not be comfortable with the government combing through billions of packets of data when law enforcement has far more direct methods at its disposal, such as surveilling a particular suspect's Internet activity.
Compare the Supreme Court's June 3 decision in Maryland v. King, which allows law enforcement officers to use DNA obtained from a suspect at the time of arrest. Like PRISM, it raises serious questions about the government's role in handling our information, and in both cases citizens must be able to evaluate the purposes these practices serve.
Despite the Supreme Court’s unwillingness to say so, the real purpose of taking an arrestee’s DNA is to potentially match it against cold cases which would otherwise have no leads and no chance of being solved. This is legitimate because the government collects a specific set of data — a suspect’s genetic code — whereas PRISM collects an unduly large set of data to search for legal transgressions.
What is most shocking about PRISM is the sheer volume of data it gathered. In a 30-day period in March, the NSA collected 97 billion pieces of information from computer networks across the world, 3 billion of which came from within the U.S., according to The Guardian. Despite testimony from NSA officials that a significant number of attacks were prevented on both sides of the Atlantic, there must be ways to coordinate surveillance of these criminal groups without sifting through the data of millions of Americans.
Case in point: In 2012, Tarek Mehanna of Massachusetts was convicted of conspiring to help al-Qaida after he was found to have translated documents for the terrorist group. The government has the resources to monitor and scan websites, such as Mehanna's, which actually work with enemies of democracy.
Instead of implementing broad, secretive programs like PRISM, the government must preserve some form of uniform accountability. It must publicize the guidelines that stipulate when information will be accessed, how long it will remain on government servers, why it is being accessed and whether collecting the data fulfills a narrowly tailored government interest. Otherwise, Americans could lose the very ideal of freedom upon which the U.S. was built.
Jack Merritt is a sophomore majoring in history.