Let us be under no illusions regarding the importance of raw data. It is, by far, the most important asset for any human being, anywhere, ever, and I do not say that lightly. If you do not know how to drink, water becomes useless; if you do not understand the most basic elements of commerce, it is entirely possible to starve inside a grocery shop.
The attainment of knowledge such as this, and much, much more, is the reason we educate children for so long: so that they not only know the things they need to, but also know where to obtain more information as and when they require it, and how to understand the information they receive.
This contextual starting point, these narrative thoughts on the importance of information, its attainment and its understanding or decoding, is essential to grasping the controversy surrounding the release of classified cables, documents and files.
If, for example, we are unaware that governments communicate with their embassies overseas via cables, then finding out that secret cables have been released means nothing to us. We wouldn't know what they were, as many of us didn't until quite recently. Before Wikileaks and Bradley Manning, communications between embassy and state were not even considered by the public. In the same way, before Edward Snowden, the policies of GCHQ and the extent of its filtering algorithms for penetrating user-generated web content were so far out of the public focus that even when we were in uproar over SOPA and ACTA, at no point did we drift towards even being curious about data capture.
Data capture, which is to say the collection and storage of the information we create: telephone conversations, emails we have written, Facebook chat conversations and online profiles. Why, you may ask, would anyone be interested in the often banal and mundane conversations we have day to day on the telephone or via the internet? Most of it is complete drivel spewed by teens attempting to fragment the boredom of their everyday existence. Imagine, if you can, the inordinately monotonous task of having to sift through endless teenage messages covering everything from adolescent arguments about what x said about y to z, all the way to an almost infinite pile of internet pictures affectionately called memes.
Given such a vast and boring collection of data, I personally do not find it surprising that GCHQ and the NSA wrote an algorithm to filter the information rather than subjecting some poor soul to that wretched form of mental torture. I would also be very surprised if the filter were as limited as several military figures have claimed. Bear in mind that the general public can already buy computer memory components capable of storing several hundred terabytes, some of them smaller than a box of cigarettes; I cannot see the world's largest intelligence agencies having much trouble storing a few billion messages.
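As a purely illustrative sketch, a keyword filter over a pile of messages could look something like the following. The watchlist, the messages and every name in this code are my own invention for the sake of the example; nothing here reflects any real agency's system.

```python
# Hypothetical sketch of keyword-based message filtering.
# The watchlist and the sample messages are invented for illustration.

WATCHLIST = {"cable", "protest"}  # assumed example keywords


def matches_watchlist(message: str, watchlist: set[str]) -> bool:
    """Return True if any watchlist word appears in the message."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return not words.isdisjoint(watchlist)


messages = [
    "did you see what x said about y to z",
    "the embassy cable leaked last night",
    "lol check this meme",
]

# Only messages containing a watchlist word survive the filter;
# the drivel is discarded without a human ever reading it.
flagged = [m for m in messages if matches_watchlist(m, WATCHLIST)]
print(flagged)
```

The point of such a filter is exactly the one made above: it spares a human from reading billions of messages by throwing almost all of them away unread.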
So should we be worried? In short, no. We have known for a long time that internet surveillance was taking place; this is why people started posting anonymously, why browsers such as Tor and other proxy bouncers were created, and when sites such as Megashare and Silk Road are taken down and their owners "traced" we see still more evidence. I hold reservations as to whether this is a good thing, but I am acutely aware of how unlikely it is that the methods revealed by Mr Snowden will uncover anything relevant to stopping crime or terrorism, or indeed yield any information at all beyond an intolerable amount of raw, uselessly innocent data.
Obtaining the data is merely half of the process, as I mentioned at the beginning; the other half is decoding, deconstructing and understanding it. It is here that the problem truly lies for our intelligence communities: the larger the initial data set, the harder and longer such decoding becomes. Even a few hundred messages can become very difficult once we take into account the traits of fragmented human discourse: in-jokes, miscommunications, and multi-platform communication streams (a conversation that starts in person, evolves onto Facebook and ends with an email, for example). There are so many variables that deconstruction based on content becomes ever less likely to yield accurate results, so it is probably not the method used. Indeed, Mr Snowden points towards metadata being the primary deconstructed element: not what a person is saying, but who they are saying it to.
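To illustrate why metadata alone is so much more tractable than content, here is a minimal sketch that builds a who-contacts-whom web from nothing but sender and recipient fields, never touching a message body. The records and names are invented for the example.

```python
from collections import defaultdict

# Invented example metadata records: (sender, recipient) pairs only,
# with no message content at all.
records = [
    ("alice", "bob"),
    ("alice", "carol"),
    ("bob", "carol"),
    ("dave", "alice"),
]

# Build the communication web: who each person has been in contact with.
contacts = defaultdict(set)
for sender, recipient in records:
    contacts[sender].add(recipient)
    contacts[recipient].add(sender)  # treat links as undirected

print(dict(contacts))
```

Notice that no understanding of in-jokes, sarcasm or multi-platform threads is needed: the web falls out of the headers mechanically, which is precisely why metadata scales where content analysis does not.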
However, this is not entirely true. The argument posed, not just by Mr Snowden but also by many military officials, is that the only information available to agents of the NSA or GCHQ is the communication web, i.e. who is talking to whom; they are not allowed access to content unless they get a warrant issued by the courts. Yet they do know at least one element of the content: the word or words the algorithm is filtering for. They know that at least that word is present in the text, and so the argument, which still needs to be resolved, is how much partial content you can filter or fish for before content is deemed to be actively accessed. I will not dwell on this, as it is an active conversation, but my own answer is that it depends very much on the size of the initial data set, so I would express it as a percentage of accessed content, and I would not want that to be much more than 25%, beyond which I would consider it the beginning of the deconstruction and analysis of content.
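That 25% rule of thumb could be made concrete along the following lines. Both the threshold and the measure of "accessed content" here are my own suggestion, not any legal or official standard.

```python
THRESHOLD = 0.25  # my suggested cut-off, not any legal standard


def accessed_fraction(matched_words: int, total_words: int) -> float:
    """Fraction of a message's words exposed by keyword matching."""
    return matched_words / total_words


# A 20-word message where the filter matched 6 words:
frac = accessed_fraction(6, 20)
print(frac, frac > THRESHOLD)  # 0.3 True: content analysis territory
```

Under this measure, matching 6 of 20 words exposes 30% of the message, which by my own yardstick would already count as the beginning of content analysis rather than mere filtering.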
The final point I shall consider is the difference between Mr Manning (or is it Miss Manning? It is tough to know what is true in that case) and Mr Snowden. The biggest difference is the type of data revealed. Manning revealed diplomatic cables: what the USA was thinking and feeling about other countries and what it was doing to them in terms of military actions, infiltrations and negotiations. In other words, (s)he revealed truths, truths which America itself viewed as truths. In many cases (s)he revealed information about the US breaking international law, in others merely breaking decorum. Either way, it was actual information, not generated by Manning but distributed by him (or her).
This is in stark contrast to Mr Snowden, who revealed not direct information but methods of obtaining information, and partially revealed the NSA's and GCHQ's methods for decoding what they gather. What makes this really interesting is that Snowden was also partly responsible for developing those methodologies: he was a systems administrator and developer. In other words, instead of revealing direct data, he revealed systemic methods he had helped build for the NSA. Moreover, he has evaded capture, for the time being, and so has once more defied the US.
Their similarities should also be considered. That they both feel they acted from conscience, that they both still stand by their decisions, and that since their actions the world has not come to an end are all points in their favour. Add to this that they made some very nasty people panic and worry, and they become, perhaps not heroes, but people on the right side.