November 10, 2011:
The CIA recently revealed that it has set up an "Open Source Center" that trawls the Internet for useful information. For several years after September 11, 2001, the CIA was urged to do this. It invited people who ran open source analysis web sites (like StrategyPage) down to Langley to explain how this was done. The message most of these visiting lecturers gave was basically the same: the information is out there, you just have to go take it and use it.
It was also pointed out that marketing and BI (Business Intelligence, or corporate espionage) operations were already using what the CIA eventually built for its Open Source Center: software that gathers all this information, filters and organizes it, and then turns it over to analysts to be sorted out or, in many cases, translated more accurately. That last step is necessary because, while machine translation software can automatically translate all those tweets and postings so the material can be identified and put in a database, really useful (to the CIA) intelligence requires skilled linguists and analysts to double check the output, and to find out whether the selecting and sorting software needs to be tweaked (it often does).
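The gather-filter-translate-review pipeline described above can be sketched in a few lines. This is only an illustration of the general technique; all names (the `Item` record, the keyword list, the `machine_translate` stand-in) are assumptions, not the actual software the CIA or any BI firm uses.

```python
# Illustrative sketch of an open source ingestion pipeline:
# gather items, machine-translate, filter by keyword, flag for human review.
from dataclasses import dataclass

@dataclass
class Item:
    source: str           # e.g. a news site or social media feed (hypothetical)
    text: str             # original-language text
    lang: str             # language code
    translation: str = ""
    needs_review: bool = False

KEYWORDS = {"protest", "strike", "checkpoint"}  # hypothetical filter terms

def machine_translate(text: str, lang: str) -> str:
    """Stand-in for a real machine translation service."""
    return text if lang == "en" else f"[MT from {lang}] {text}"

def ingest(raw_items: list[Item]) -> list[Item]:
    kept = []
    for item in raw_items:
        item.translation = machine_translate(item.text, item.lang)
        # Keep only items matching the keyword filter.
        if any(k in item.translation.lower() for k in KEYWORDS):
            # Non-English items get flagged for a human linguist to verify,
            # since machine translation alone is not reliable enough.
            item.needs_review = item.lang != "en"
            kept.append(item)
    return kept

feed = [
    Item("local_paper", "Strike announced at the port", "en"),
    Item("forum", "protest march near the checkpoint", "en"),
    Item("blog", "weather is pleasant today", "en"),
]
for item in ingest(feed):
    print(item.source, item.needs_review)
```

The point of the sketch is the division of labor: cheap automated filtering and translation up front, with expensive human attention reserved for the items that survive the filter.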
This massive, real-time combing of social media and open (to anyone) message traffic has yielded a much more accurate and timely analysis of political, religious, cultural and military trends worldwide. It has also made the deployment of agents and other scarce resources (reconnaissance and electronic eavesdropping satellites, aircraft and ships) more effective.
The impetus for the Open Source Center actually came from within the CIA, but it was the post-September 11, 2001 urgency, and obvious examples of civilian organizations using open source material, that got the CIA brass (and sufficient cash) on board. Once the Open Source Center began to show results (which happened quickly), it was easier to admit that this sort of thing had been going on for a long time. From the beginning, the CIA depended a lot on simply reading (and clipping) foreign newspapers, plus having agents wander about in foreign nations and report what was being heard on "the street." But that approach was more time consuming and collected less data than the Open Source Center. Moreover, with all this open source data in a database, it is possible to use widely available (or custom made) analysis software to extract all sorts of useful, but not initially obvious, information.
The Open Source Center still uses agents on the spot, collecting what "the street" is saying. But now those reports go into a database, where they undergo further analysis and comparison with what the "Internet street" is saying. Often these two "streets" are not saying the same thing, which is itself valuable information. In many parts of the world, only a small portion of the population is on the Internet, though the proliferation of cell phones that can access the Internet is changing that. You have to track all of this in order to know where these different "streets" are going.
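Comparing the two "streets" can be sketched with something as simple as word-frequency overlap: topics prominent in one channel but absent from the other are exactly the divergence the article says is valuable. The function names and sample reports below are hypothetical; real systems would use far more sophisticated topic modeling.

```python
# Hedged illustration: find topics that appear in one reporting channel
# (agents on the street) but not the other (online postings), or vice versa.
from collections import Counter

def topic_counts(reports: list[str]) -> Counter:
    """Count words across a batch of short reports (crude topic proxy)."""
    counts = Counter()
    for report in reports:
        counts.update(report.lower().split())
    return counts

def divergence(street: list[str], online: list[str]) -> set[str]:
    """Words prominent in one channel but entirely absent from the other."""
    a, b = topic_counts(street), topic_counts(online)
    return (set(a) - set(b)) | (set(b) - set(a))

street_reports = ["fuel prices rising", "fuel lines at stations"]
online_posts = ["election rally tonight", "rally crowds growing"]
print(sorted(divergence(street_reports, online_posts)))
```

Here the street channel is dominated by fuel complaints and the online channel by an election rally; flagging that mismatch for an analyst is the useful output.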