Does Being a Journalist on the Internet Mean Never Having to Say You’re Sorry?

Since the NSA surveillance story broke, a number of journalists (e.g., at Mother Jones and The Nation) have noted that the original reporting by Glenn Greenwald/The Guardian and Barton Gellman/Washington Post was flawed, perhaps profoundly so. Greenwald and Gellman have been sort-of backpedaling from some of their original claims, although neither to my knowledge has given a full and clear accounting of what they got wrong and why.

In the era of entirely dead tree newspaper publishing, journalists who made mistakes in reporting and subsequently walked a story back had no choice but to make a clean breast of things: The hard evidence of their errors was right there on every subscriber’s kitchen table. But in the online era, journalists can revise their stories without being specific about what details they have changed and why.

As I do not have the technical skill to evaluate how much the authors of the original NSA stories revised their articles in light of evidence subsequently uncovered by other journalists, I am grateful to Ed Bott for saving Gellman’s original and revised NSA stories and posting this file with both articles compared side by side.

Bott’s work allows everyone to make their own judgment about whether the Post’s revisions reflect merely the correction of small errors or are a symptom of seriously sloppy journalism the first time around. (If it hasn’t happened yet, I hope someone does the same with the Greenwald/Guardian stories.) But should the provision of this valuable service be left to energetic volunteers outside a newspaper’s operations? I believe it would be better practice for all newspapers to include a link to the original versions of subsequently edited breaking stories (e.g., “Click here to see how we have changed our reporting since this story broke”). It might keep Internet-era journalists more honest and careful to know that their breaking stories will be as available to future readers as were those of journalists in the dead tree publishing era.

Comments

  1. says

    I just skimmed through the Ed Bott-provided comparison of Gellman’s article, and I saw mostly the addition and fleshing out of new details, the addition of feedback from some of the players, and some rephrasing to distinguish what’s known from what’s likely, e.g., from “The technology companies, which participate knowingly in PRISM operations” to “The technology companies, whose cooperation is essential to PRISM operations”.

    I didn’t spot any back-pedaling by the reporter, but did spot denials by the private companies involved.

    Of course, along with those denials was this addition to the article: “Government officials and the document itself made clear that the NSA regarded the identities of its private partners as PRISM’s most sensitive secret, fearing that the companies would withdraw from the program if exposed. “98 percent of PRISM production is based on Yahoo, Google and Microsoft; we need to make sure we don’t harm these sources,” the briefing’s author wrote in his speaker’s notes.”

  2. Brett Bellmore says

    In the era of dead tree publishing, journalists frequently made ‘errors’ (systematic bias suggests they weren’t always mistakes) and didn’t walk them back at all, because for a wide range of errors all they needed was for the paper they were working for to refuse to publish any of the letters complaining of the error, and they were untouchable. The ironic notion that journalists are usually pretty accurate except on matters you have personal knowledge of didn’t originate with the internet, after all.

    Things have vastly improved for journalism since the internet came around. For journalists? Not so much. Accountability is a pain, after all.

    On your suggestion that newspapers provide links to the original version of their stories when they change them: great idea, and they’re not going to do it, for much the same reason they don’t link to the full, unredacted content of remarks they excerpt (or, increasingly, just paraphrase). Half the power of journalism comes from the ability to successfully mislead people (mislead, because the actual facts won’t always point the way you want people to go), and you don’t retain that if you give people access to source materials, or point out reasons you should generally be doubted.

  3. Mike says

    Somehow, seeing a journalist’s working notes doesn’t seem to have been an issue when things were print only, even for journalists themselves, except in the cases when the police have tried to obtain them under some pretext. Why single out the internet? I’m sure the NY Times or WaPo will be more than willing to cough those up for us.

    I suppose the issue then becomes one of execution: “But having a deadline concentrates one’s thoughts on getting things right the first time out the door…” Well, maybe. Generally, I don’t think it makes a lot of difference. By the time the last edits have been made, the web will have archived at least one close-to-final version.

    I suppose there’s some concern that someone will print one thing, start a controversy over the coverage, sub in some changes later, and then claim the original never existed? All I can say is: don’t try that on the internet. If there’s a substantive change, you’ll get busted. If it’s just nuance, no one really cares. There’s not a lot of gray in between; I’d acknowledge there’s a little, but probably less than is imagined here.

    As a longtime occasional journalist and editor, with over a decade of work in internet journalism, I think the sentiment behind this idea is noble, but somehow you’re expecting more of the web than you do of print. Then there’s the issue of the internet’s immediacy. Of course things sometimes won’t be quite right the first shot out the door, but deadline pressure works differently on the web than it does in print. Striking while the iron is hot is what counts, and then the medium permits rapid correction. This may seem scattershot, but a print edition has a very nice fixed window in which to compile and resolve things, which often look neater in the subsequent printed correction than they really were.

    Then there’s this specific case. I’m not quite sure what the erroneous prompt for this was. But it’s a big story with lots of loose ends that can’t be easily tracked down or confirmed by two or more other sources, because large parts of it are intentionally hidden. I just don’t see how whatever they supposedly got wrong takes away from the general development of the story and what its implications are.

    And that’s my last point. Look at how much we can all be pretty sure they did get right. Then let’s look at the much more richly endowed journalistic institutions with tremendous access and resources, who basically either missed this sort of thing or knew about it and helped keep it under the radar of public notice in what’s supposed to be a democracy protected by a free press… yeah, right… Thank god for Glenn Greenwald and all those other folks who tenaciously dig this stuff out and expose it to the light.

    Then there’s the fact that “David Brooks, Tom Friedman, Bill Keller Wish Snowden Had Just Followed Orders”:
    http://www.commondreams.org/view/2013/06/17

    I’d worry a lot less about the inevitable small bits the internet gets wrong at first blush and a lot more about why so much of our traditional “free press” is sitting on its hands and/or missing so much in general about stories like this before they break. Who knows, they might sell more papers, which is OK with me so long as they make the corrections.

  4. James Wimberley says

    In defence of Gellman and Greenwald, we should remember they are trying to inform the public about stuff which (a) is technically very complicated indeed (b) the government is trying hard to keep secret. The corrections coming out may be closer to the truth, but they are still damage control.

    I agree we need a better practice of corrections in the MSM. We RBC bloggers aren’t professional journalists, but I think we all abide, without a formal code of good practice, by the principles of res scripta manent – you never try to erase the past – and making corrections in the post on matters of fact, clearly marked as updates. These good habits may come from the different culture of academia: there’s no shame in starting out wrong, as long as you accept the other gal’s better information when it comes.

  5. Sebastian H says

    I’m not a Greenwald fan, but it is far too early to complain about him getting this story wrong. We are still in the government-denial phase of the story, with one leak from Snowden and an avalanche of cryptic and weasel-worded leaks from the government and private companies.

    And if he got things wrong, the largest reason is obvious: the government has been opaque on the issue. My prediction (and I hope I’m wrong) is that he may have technical details wrong, but it really is true that the NSA has been trawling the Internet and phone data with either no warrant or some ridiculously open warrant. The key semi-denials that clue me in are the retreats to no “direct control of our servers” (the companies always say direct) and the change of topic to individual warrant requests whenever they’re asked about casting a net for all phone metadata. (It’s always the “we don’t give them all the metadata” line, never the “they don’t get it” line. This last is especially suspicious given the not-so-ancient history of the government just tapping AT&T trunks before the switch so that the company didn’t have to give them anything.)

  6. Tony C. says

    Greenwald has arguably been the most important and scrupulous American journalist working over the past several years. If anyone would care to place a serious bet on the answer to the question “Did he get something seriously wrong?” on this story, I’ll be happy to take the “No” side.

    • Katja says

      I would disagree with your assessment of Glenn Greenwald. As others have remarked before, he still writes more like a litigator than a journalist, and his analytical skills do not impress me. Repeated fiery denunciations of evils can be admirable on an ethical basis (insofar as they target actual evils), but there’s more to good op-ed writing than that. And his reporting tends to consist more of raw infodumps than deep background.

      If you want examples of really good journalism, I recommend Gene Weingarten’s second Pulitzer-Prize winning piece or this story about scholarly malfeasance.

  7. koreyel says

    Greenwald and Gellman have been sort-of backpedaling from some of their original claims, although neither to my knowledge has given a full and clear accounting of what they got wrong and why.

    Yes, you would expect that professional journalists would do that sort of self-analysis, wouldn’t you? It does seem strange that Greenwald, for instance, demands transparency and honesty from institutions but fails to double back and apply the same standards to himself. But then again, maybe his personality gets in the way of that. Greenwald has always struck me as a self-important, opportunist gadfly, nobly defending the rights of the downtrodden, like Anwar al-Awlaki, from any sort of hostile infringement. I suppose there is no snake in the universe that doesn’t have a herpetologist somewhere to sing its praises and ask that it not be stepped on. Why, I hear tell that even Stalin had his share of Greenwalds back in the day, batting their eyelashes and singing apologias for the man of steel…

    • says

      Both you and Keith seem to be proceeding on certain assumptions that are obviously not accepted by many here, including me. I have followed this controversy reasonably closely and I have also read the linked article. I see none of the backpedaling and corrections that you and Keith seem to take as givens.

      If you know of such an error that I’ve overlooked, please feel free to correct me, but I can’t see that either Keith or Ed Bott has identified even one significant error or correction. All that I can see is unsupported personal attacks on Glenn Greenwald, especially from you. Evidently it’s more accurate to say that being an RBC blogger means never having to say you’re sorry.

      • koreyel says

        Mitch…

        Do a search on “what Greenwald got wrong”. You will find plenty of threads to joust on in his favor.
        I am not interested in such tilting. My post was more about my reading of Greenwald’s humorless personality and sense of self-importance.

        I see Greenwald in terms of a bigger picture:

        That is, as someone who absolutely insists we apply the rule of constitutional law to the person of Anwar al-Awlaki, no matter where Anwar is, no matter what Anwar has done, because, by George, Anwar is an American citizen, and we all have a vested interest in protecting the oaths we have made to that sacred “piece of paper”.

        Then…

        Greenwald gets stolen goods from someone who has broken every vow with that same representative government, a government that has paid the crook exceptionally well, and what does Greenwald do? He ignores the laws broken, the oaths turned aside, and publishes the stolen goods for profit.

        I would say it is funny how Greenwald gets to decide which vows and oaths and pieces of paper are sacred.
        And which are violable…
        But I’d better not: because it is only that pompous, humorless ass known as Greenwald who gets to speak “truth to power”.

  8. Katja says

    This question is really at least three different questions, I think. One about the general habits of online journalism, one about the reporting done by Glenn Greenwald and Barton Gellman in particular, and an implied question about the accuracy of the recent claims. Let’s tackle them one by one.

    First, one of the not-so-great aspects of modern online journalism is what one could tactfully call the “iterative deployment of articles”. It’s nothing new for anybody with an RSS reader capable of showing differences between versions of an article. Articles get written quickly to meet deadlines (time literally is money here, because newness translates into page views, which translate into click-through rates) with often only cursory editing. They then get both edited for clarification and language and simultaneously updated with more recent information. You may not like it (I’m not particularly crazy about it, if only because I don’t like how it prioritizes newness over quality), but that’s the way online journalism works at the moment, for better or worse, and it can really only be fixed by using a monetization model other than ads.

    Second, in this context, I’m not seeing the smoking gun in Barton Gellman’s article. Aside from newly written content, almost all of the editing is copy editing. E.g., replacing “members of Congress” with “lawmakers”. I really can’t get worked up over that part. The one problematic edit is the replacement of “to track a person’s movements and contacts over time” with “to track foreigners”. Obviously, going from “person” to “foreigner” is a substantive change; at the same time, as I’ll discuss below, I’m not sure if the original claim is inaccurate.

    As to Glenn Greenwald, I have to admit that I normally avoid reading him. His writing too often reads like the transcript of a cross-examination rather than reporting or an op-ed. However, I have reason to believe (as I’ll explain below) that the claims in his original article were substantially accurate, modulo Edward Snowden’s credibility. However, I also think a lot of non-computer people (including Glenn Greenwald himself) may have inadvertently gotten the wrong idea because of their lack of familiarity with what’s involved in running Google-sized data centers.

    Third, let’s go to the meat of the claims themselves. Before I talk about my thoughts about PRISM, let’s discuss “Boundless Informant” and the FISA court orders first.

    I have, from the beginning, been mostly interested in two things: The scale of the surveillance involved, and the procedures in place, especially safeguards (or lack thereof). “Boundless Informant” and the court orders tell us a lot about these.

    The Boundless Informant image lists about 97 billion DNI records (DNI = Data Network Intelligence). 97 billion is a lot. We can tell, because we can compare that directly with another intelligence agency with an agenda similar to the NSA’s, the German BND. We know about the BND’s dragnet collection because the parliamentary commission supervising them is required by law to publish aggregate statistics on an annual basis. The report that the commission published this March covers surveillance operations that happened in 2011. In total, the BND intercepted a bit less than 3 million communications that year, out of which a few hundred were retained and the rest deleted. Even taking into account that Germany is a smaller country than the US (though Germany is still a fairly big internet nexus in the center of Europe), this is still a huge difference. Now consider that the 97 billion DNI records were collected over a 30 day period in March 2013, allowing us to extrapolate to an annual estimate of 1.1-1.2 trillion, give or take. We’re talking about 5-6 orders of magnitude of difference at this point (and, incidentally, some 30 billion annual records on persons resident in the US, or about 100 per US citizen).
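    (For the curious, the arithmetic above can be checked in a few lines of Python. The record counts are the ones quoted in this comment; the US population figure is my own rough 2013 estimate, not something from the documents.)

```python
# Back-of-the-envelope check of the Boundless Informant extrapolation.
# Figures are those quoted in the comment above; US_POPULATION is a
# rough 2013 estimate introduced here for illustration.

DNI_RECORDS_30_DAYS = 97e9   # DNI records in the 30-day March 2013 window
US_ANNUAL_RECORDS = 30e9     # estimated annual records on US residents
US_POPULATION = 316e6        # approximate 2013 US population (assumption)
BND_ANNUAL = 3e6             # BND interceptions reported for 2011

# Annualize the 30-day total.
annual_total = DNI_RECORDS_30_DAYS * 365 / 30
print(f"Annualized total: {annual_total / 1e12:.2f} trillion records")

# Records per US resident per year.
per_capita = US_ANNUAL_RECORDS / US_POPULATION
print(f"Records per US resident per year: {per_capita:.0f}")

# Ratio to the BND's reported dragnet volume.
print(f"NSA/BND ratio: {annual_total / BND_ANNUAL:.0e}")
```

    The annualized total lands at roughly 1.18 trillion, squarely in the 1.1–1.2 trillion range, and the NSA/BND ratio comes out around 10^5–10^6, consistent with the “5–6 orders of magnitude” claim.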

    In short, it appears as though some pretty massive surveillance is taking place.

    Next, the FISA court orders. Luckily, there’s no need for much guessing with respect to the Verizon Business case, since US authorities have admitted that they’ve been collecting connection records from phone companies on a large scale for years under a rather creative interpretation of Section 215 of the Patriot Act. More importantly, we recently learned that Yahoo was also the target of a dragnet order (the text of the court decision does not spell that out, but the context, such as referring to minimization procedures as a safeguard, leaves little doubt). The order also uses the familiar loophole that a “significant” (rather than primary) purpose of the intelligence collection activity would be obtaining foreign intelligence information.

    At this point, we already have strong evidence that the NSA collects information fairly indiscriminately from both phone and email providers. It is at this point also virtually certain that data collection activities are not limited to foreigners, but that such a limitation only eventually occurs through the NSA’s minimization procedures (insofar as these procedures are effective).

    Finally, let’s talk about PRISM. Many non-techies seem to have assumed that the “direct information from servers” claim implied that the NSA had secret backdoors into the companies’ servers. That idea was always absurd for server farms of the size that Google, Microsoft, or Yahoo are running. Letting a three-letter agency just run their own software on your servers would be a prescription for chaos. It would at the very least affect system stability and possibly require re-architecting on a large scale, not even considering the legal aspects. Conversely, any NSA activity would eventually be noticed by the system administrators, many of whom are not American, do not live in America, and, even if they do, may not be eligible for the necessary clearance. It would be in neither the companies’ interest nor the NSA’s to have such a setup. The obvious solution that satisfies both parties is to transmit data through a well-defined, isolated interface.

    Now, once that misconception was corrected, another apparent misconception seems to have replaced it, namely that we’re talking simply about a secure electronic dropbox for information obtained through warrants. Two pieces of information contradict this idea. One is the Yahoo court order; the other is PRISM’s price tag, with an annual operating cost of $20 million. The Yahoo court order indicates that the NSA is collecting a whole lot more, and the $20 million annual expenses indicate that there’s indeed a pretty serious IT operation going on. Assuming that the cost is not due to using gold-plated servers and cables, the two obvious options are that (1) there is a whole lot of data flowing across these links, requiring fairly massive pipes, or (2) the interface between the NSA and the companies is a lot more sophisticated than just a dumb dropbox (e.g., allowing the NSA to amend search terms or target designations dynamically). Neither option is particularly comforting.

    So I don’t think that either Glenn Greenwald or Barton Gellman has a whole lot to walk back, if anything.

    • Cranky Observer says

      One thing to consider is whether Facebook [1], at least, has had TLA (three-letter agency) involvement in its design and operation from very early on and does actually have direct TLA access to the database. Indeed, whether the partners behind Facebook’s first few rounds of venture capital were TLA-funded. The potential for a 500-million-node user-built social network graph must have been apparent early on; investing a few tens of millions in a dozen candidates to see which one became the winner might have seemed like a good investment with the black budget to play with.

      Cranky

      [1] Facebook being notoriously proud of its inconsistent and undependable behavior, the argument of surveillance damaging stability wouldn’t apply.

  9. Warren Terra says

    Honestly, given the conflicting stories I don’t know what to make of any of this.

    If the maximalist version of the stories from Greenwald, Snowden, CNET, etcetera is true, we’ve got an unmitigated disaster of privacy violation here: the government traipsing through all the major Cloud providers’ servers, a minor NSA subcontractor empowered to spy on anyone, warrantless wiretaps, etcetera.

    On the other hand, the technical rationale for the minimalist version of the story is pretty clear: there are times when – with a warrant – you want to do certain searches, to figure out who a person has been calling, etcetera. Those warrants can be justified (although Gawd knows the FISA court doesn’t bother to check), and the warrants are meaningless if there isn’t actually a database of phone logs to be consulted. So constructing various databases of this sort, incorporating massive amounts of data for which you’ve got no probable cause, and then only actually searching the databases as you are instructed to do so by a court order, could make a great deal of sense. The existence of such databases could be necessary for the putatively justified and civil-liberties-consistent searches to be effected. And at least one version of the government’s story is that this is about all they’ve done.

    The problem is, I don’t foresee any way we’re going to find out how much of the truth is on which side. I wouldn’t trust Greenwald as far as I could throw him, no one would trust the NSA to describe their activities, and the bodies that are supposed to provide oversight are the same folks who’ve spent the last two weeks making dumb pronouncements and in any case passed the enabling legislation.

    There is room for executive action here. The very least the Administration could do would be to lift the gagging orders on the Cloud purveyors, at least involving descriptive statistics (i.e. not the names of targets).

    • Katja says

      Warren: So constructing various databases of this sort, incorporating massive amounts of data for which you’ve got no probable cause, and then only actually searching the databases as you are instructed to do so by a court order, could make a great deal of sense.

      Indeed, there is a good case that could be made for it. But such a database would require proper safeguards or oversight. It would be something like the EU Data Retention Directive, which — controversial as it is — puts a minimum and a maximum duration on data retention, leaves the data with the providers, and still requires the government to obtain a warrant through the regular courts when needed (modulo national laws). Not the cloak and dagger stuff that the NSA is doing, not creating a state within a state.

      Even so, one can argue that the mere existence of such a database creates a chilling effect, especially in conjunction with prosecutorial discretion.

      Warren: The problem is, I don’t foresee any way we’re going to find out how much of the truth is on what side.

      Agreed. Which is why I’ve mostly been looking at what the documents can tell us. As far as I know, nobody has denied their basic accuracy, just how they have been interpreted.

      I also think, regardless what one thinks of the rest, the NSA (and possibly other three-letter agencies) needs a leash. One thing that has become clear is that due to ineffectual supervision by the two other branches of government, they have been able to bend various laws into a pretzel to get to their desired level of surveillance.

    • says

      I agree that we need to have some capacity for doing this kind of surveillance. In the crime-fighting world, electronic surveillance is very tightly regulated by the provisions of Title III and I don’t know of any serious infringements of civil liberties by misuse of electronic surveillance that have taken place since its passage. There are many very serious, genuine protections for privacy and civil liberties that accompany every Title III order.

      By contrast, the mere existence of this kind of permanent database is an invitation to a police state. Certainly, an ever expanding collection of searchable highly personal information such as CCTV, credit card charges, travel information such as hotel and air reservations, telephone conversations, emails, and so forth is an open invitation to wholesale blackmail and intimidation by political actors or by our ever-growing deep state. This is very dangerous stuff and the likelihood that everyone with access to this database will always be an angel is, frankly, depressingly slight.

      There is also something very peculiar about how these information systems are being set up and used. In the two cases that we know about (Snowden and Bradley Manning), very low-level people inside the system have somehow been able to access very sensitive information that one would assume would be unavailable to them because of need-to-know restrictions. For example, both men had extensive access to sensitive diplomatic secrets which would appear to have nothing to do with their jobs.

      So you really have to wonder about the assurances that we’re being given by the Obama administration about what’s being collected, how it’s being used and, most especially, about the effectiveness of the supposed self-imposed policy-based limitations on accessing very sensitive personal information. It wouldn’t surprise me if it turned out that NSA people were amusing themselves by spying on celebrities or their neighbors or exploring questions of marital fidelity unrelated to the nation’s security. Don’t forget about how they entertained themselves eavesdropping on very intimate telephone conversations between soldiers in Iraq and their families, girlfriends and, if recent revelations about the sexual habits of the general staff are any guide, a wide assortment of outside women.

      • Cranky Observer says

        = = = There is also something very peculiar about how these information systems are being set up and used. In the two cases that we know about (Snowden and Bradley Manning), very low-level people inside the system have somehow been able to access very sensitive information that one would assume would be unavailable to them because of need-to-know restrictions. For example, both men had extensive access to sensitive diplomatic secrets which would appear to have nothing to do with their jobs. = = =

        The TLAs are no more immune to the laws of staffing physics or the limitations of systems development than anyone else. Back in the 1980s the NSA in particular was rumored to have computer systems that were tightly designed and controlled to be secure from their operators (an incredibly hard goal to achieve [1]). Since then, as you may have noticed, there has been a 10,000,000x increase in the demand for complex systems by every area of every society on Earth. If you’ve tried to staff a complex systems project, or even worse a complex systems operations unit, in the last 10 years you know the reality: about 5% of systems people are very good, 10% are good, 30% are OK, and 50% are below average. No matter how much money you dangle or what challenges you offer you simply can’t staff a project with 100, much less 10,000, systems superstars – there aren’t enough of them and only a few want to work for you. And unlike the 1980s the NSA no longer offers the most exciting computer/systems work in the world. If you were a superstar would you rather work for Google or Facebook where you get good money, stock options, and can talk to your friends or spouse about what you do, or the NSA where you don’t, you don’t, and you can’t? In 1982 I knew several very good people who chose the TLA route. Today? Not sure they’re very competitive in the job market.

        And that’s for design and build jobs. Operations? Very, very few people like to do precision operations over a long period of time. And again, if you do like operations and are very good at it, Google or NSA? Stock options or lifetime silence?

        Cranky

        [1] Lotus Notes is the only commercial system I am aware of that could, in theory, be set up and operated so as to be secure from its administrators. Doing so raised the complexity of administration, the possibility of irrecoverable data loss, and the annoyance factor of using the system so high that I never saw one so configured in the commercial world.

        • Mitch Guthman says

          I don’t think it’s even a question of these systems being insecure against high-level sysadmins or even super-users. For example, both Manning and Snowden appear to have been very low-level analysts with corresponding levels of access, limited to whatever kind of counterinsurgency or counter-terrorism operations they were supporting. Instead, both databases (if, indeed, there isn’t just one giant pool of data from which all of these differently named programs draw) allowed low-level users to access things like mid- to high-level State Department diplomatic traffic, which was apparently captured, decoded, and stored on NSA and military servers. That seems crazy.

          Similarly, it now looks like Snowden was able to access extremely sensitive American diplomatic traffic as well as raw intercepts of what one would expect to be top-level stuff, namely, information about the G-8 intercepts in London and even possibly the intercepted traffic itself, which one would presume to be graded far above his top secret clearance. Again, how is it possible for low level analysts like Manning and Snowden to access this kind of information? I think it’s clear that whatever “policy” based limitations might exist in principle, the system is clearly incapable of enforcing those limits, if indeed they are real limits as opposed to limits that exist only as talking points for Congressional testimony.

          • Katja says

            The reports say that Edward Snowden was last working as a systems administrator at the NSA’s Kunia Regional SIGINT Operations Center in Hawaii. If that is accurate, it is utterly unsurprising that he had access to all the materials he had.

  10. CharleyCarp says

    Count me in with the group who finds complaints about Greenwald, or speculations about Snowden’s personality, to be not only irrelevant, but misdirective. And as has been observed above, a set of guidelines and restrictions that couldn’t prevent Manning or Snowden from doing what they did are the functional equivalent of no guidelines at all. Here’s an analogy: suppose when your city tows cars, they leave them in an unlocked unsupervised lot in a bad part of town. You point out that there is some risk of theft, and they point you to the statute book, which very clearly indicates that theft is illegal. Oh, ok, then, I guess that’s fine.

    We know that at the UBL level this sort of surveillance program was assumed, and so nothing of significance was found at that level from doing this. They don’t seem to have been able to get the loser loners who set off rice-cooker bombs in Boston. So what can they point to to justify the intrusion, the risk that hackers can steal our data, and the expense of running the program through contractors with cleared staff? Assurances that if they could only trust us with the truth, we’d think these programs are a really good deal.

    (While I’m on the soap box, a short word about Clapper. If he thought that using his own definition of “collect” might be misleading, he could easily have included in his answer that the intelligence community uses a highly limited definition of the word collect. If he didn’t think using his own definition was misleading in this context, he’s a moron, too stupid to be trusted with the responsibilities of participating in democratic government. He’s not stupid, but instead of being clear, he lied, and is now caught parsing about what the meaning of is, is. Except this isn’t about some bullshit effort to catch him out on something that has nothing to do with his job; it was a direct question from the other branch of government overseeing his compliance with the laws that govern his core functions. The man has to go.)

    • Anonymous says

      Count me in with the group who finds complaints about Greenwald, or speculations about Snowden’s personality, to be not only irrelevant, but misdirective.

      Yep. The agenda of people who want to change the subject to Greenwald and journalistic errors is actually itself a much more interesting topic than whether Greenwald actually made mistakes in reporting the story. Orwell’s “Squealer” in “Animal Farm” is a dead-on caricature of this sort of thing.

      • Anonymous says

        Right on, especially when the sources are right-wing rags like Mother Jones and the Nation.