For our course about the impacts of the internet, we developed an exercise to get our students thinking critically about the credibility of things they find on the web. As a number of colleagues have expressed interest in this, I thought I would post it here. Feel free to use and adapt it!

Near the beginning of the course, we set the students to read the chapter “Crap Detection 101: How to Find What You Need to Know, and How to Decide If It’s True” from Rheingold’s book Net Smart. During the tutorial, we get them working in small groups, and give them several carefully selected web pages to test their skills on. We pick web pages that we think are neither too easy nor too hard, and use a mix of credible and misleading ones. It’s a real eye-opener for our students.

To guide them in the activity, we give them the following list of tips (originally distilled from the book by our TA, Matt King, who wrote the first draft of the worksheet).

Tactics for Detecting Crap on the Internet

Here’s a checklist of tactics to help you judge the credibility of web pages. Different tactics will be useful for different web pages – use your judgment to decide which tactics to try first. If you find some of these don’t apply, or don’t seem to give you useful information, think about why that is. Make notes about the credibility of each web page you explore, and which tactics you used to determine its credibility.

  1. Authorship
    • Is the author of a given page named? Who is s/he?
    • What do others say about the author?
  2. Sources cited
    • Does the article include links (or at least references) to sources?
    • What do these sources tell us about credibility and/or bias?
  3. Ownership of the website
    • Can you find out who owns the site? (e.g. by looking it up in a whois database)
    • What is the domain name? Does the “pedigree” of a site convince us of its trustworthiness?
    • Who funds the owner’s activities?
  4. Connectedness
    • How much traffic does this site get? (e.g. check a traffic-statistics service such as Alexa)
    • Do the demographics reported there tell you anything about the website’s audience?
    • Do other websites link to this page? (e.g. google with the search term “link: http://*paste URL here*”) If so, who are the linkers?
    • Is the page ranked highly when searched for from at least two search engines?
  5. Design & Interactivity
    • Do the website’s design and other structural features (such as grammar) tell us anything about its credibility?
    • Does the page have an active comment section? If so, does the author respond to comments?
  6. Triangulation
    • Can you verify the content of a page by “triangulating” its claims with at least two or three other reliable sources?
    • Do fact-checking sites have anything useful on this topic?
    • Are there topic-specific sites that do fact-checking? (e.g. sites devoted to urban legends or to climate science) Note: How can you tell whether these sites are credible?
  7. Check your own biases
    • Overall, what’s your personal stake in the credibility of this page’s content?
    • How much time do you think you should allocate to verifying its reliability?
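For item 3, the “domain pedigree” question can be made concrete with a few lines of Python. This is only an illustrative sketch of my own: the `domain_pedigree` function and the `TLD_HINTS` table are not part of the worksheet, the hint labels are rough, and (as the checklist itself warns) a reputable-looking domain is no guarantee of credibility.

```python
from urllib.parse import urlparse

# Rough hints about who can register under a top-level domain.
# Purely illustrative labels -- a .org or .gov address is a starting
# point for judgment, not a verdict on credibility.
TLD_HINTS = {
    "edu": "US-accredited educational institution",
    "gov": "US government body",
    "org": "often non-profit, but registration is unrestricted",
    "com": "commercial, unrestricted",
}

def domain_pedigree(url):
    """Extract the host from a URL and look up a hint for its TLD."""
    host = urlparse(url).hostname or ""
    tld = host.rsplit(".", 1)[-1] if "." in host else ""
    return host, TLD_HINTS.get(tld, "no particular hint")

print(domain_pedigree("https://climate.nasa.gov/evidence/"))
# ('climate.nasa.gov', 'US government body')
```

Students could extend the table, or pair this with a whois lookup to see who actually registered the domain.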
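The triangulation idea under item 6 can likewise be sketched in code: a claim only counts as triangulated when it is backed by sources hosted on distinct domains, since several links into the same site are not independent confirmation. The function name and the threshold of two are my own illustrative assumptions, not part of the worksheet.

```python
from urllib.parse import urlparse

def triangulated(source_urls, minimum=2):
    """Treat a claim as triangulated when its supporting sources
    come from at least `minimum` distinct host domains."""
    domains = {urlparse(u).hostname for u in source_urls}
    domains.discard(None)  # ignore malformed URLs with no host
    return len(domains) >= minimum

# Two independent domains back the claim:
print(triangulated([
    "https://www.snopes.com/fact-check/example",
    "https://www.reuters.com/article/example",
]))  # True

# Two pages on the same site are not independent confirmation:
print(triangulated([
    "https://blog.example.com/a",
    "https://blog.example.com/b",
]))  # False
```

A real exercise would of course also ask whether the distinct sources are themselves reliable, which is where the rest of the checklist comes in.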

(Download the full worksheet)


  1. A very good worksheet! Links have to be copy-pasted, though (clicking just tacks http://www... onto the end of your website’s URL).

  2. @Marko: Sorry about that – I suck at cut and paste. Should work now!

  3. Useful list. As a non-native speaker, I must naturally object to using grammar skills as a sign of credibility, however. 🙂

  4. @Victor: The checklist doesn’t say poor grammar correlates with low credibility; it merely prompts the students to think about the relationship. The amount of effort that’s been put into copy-editing is a useful indicator, at least of the seriousness of the website’s owners, even if that doesn’t always correlate with credibility. People who don’t care about truth and accuracy tend not to put that kind of effort into careful writing, which is quite different from the occasional slip that non-native speakers make.

  5. Using Alexa without caution is dodgy.

  6. Nice, though I find the criteria under 4) Connectedness and 5) Design & Interactivity a little odd, as they seem to confuse popularity (4) and slickness (5) with quality. A pseudo-scientific blog such as WUWT would rank highly on the criteria under 4, for example.

    In addition to the other criteria, which make a lot of sense, I think there could be more, especially in terms of spotting logical fallacies. My example of a bullshit detector for scientific topics (with examples from climate science, but most of the heuristics are generally applicable I think) is here:

    And from something I found on the internet related to medical science:

  7. Stumbled onto a comment at Real Climate that may apply:

    “In the book: “Revolutionary Wealth” by Alvin & Heidi Toffler 2006 Chapter 19, FILTERING TRUTH, page 123 lists six commonly used filters people use to find the “truth”. They are:
    1. Consensus
    2. Consistency
    3. Authority
    4. Mystical revelation or religion
    5. Durability
    6. Science

    As the Tofflers say: “Science is different from all the other truth-test criteria. It is the only one that itself depends on rigorous testing.” They go on to say: “In the time of Galileo . . . the most effective method of discovery was itself discovered.” [Namely Science.] The Tofflers also say that: “The invention of scientific method was the gift to humanity of a new truth filter or test, a powerful meta-tool for probing the unknown and—it turned out—for spurring technological change and economic progress.” All of the difference in the way we live now compared to the way people lived and died 500 years ago is due to Science.”


  8. One suggestion (you mention using Alexa; perhaps this covers it): use the method Perlman suggested in _The_Long_Con_ and consider the advertising that appears on the site, asking what kind of people these advertisers are expecting to reach by advertising there.

    In particular, if the ads come from sources looking for suckers, for gullible people who will readily believe outrageous claims about health, or conspiracy, or the economy, then you’ve learned something.

  9. The guidelines are pretty good. However, sometimes it helps to have an oversize, well-marinated brain. For example, in late 2002 and early 2003 almost all USA media insisted that Iraq was developing weapons of mass destruction. I could write a book about how this lie was created, polished, disseminated and bandied around by all the cognoscenti, including my former hero, Colin Powell. And I could write about other examples, some of which I do write about, and some of which I can’t because I’m likely to turn up literally dead in a trash can. The worksheet is fine, but in the end one has to be educated, and learn to smell.

  10. Fernando, so you agree, but finish by suggesting we trust our gut (which #7 above warns about) above all? Weird.

    Your example is a particularly good one of where this list does not work, because all the information, good or bad, came from a very short list of secretive intelligence agencies. The situation is quite different with climate change science, however.
