Crowdsourcing, in journalism, is the use of a large group of readers to report a news story. It differs from traditional reporting in that the information is gathered not manually, by a reporter or team of reporters, but through an automated agent, such as a website.
At its heart, modern crowdsourcing is the descendant of hooking an answering machine to a telephone “tip line,” where a news organization asks readers to phone in suggestions for stories, or of asking readers to send in photos of events in their community.
True crowdsourcing involves online applications that enable the collection, analysis and publication of reader-contributed incident reports, in real time.
Mobile phones and the widespread adoption of the Internet in homes and offices everywhere are taking this crowdsourcing practice to a new level.
Then, we’ll be doing some crowdsourcing of our own and learning how to display the data we collect.
Some great examples of this in practice:
PriceofWeed.com: Allows users to share the cost, quality and availability of marijuana in their area. It also tracks local attitudes and law enforcement practices.
Project Cassowary: Tracks sightings of the rare, endangered cassowary bird – mobile apps available.
Project Noah: An app that tracks local wildlife.
Q. How can I be sure this information I source is legit?
A. You can’t.
In a true crowdsourced project, information is not verified manually by a reporter between submission and publication.
A well-designed crowdsourcing project, like a well-edited newsroom, can discourage bogus submissions and minimize their influence if they slip through. Requesting that readers submit personal identification along with the report (email verification, name, city) helps.
Asking readers to identify themselves sends the message that you take this project seriously and that you wish them to do the same. An obviously bogus ID lets you flag the record for deletion with ease.
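As a minimal sketch of the flagging idea above: the snippet below marks incoming reader submissions whose contact details look bogus (missing name, malformed email) so an editor can review or delete them. The field names and the email pattern are illustrative assumptions, not part of any particular crowdsourcing tool.

```python
import re

# Hypothetical submission format: a dict with "name", "email", "report".
# A loose email pattern is enough to catch obviously bogus entries.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def flag_bogus(submissions):
    """Add a 'flagged' field to each submission when the email is
    missing or malformed, or the name field is empty."""
    for sub in submissions:
        email_ok = bool(EMAIL_RE.match(sub.get("email", "")))
        name_ok = bool(sub.get("name", "").strip())
        sub["flagged"] = not (email_ok and name_ok)
    return submissions

reports = [
    {"name": "Jane Doe", "email": "jane@example.com",
     "report": "Pothole on Main St."},
    {"name": "", "email": "asdf", "report": "First!!!"},
]
flag_bogus(reports)
# The second record is flagged for an editor to review.
```

Nothing is deleted automatically; the flag simply routes questionable records to a human, which keeps the project true to the "collect first, review visibly" spirit of crowdsourcing.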
You can also tailor your pool to include a specific, relevant crowd (e.g., asking university students about campus issues doesn’t require a submission form that’s open to the entire web).
Note that crowdsourcing is NOT polling. Drawing broad conclusions about community behavior from your crowdsourced incident reports is a mistake – always let the audience know how you gathered your data. Crowdsourced material is often more effective for QUALITATIVE data than QUANTITATIVE.
Q. Do I need to be able to build websites and graphics for this?
Crowdsourcing Tools (data collection):
Data display tools:
Many Eyes (not infographics, but next best thing)