Weighted Crowdsourcing. What the?


A network I covered last week, called Big Think, turned out to be a video network of sorts, for smart people. Going through the site, I realized that it was full of content and commentary from pundits, experts, and analysts from various walks of life. The point of bringing in all these experts was to get the ball rolling on topics that matter to a large number of people. I mentioned that Big Think’s use of experts boils down to a weighted crowdsourcing model for user-generated content, and a few people emailed me asking for elaboration, so here it is.

Now, I see a lot of web sites on a daily basis, and one thing that often stands out to me is the subtle (or not so subtle) ways in which startups get users to start interacting with their sites. Some run contests; others hope to connect existing groups based on their offline, physical manifestations. But what Big Think did was bring in a bunch of content from industry experts and then ask its users to respond.

If you read my Mashable post on Big Think, then you know the network reminded me of those news junkies who scream at the television while watching these pundits, as if they could hear them through the tube. In fact, I was reminded of my Aunt Norma. She watched the news (and the lottery) religiously, every evening. She was the one mumbling under her breath in response to an asinine comment from some pundit, while furiously working on her needlepoint.

Given the serious tone of Big Think, suffice it to say that this mature, NPR-prone crowd comprises exactly the individuals who will go for Big Think’s setup. I can see my mother watching a clip of a UCLA professor speaking about global warming and feeling compelled to leave her own opinion on the matter in the comments thread.

As we’ve seen with Newsvine and Gather, the members of this more mature crowd will have no problem presenting their own questions to the community. What I found with Big Think’s layout, however, is that the most readily accessed content is that of the experts featured throughout the site. This brings us back to the topic of weighted crowdsourcing: with all the rich content being shared on Big Think by way of user contributions, it’s still the content from the experts that I’ll find at the top of most pages.

When gathering data from a large user base, I think it’s always important to include some editorial content as a validated resource, something product review sites do particularly well. But when that editorial content is combined with a community-driven, self-regulated network such as Big Think, the wisdom of the crowd ends up weighted by the prominence given to the editorial content.
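To make that “weighting” a little more concrete, here’s a minimal sketch of the idea, assuming a simple votes-times-boost ranking. The EXPERT_BOOST multiplier and the sample numbers are made up for illustration; this has nothing to do with how Big Think actually orders its pages.

```python
# Hypothetical sketch of weighted crowdsourcing: rank a mixed feed of
# expert and user contributions, with expert items given extra weight.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    is_expert: bool
    votes: int  # community votes the post has received

EXPERT_BOOST = 3.0  # assumed multiplier that gives editorial content its prominence

def score(post: Post) -> float:
    """Community votes, scaled up when the author is a featured expert."""
    weight = EXPERT_BOOST if post.is_expert else 1.0
    return weight * post.votes

feed = [
    Post("UCLA professor", is_expert=True, votes=12),
    Post("community member", is_expert=False, votes=30),
    Post("another member", is_expert=False, votes=8),
]

# Expert content floats to the top even when users out-vote it.
for post in sorted(feed, key=score, reverse=True):
    print(f"{post.author}: {score(post):.0f}")
```

Even a modest boost like this keeps the editorial contributions above the fold, which is exactly the kind of prominence I’m describing.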

Is this good or bad for a structured institution?

It seems to work quite well when an editorial position guides the process of a collaborative publication, as it did for Assignment Zero’s project with Wired last year. PeopleJam is another network that has taken a similar approach, with the hope of helping you help yourself. Content is aggregated around experts in a fashion that is authoritative, yet still integrated with the community at large. And I think that, despite its potential for leading people into confirmation traps, it could also work for Big Think, specifically in its targeting of the mature and active demographic.
