New ground rules for aggregators

Moves by the German government to force aggregators to pay publishers may be a pipe dream, but will self-regulation provide the answer?

There’s a compelling scene in Page One, the revealing documentary about the machinations of the New York Times, where media reporter David Carr holds up a print-out of the homepage of news aggregation website Newser.com.

He explains how the site brings together content from multiple news outlets – something numerous new media outlets do – and then steals the show by holding up another print-out, one that shows what that site would look like if all the traditional news outlets went out of business. What’s left is a page full of cut-out holes.

This week the German government revealed plans to legislate charges for content aggregators and search engines, forcing them to pay publishers whose content they highlight, even when it might be only a short excerpt. Newspaper executives in Germany are already talking about the positive impact it could have on their bottom line.

Unsurprisingly, this idea has strong support from publishers, but has been slammed by Google, with Google chairman Eric Schmidt saying he fears it will “slow down the development of the internet”.

It’s been a little over a year since Page One was released, but David Carr is still looking for answers in a new media world dominated by plunging advertising revenue, a public unwilling to pay for information, and a proliferation of online media that is only too willing to ride off the back of the producers of original content.

Over the weekend Carr reported on two new approaches under consideration to give credit to original content producers.

One comes from a group of media heavy hitters who are working on a set of best practices for linking, summarising and aggregating. The other is the Curator’s Code, a site that suggests content aggregators slap a symbol alongside content to show that it came directly from another source, and an alternative symbol for a source that inspired an article.

The Curator’s Code developer, Maria Popova, says the discovery of information is a form of intellectual labour, telling the New York Times: “When we don’t honour discovery, we are robbing somebody’s time and labour.” It all sounds very polite.

Of course this is a very American discussion, coming from a country that laughed loudly at Rupert Murdoch when he called aggregation sites, including Google News, “parasites” and “plagiarists” back in 2009. Both of the proposals discussed by Carr are self-regulatory options that couldn’t be further from what Germany is proposing.

But filling the web with some kind of cookie trail of symbols could prove a massive challenge. As Wired co-founder Kevin Kelly has said, the internet is the world’s greatest copy machine.

And the truth is, no matter how frustrating it might be for experienced journalists, people often care more about the information itself than who actually produced it or even passed it on to them.

The role of the curator is what’s in question here. New research from Pew in the US has found one in five Americans have unfriended people on social networks because they didn’t agree with their views or politics, suggesting a trend towards filtering out uncommon views.

A raft of sites is helping to grow this echo chamber, including Twitter, Tumblr, and new kid on the block, Pinterest.

Kevin Kelly calls sharing the “primary verb of this world”, and as Richard Waters explains in his FT piece today, “frictionless” sharing has become the internet buzzword of the moment.

But by making sharing so easy, are we also making it harder to give credit where credit’s due?
