[[Image:Gwu-smpa.jpg|frame|left|GWU SMPA Building]]

[[Image:Ccj-logo.gif|frame|left|[http://www.concernedjournalists.org CCJ HOME]]]
=AN URGENT DISCUSSION:<br><br>"From Gatekeeper to Information Valet:<br><br>Work Plans for Sustaining Journalism"=

=Scott Karp: News Aggregation - Taking Back Control of News Distribution to Sustain Journalism=

===Wed., May 27, 2009 / 10 a.m.-4 p.m. / The George Washington University / Jack Morton Auditorium / 805 21st Street NW / Washington D.C.===
 
 
[http://newshare.com/wiki/index.php/Gwu-program PAYMENTS / PRIVACY / PERSONALIZATION / ADVERTISING / AGGREGATION / COLLABORATION / RESEARCH]

[[Image:Rji-ideas.jpg|thumb|150px|right|[http://rji.missouri.edu/image-library/stories/new-building/index.php RJI PICTURED]]]
 
<hr><h3>[http://newshare.com/wiki/index.php/Gwu-program VIEW PROGRAM] / [https://extweb.missouri.edu/NewWebReg/Login.aspx?uid=3&pid=112389 REGISTER NOW] / [http://newshare.com/wiki/index.php/Gwu-participants WHO'S PARTICIPATING?] / [http://tinyurl.com/cymuke VIEW/PRINT TWO-PAGE FLYER]</h3><hr>

[[Image:Gwu-morton.jpg|frame|left|Jack Morton Auditorium in use]]
The key to developing a new model for sustaining journalism is understanding how it was sustained in the 20th Century. Paid subscriptions have never sustained journalism -- subtract the cost of circulation marketing, production, printing, and delivery, and subscription revenue at most newspapers generated little if any profit (if not a loss). What sustained journalism was monopoly control over distribution in a local market, which gave newspapers robust pricing power for advertising. Control of distribution is what made classifieds the highly profitable cash cow that paid for journalism.

On the web, the big mistake newspapers made was not failing to charge for content (since the marginal cost of distribution is near zero), but rather ceding control of distribution to aggregators like Google. Google is on track to generate as much revenue as the entire newspaper industry combined, and that is entirely a function of control over content distribution, i.e. the "package."
As [http://newsosaur.blogspot.com/2009/05/what-would-google-do-about-newspapers.html Bill Grueskin summed it up recently]: "...people used to buy newspapers to get disparate chunks of information (sports scores, movie times, local-government coverage, weather forecasts) that papers provided, yet those readers were effectively subsidizing the entire newsroom. By atomizing content, the Web makes each story instantaneously and ubiquitously accessible, meaning newspapers have gone from the profitable front end of the distribution chain to the unprofitable back end."

Without control over distribution, news sites become mere content hosts in a hyper-competitive content market, where ad rates are pushed down by an over-abundance of supply.

The key to sustaining journalism is to return news organizations to the profitable front end of the distribution chain, and that can only be achieved by creating new packages of content with high consumer value, i.e. aggregation.

The music industry provides an important lesson about aggregation in a digital media world. Apple's iTunes gave consumers a way to ignore music industry packages, i.e. albums, and instead create their own playlists out of individual songs from any album. In the same way, news consumers have been trained by aggregators to expect their news to come from a rich diversity of sources.

News companies will likely not generate significant revenue by continuing to package only their own content -- regardless of whether they charge for that content -- because that will leave them still at the unprofitable back end of distribution.
  
The only way for news organizations to generate the kind of Google-like profits that can sustain journalism is to build a news aggregator that draws from every news source on the web and that exceeds in quality every other news aggregator on the web. Such an aggregator could take back control over news distribution from Google. Eric Schmidt himself admits that Google's algorithms can't handle news well.
But what could effectively aggregate all the highest quality news on the web is the human judgment of professional journalists -- news organizations' greatest underleveraged asset. A news aggregator powered by the collective news judgment of every journalist in the world could become a major consumer destination -- the Hulu of news.
  
Such an aggregator could then take the final step that Google cannot -- share revenue robustly with news organizations, to pay for the journalists who create the high-quality content that drives the value of the news aggregator, and whose judgment powers the aggregation.

Google will likely generate over $30 billion in advertising revenue this year. News organizations can get a large share of that revenue not through a beg-and-threaten strategy, but by collectively building a better product.

- Scott Karp is the CEO and co-founder of [http://www.publish2.com Publish2], a platform for collaborative journalism.
 

Latest revision as of 23:26, 25 May 2009
