Saturday, May 16, 2009
Champions (Again)
MediaWeek (Vol 2, No 20): HarperCollins, NYTimes, Long Tail
“Change is easier for some people than for others,” she said. "You know how some people are hoarders and they don't like to throw anything out? I'm the opposite: I get this weird thrill from throwing everything out and having nothing." Ms. Stier is the head of digital marketing at HarperCollins, as well as the associate publisher of HarperStudio, the small imprint there whose stated mission since it formed last spring has been to question conventional industry wisdom concerning advances and returns, and to experiment with untested methods of promotion. Ms. Stier is among the most visible and energetic believers in the idea that publishers must stop relying on critics, journalists and talk show hosts for coverage, and instead start finding creative ways of reaching readers directly through emerging social media tools like Facebook, Twitter, Tumblr, etc.

There is a lot of debate about the merits of the central arguments made by Chris Anderson in The Long Tail - that is, whether there is one. A new report (summarized here) looks at P2P and appears to debunk the Long Tail concept:
“I’ve been running down the halls screaming ‘fire’ for a couple of years now, and you know, I feel like it’s only recently that people are starting to hear me,” Ms. Stier said. “It’s hard for me because I’m up here in my own little beehive of exciting stuff, and I forget that there’s a world of people out there in the rest of the industry who don’t believe. But there are definitely pockets of people who do, and those pockets are growing more and more, faster and faster, which is good.”
A study of P2P music exchanges to be revealed this week suggests that the ailing music business is shunning a lucrative lifeline by refusing to license the activity for money. Entitled "The Long Tail of P2P", the study by Will Page of performing rights society PRS For Music and Eric Garland of P2P research outfit Big Champagne will be aired at The Great Escape music convention tomorrow. It's a follow-up to Page's study last year, which helped debunk the myth of the "Long Tail".

Via Mr Nash, here is the link to the whole document. (Link)
Page examined song purchases at a large online digital retail store, which showed that out of an inventory of 13 million songs, 10 million had never been downloaded, even once. It suggested that the idea proposed by WiReD magazine editor Chris Anderson, who in 2004 urged that the future of business was digital retailers carrying larger inventories of slow-selling items, was a Utopian fantasy.
The brainiacs at the NYTimes are showing off some of their toys. (Nieman)
The R&D group is obsessed with the ability to seamlessly transition among web-enabled gadgets. They’re not convinced that the future will land on a single, multipurpose contraption — like some sort of Kindle meets Chumby meets Minority Report. Instead, they predict consumers will connect to the Internet through their cars, on their televisions, over mobile networks, and in traditional browsers, while expecting those devices to interact and sync with each other.

Reports in the Guardian from the Journalism Enterprise and Experimentation unconference in sunny Birmingham looked at hyper-local success stories (Guardian):
A session in a break-out room featured James Hatts talking about the London SE1 Community website. James was quite candid about getting different levels of support for the initiative from different organisations. Their patch covers Southwark and Lambeth. Southwark Council have, it seems, for years treated them as a news outlet on an equal footing with the traditional local media. By contrast, SE1 have found it difficult at times to even get Lambeth Council to send them press releases. Similarly, Hatts said that whilst Scotland Yard were forthcoming with information about serious crime in the area, the local police forces were more cagey.

The International Coalition of Library Consortia has weighed in on the OCLC data usage guidelines (ICOLC):
The member consortia endorsing this ICOLC statement add our recommendation to others in the library community calling for OCLC to withdraw the proposed policy and start anew to formulate a record use policy. Most notably we add our support to the January 30, 2009 Final Report to the ARL Board by the Ad Hoc Task Force to Review the Proposed OCLC Policy for Use and Transfer of WorldCat Records. It includes an extended review of the policy and six recommendations. We concur with the ARL report that OCLC develop a new policy based on widespread member library participation with a clear set of goals and explanations as to how the policy will achieve these goals and how member libraries will be affected operationally and legally.

Carolyn Pittis from HarperCollins uses BookBusiness magazine to argue publishers must be less reclusive in order to thrive in the social economy (BookBus):
So, how do book publishers add visible value for their authors and consumers in new ways? What needs to change, and perhaps more importantly, what needs to stay the same? As both a publishing “insider” and a frequent reader of publishing’s critics, I am often struck by how the public discussion of these questions is fundamentally different than private ones, how the focus of those inside publishing houses is different from those in the blogosphere. Beyond publishers’ walls, the tremendous value editors and their publishing colleagues provide in helping an author create a publishable work is often unknown. Yet, the vast majority of publishing time and energy go into just this activity—the core of what publishers do.
In the blogosphere, some opine about how hidebound and irrelevant publishers now are, how slow to change and resistant to risks. It makes good copy sometimes—I know I always bite on the most critical headlines first! Rare are the critics, however, who have concrete, insightful, specific suggestions of how to evolve publishing without throwing the baby out with the bathwater. Black-and-white thinking and talk of violent revolution distract many from the natural evolution that is both occurring and will likely be more sustaining for the “book” economy in the long run.
Friday, May 15, 2009
Summary of Making Information Pay
Who was reading in 2008
- 45% of Americans read a book
- The average age of those who read a book was 44
- 58% of readers are women
- 32% of readers are over the age of 55
- The average reader spends 5.2 hours reading per week vs. 15 hours online and 13.1 hours watching TV (In 2008, going online surpassed watching TV as a primary activity)
Who was buying books in 2008
- 50% of Americans over 13 bought a book
- The average age of the most frequent book buyers was 50 yrs old
- 57% of book buyers are female and they buy 65% of books (i.e. women buy books and they buy in volume)
- 67% of books are bought by people over 42; Gen X buys 17% of books; Gen Y buys 10%
- Of books purchased by those who earn $100K or more, mystery and detective fiction represent 16%, juvenile 13%, romance 6%, thrillers 4% and comics and graphic novels 4%
- 41% of all books are purchased by those who earn less than $35K
- The average price of a book purchased last year was $10.08
- 31% of all book purchases are impulse buys
Pelecanos Interviewed by LATimes
George is on tour and will also be autographing at BookExpo.

GP: I've been working in adult prisons and juvenile prisons for some time. The prison in the book is based on a place called Oak Hill here in D.C., which is where all the juveniles are sent if they do time. I was out there one day -- I was kind of walking around, I had full access, the boys were in class, and I went into one of the kids' cells -- it just kind of hit me. It was a 6-by-9 cell, it's basically a cot and an open commode sitting in the middle of the room, and there's a dirty piece of plexiglass on the wall that functions as a window, but it's so dirty that you can't see out of it, and very little light gets in. I just started thinking: What's it like? What's it like for a kid to go to jail, and also what's it like for his family? How does this tear them all apart?
I'd been very interested for a long time in incarceration reform. Here in Washington we have a new guy that's been at it for several years now. He's done a tremendous job of trying to change things so that these incarcerations don't just rip up families but neighborhoods and our city, because that's what happens. That's why the book is split into two parts, so you see the way the system was run before, and then these guys go back later in the book when the jail is getting ready to be torn down, and they see how much it's changed for the better.
Thursday, May 14, 2009
JISC Study on Higher Education
The impetus for change will come from students themselves as the behaviours and approaches apparent now become more deeply embedded in subsequent cohorts of entrants and the most positive of them – the experimentation, networking and collaboration, for example – are encouraged and reinforced through a school system seeking, in a reformed curriculum, to place greater emphasis on such dispositions. It will also come from policy imperatives in relation to skills development, specifically development of employability skills. These are backed by employer demands and include a range of ‘soft skills’ such as networking, teamwork, collaboration and self-direction, which are among those fostered by students’ engagement with Social Web technologies.
Higher education has a key role in helping students refine, extend and articulate the diverse range of skills they have developed through their experience of Web 2.0 technologies. It not only can, but should, fulfil this role, and it should do so through a partnership with students to develop approaches to learning and teaching. This does not necessarily mean wholesale incorporation of ICT into teaching and learning. Rather it means adapting to and capitalising on evolving and intensifying behaviours that are being shaped by the experience of the newest technologies. In practice it means building on and steering the positive aspects of those behaviours such as experimentation, collaboration and teamwork, while addressing the negatives such as a casual and insufficiently critical attitude to information.
Tuesday, May 12, 2009
Google Print: A Numbers Game
The following post is written by Andrew Grabois who worked with me at Bowker and has (among other things) compiled bibliographic stats out of the Books In Print database for a number of years. His contact details are at the bottom of this article.
On February 6th, Google announced that the Princeton University library system agreed to participate in their Book Search Library Project. According to the announcement, Princeton and Google will identify one million works in the public domain for digitization. This follows the January 19th announcement that the University of Texas libraries, the fifth largest library in the U.S., also climbed on board the Library Project. Very quietly, the number of major research libraries participating in the project has more than doubled to twelve in the last two years. The seven new libraries will add millions of printed items to the tens of millions already held by the original five, and more fuel to the legal fire surrounding Google’s plan to scan library holdings and make the full texts searchable on the web.
The public discussion has been mostly one-sided, with Google supporters trying to hold the high moral ground. Their basic argument goes something like this: The universe of published works in the U.S. consists of some 32 million books. They argue that while 80 percent of these books were published after 1923, and, therefore, potentially protected by copyright, only 3 million of them are still in-print and available for sale. As a result, mountains of books have been unnecessarily consigned to obscurity.
No one has yet challenged the basic assumptions supporting this argument. Perhaps they’ve been scared off by Google’s reputation for creating clever algorithms that “organize the world’s information”. This one, though, doesn’t stand up to serious scrutiny.
The figures used by supporters of the Library Project come from a 2005 study undertaken by the Online Computer Library Center (OCLC), the largest consortium of libraries in the U.S. According to the OCLC study, its 20,000 member libraries hold 31,923,000 print books; the original five research libraries participating in the Google library scanning project hold over 18 million.
OCLC did not actually count physical books. They searched their massive database of one billion library holdings and isolated 55 million catalog records describing “language-based monographs”. This was further refined (eliminating duplicates) to 32 million “unique manifestations”, not including government publications, theses and dissertations. The reality of library classification, however, is such that “monographs” often include things like pamphlets, unbound documents, reports, manuals, and ephemera that we don’t usually think of as commercially published books.
The notion that 32 million U.S. published books languish on library shelves is absurd. Just do the math. That works out to more than 80,000 new books published every year since the first English settlement in Jamestown in 1607. Historical book production figures clearly show that the 80,000 threshold was not crossed until the 1980’s, after hovering around 10,000 for fifty years, from 1910 to 1958. The OCLC study showed, moreover, that member libraries added a staggering 17 million items (half of all print collections) since 1980. That averages out to 680,000 new print items acquired every year for 25 years, or more than the combined national outputs of the U.S., U.K., China, and Japan in 2004.
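For anyone who wants to check that back-of-envelope arithmetic, here is a minimal sketch in Python using only the figures already cited above (the 32 million and 17 million totals from the OCLC study, and the year ranges used in the text); nothing else is assumed:

    # Rough check of the figures cited above.
    TOTAL_BOOKS = 32_000_000              # OCLC "unique manifestations" (2005 study)
    YEARS_SINCE_JAMESTOWN = 2005 - 1607

    print(round(TOTAL_BOOKS / YEARS_SINCE_JAMESTOWN))   # ~80,400 books per year

    ITEMS_ADDED_SINCE_1980 = 17_000_000   # half of all print collections, per OCLC
    print(round(ITEMS_ADDED_SINCE_1980 / 25))           # 680,000 items per year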
Not only will Google have to sift through printed collections to identify books, and then determine if they are in the public domain, but they will also have to separate out those published in the U.S. (assuming that their priority is scanning U.S.-based English-language books) from the sea of books published elsewhere. The OCLC study clearly showed that most printed materials held by U.S. libraries were not published in the U.S. The study counted more than 400 languages system-wide, and more than 3 million print materials published in French and German alone in the original Google Five. English-language print materials accounted for only 52% of holdings system-wide, and 49% in the Google Five. Since more than a few works were probably published in the United Kingdom, the total number of English-language books published in the U.S. will constitute less than half of all print collections, both system-wide and in Google libraries.
So how many U.S.-published books are there in our libraries? Annual book production figures show that some 4 million books have been published in the 125 years since figures were regularly compiled in 1880. If, very conservatively, we add an additional 1.5 million books to cover the pre-1880 years, and another 1.5 million to cover books published after 1880 that might have been missed, we get a much more realistic total of 7 million.
Using the lower baseline for published books tells a very different story from the dark one (that the universe of books consists of works that are out-of-print, in the public domain, or “orphaned” in copyright limbo) told by Google and their supporters. With some 3 million U.S. books in print, the inconvenient truth here is that 40% of all books ever published in the U.S. could still be protected by copyright. That would appear to jibe with the OCLC finding that 75% of print items held by U.S. libraries were published after 1945, and 50% after 1974.
If we’re going to have a debate that may end up rewriting copyright law, let’s have one based on facts, not wishful thinking.
Andrew Grabois is a consultant to the publishing industry. He has compiled U.S. book production statistics since 1999. He can be reached at the following email address: agrabois@yahoo.com
Clarification update from Andrew: My post is not intended to be a criticism of the OCLC study ("Anatomy of Aggregate Collections: The Example of Google Print for Libraries") by Brian Lavoie et al, which is a valuable and timely look at print collections held by OCLC member libraries. What I am attempting to do here is point out how friends of the Google library project have misinterpreted the paper and cherry-picked findings and conclusions out of context to support their arguments.
Monday, May 11, 2009
Houghton Mifflin Harcourt Loses Credit Rating
Moody’s said that it took into account “the business risk and competitive position of the company versus others within its industry; the capital structure and financial risk of the company; the projected financial and operating performance of the company over the near-to-intermediate term, and management’s track record and tolerance for risk”.
Last month, Moody’s downgraded some of HMH’s debts to Caa3 from Caa1, and put it on a negative outlook. The move meant that it classed the company’s debts as high risk.
Earlier this year, long time HMH executive and current CEO Tony Lucki retired and was replaced by Barry O'Callaghan.
Sunday, May 10, 2009
MediaWeek (Vol 2, No 19): Newspapers, BookClubs, Television, George Orwell
In the Internet era, many sectors of American media have been re-enacting their at first complacent and finally panicked behavior of 60 years ago. Few in the entertainment business saw the digital cancer spreading through their old business models until well after file-sharing, via Napster, had started decimating the music industry. It’s not only journalism that is now struggling to plot a path to survival. But, with all due respect to show business, it’s only journalism that’s essential to a functioning democracy. And it’s not just because — as we keep being tediously reminded — Thomas Jefferson said so. Yes, journalists have made tons of mistakes and always will. But without their enterprise, to take a few representative recent examples, we would not have known about the wretched conditions for our veterans at Walter Reed, the government’s warrantless wiretapping, the scams at Enron or steroids in baseball.

A few months ago I made a similar observation (PND):
Last week, I was discussing this topic with an acquaintance who lives in a fairly affluent part of Central New Jersey. He noted that, in a wide swath covering eight to ten townships and a number of counties, he wasn’t aware of more than one journalist assigned to that market from the larger state-wide newspapers. In Hoboken (regional HQ for PND), where mayoral and city council budget incompetence has seen our property taxes increase 50% in the past six months, there is rarely any local media coverage or any attendance at city business meetings by traditional media. And forget investigative reporting - even in a state where you could throw a rock in any direction and hit a shady politician. The lack of journalistic attention means that one of the mainstays of democracy (the fourth estate) is eroded, and this is seen starkly in Hoboken, where private citizens are forced (on their own initiative) to file freedom of information requests to gain access to basic public interest materials such as meeting minutes and financial statements.
And Rupert Murdoch was vocal this week, becoming the self-interested shill for paid newspaper content (Guardian).

Nature Publishing is launching Nature Education, and their first project is something named Scitable (ZDNet):
The first deliverable from the Education group is Scitable, an incredible combination of social media and a vast library of articles “commissioned, edited, and reviewed by NPG editors.” While Nature Education intends to expand Scitable’s offerings to include cellular and molecular biology and ultimately tackle the physical sciences, their initial focus has been on genetics. Exploring Scitable makes it very clear that this strategy of sticking with a single area of expertise and dealing with it expertly and in-depth makes a great deal of sense. I’m not a geneticist, but I spent enough years in the publish or perish world of biomedical research to know that they nailed this. The content is accessible, deep, relevant, and understandable. Entire college courses could be taught around this material and there is more than enough content to keep high school students digging deeper into their Advanced Placement Biology courses or to build introductory genetics curricula.

Watch the first series of Wallander stories on PBS happy in the knowledge that there will be a second series - and they won a BAFTA. (TheBookseller)

Publishing Trends took a detailed look at Book Clubs (PublishingTrends):
Online retailers’ deep discounts, however, have lured away readers who might once have joined book clubs because they wanted cheap books, and any book can be found online, so today’s book clubs must offer something beyond price and selection. For PBC, that means an active online community; in fact, the club is online only. To expand its reach, PBC has linked with 37 “Alliance Partners,” including the Huffington Post and Daily Kos, and e-mails these organizations’ members about new books once a month. “Those people may buy books through us or may go on to Amazon,” Rosen says, claiming, “We just want to help these books sell, the books to do well, and the authors to do well, and this is our mission.”

A look at how, in failing health, George Orwell was able to finish 1984 (Observer):
This is one of Orwell's exceedingly rare references to the theme of his book. He believed, as many writers do, that it was bad luck to discuss work-in-progress. Later, to Anthony Powell, he described it as "a Utopia written in the form of a novel". The typing of the fair copy of "The Last Man in Europe" became another dimension of Orwell's battle with his book. The more he revised his "unbelievably bad" manuscript the more it became a document only he could read and interpret. It was, he told his agent, "extremely long, even 125,000 words". With characteristic candour, he noted: "I am not pleased with the book but I am not absolutely dissatisfied... I think it is a good idea but the execution would have been better if I had not written it under the influence of TB." And he was still undecided about the title: "I am inclined to call it NINETEEN EIGHTY-FOUR or THE LAST MAN IN EUROPE," he wrote, "but I might just possibly think of something else in the next week or two." By the end of October Orwell believed he was done. Now he just needed a stenographer to help make sense of it all.
Fears about the impact of new technology on incumbent businesses - in this case television - are often found to be unfounded. The Economist notes that Digital Video Recorders (DVRs) were supposed to destroy the television advertising model but have had nothing like that impact. In the words of one interviewee, DVRs have become "a hit-saving machine" (The Economist)
Far from being revolutionary, in some ways DVR has made television more stable. With the exception of live events it is broadly true that the most popular programmes are recorded the most. Mr Wakshlag describes it as “a hit-saving machine”. Broadcast television receives a bigger boost from DVR playback than cable television. The device has made it harder to introduce a new television programme, particularly at 10pm when people are likely to be playing back shows they recorded at 8pm or 9pm.
Friday, May 08, 2009
Body Double Twits
Many people will know Mike Hyatt (CEO, Thomas Nelson) is an avid social network user. Why? Because he sees social networking as an important aspect of his job as CEO, and not just to spout off whimsically about this and that but to actively and meaningfully engage both his employees and customers. There would be no chance he would engage a body double twit. Mike often plays first-line customer service rep on Twitter, which is where his activity is significant. If he sees an item having to do with TN he will step in directly and engage with the person or persons who are either seeking help or complaining about something. A body double CEO can never do the same thing in the same way.
Played for laughs, Leaver went even further down the 'we're getting it wrong in social networking' road by telling us that the guy twit responsible for twittering on all things pregnancy - for their Good Expectations titles - recently got a response from someone that read, "I'm having Braxton Hicks contractions right now, what do I do?" It was amusing, but am I laughing because I'm imagining the helplessness of the twit or because I can't believe they've got this so wrong? If you are going to bother using social networking, be like Mike and make it real.
Thursday, May 07, 2009
Murdoch on the Kindle and Paid Content
If it is possible to charge for content on the web, it is obvious from the Journal’s experience. We are now in the midst of a [proper] debate over the value of content and it is clear to many newspapers the current model is malfunctioning. We have been at the forefront of that debate and you can confidently presume that we are leading the way in finding a model that maximizes revenues and returns for our shareholders. I can assure you we will not be feeding our content rights to the fine people who created the Kindle. We will control the prices for our content and we will control the relationship with our customers. Any device maker or website which doesn’t meet these basic criteria on content will not be doing business long-term with News Corporation.

PS: Barely a mention of HarperCollins. A tough 3rdQ means HC will need a very strong 4thQ to finish the year in positive territory. (PublishersWeekly)
Too many content creators have been passive in the face of obvious violations of intellectual property rights. We rightly hold China and other countries accountable on this important issue. But the violation of these rights is rampant on the web in our own country. Our content is extremely valuable and the violators have recognized that value.
Within the company itself, the very bright people we have at our Slingshot Laboratories are devising clever ways to monetize the content of some of our long established print properties. We will be matching their contemporary expertise and the creation of communities within our traditional -- with our traditional expertise in the creation of content. The [current days of the Internet] will soon be over.
Big Kindle Goes to School (Shrug)
For Amazon, however, Big Kindle will still be giggled over by those in Cupertino. Will the Apple tablet be any better? It is certain to have a far better form factor. Whether it will be expressly suited to delivering educational content in a dynamic and forward-thinking manner remains to be seen. That is certainly not within the capabilities of Big Kindle.
Several universities participated in the launch of Big Kindle, and the hype around the launch hid a troubling question: why were these schools in bed with the retailer at all? On a listserv I asked,
Don't the participating universities appear to be endorsing a hardware platform (not to mention a specific retail channel)? You could argue (possibly strongly) that allowing the bookstore to be managed by B&N or Follett, or even the adoption of college textbooks themselves, is little different; however, doesn’t the changed paradigm suggest an opportunity to operate on a more open field of play, or is this more of the same, leading to more student frustration, higher prices and deadened innovation in education?

In other words, why would the universities want to continue the (essentially) old way of doing business when most observers believe we are on the cusp of a renaissance in educational learning? The Kindle doesn't do multimedia, it doesn't do color and, most importantly, it doesn't do networking because the Kindle is a closed system. This is a short-sighted collaboration between schools and Amazon that doesn't really suggest any major change.
As I thought about the Big Kindle development, it struck me that there could be an even more interesting possibility. Content media companies developing their own hardware delivery platforms are suddenly growing like weeds, from NewsCorp to Hearst, and there could be an opportunity for collaboration between the news/magazine world and education. What if CourseSmart or Safari joined one of these efforts? That would be a far more interesting and potentially game-changing development than selling textbook content on a Big Kindle. By definition, the hardware to support a digital magazine will be capable of all the aspects necessary in delivering a changed educational experience.
Will that happen? As it turns out, some of the partners involved in CourseSmart are also participating in the Big Kindle rollout, but what is CourseSmart if it isn't a new way to deliver educational materials to learners? That doesn't seem to be what students will be getting with Big Kindle. There may be all kinds of reasons why CourseSmart (or even a publisher itself) won't be launching a device; the predominant one may be the amount of print revenue tied to Amazon. Therefore, from my perspective, Big Kindle and education is more about marketing hype than anything fundamental. We await more developments with keen interest.
Tuesday, May 05, 2009
Library Associations Address Issues in Google Settlement
On how important the database may become:
Notwithstanding these deficiencies in the ISD, an institutional subscription will provide an authorized user with online access to the full text of as many as 20 million books. Students and faculty members at higher education institutions with institutional subscriptions will be able to access the ISD from any computer -- from home, a dorm room, or an office. Accordingly, it is possible that faculty and students at institutions of higher education will come to view the institutional subscription as an indispensable research tool. They might insist that their institution’s library purchase such a subscription. The institution’s administration might also insist that the library purchase an institutional subscription so that the institution can remain competitive with other institutions of higher education in terms of the recruitment and retention of faculty and students.

And this in regard to market power:
However, as likely consumers of this essential research facility, the Library Associations cannot overlook the possibility that the Registry or Google might abuse the control the Settlement confers upon them. Abuse of this control would threaten fundamental library values of access, equity, privacy, and intellectual freedom.

This with respect to pricing:
Google will have the incentive to negotiate vigorously with the Registry to set the price of the institutional subscription as low as possible to maximize the number of authorized users with access to the ISD. Nonetheless, Google’s business model, at least with respect to the institutional subscription, may change, and at some point in the future it may seek a profit maximizing price structure that has the effect of reducing access.

Significantly, the predominant model for pricing of scientific, technical, and medical journals in the online environment has been based on low volume and high prices. Major commercial publishers have been content with strategies that maximize profits by selling subscriptions to few customers at high cost. Typically these customers are academic and research libraries. Therefore, the Registry and Google may seek to emulate this strategy in the market for institutional subscriptions.

On privacy:
Evidently, in the Settlement negotiations the class representatives insisted on these measures to protect the security of digital copies of their books; but no one demanded protection of user privacy. Users of the services enabled by the Settlement also cannot rely on competitive forces to preserve their privacy. In the online environment, competition is perhaps the most powerful force that can help to insure user privacy. If a user does not like one search engine firm’s privacy policy, he can switch to another search engine. Similarly, a user has many choices among online retailers, email providers, social networks, and Internet access providers. The competitive pressure often forces at least a minimal level of privacy protection. However, with the services enabled by the Settlement, there will be no competitive pressure protecting user privacy.

They worry about intellectual freedom and censorship:
While Google on its own might not choose to exclude books, it probably will find itself under pressure from state and local governments or interest groups to censor books that discuss topics such as alternative lifestyles or evolution. After all, the Library Project will allow minors to access up to 20% of the text of millions of books from the computers in their bedrooms and to read the full text of these books from the public access terminals in their libraries.

They also address issues with new affiliated services:
Although the Settlement permits the Registry to license the rights it possesses to third parties such as Amazon, the Settlement does not require it to do so. Nor does it provide standards to govern the terms by which the Registry would license these rights. This means that the Registry could refuse to license the rights to Google competitors on terms comparable to those provided to Google under the Settlement. The Registry, therefore, could prevent the development of competitive services.
Monday, May 04, 2009
Remember the Book Clubs?
From this article:
He views PBC not as a mere book club, but as a book community. Members can discuss their views online in community discussion forums and read exclusive content from highly regarded writers and journalists. PBC also offers a unique book-reading and charity-giving combination; books are selected for their liberal bent by a renowned editorial board (members include Michael Chabon, Dave Eggers and Erica Jong) and $2 of the proceeds from each book sale go to one of the participating nonprofits.

And this (see my post, Silos of Curation, from last week):
"We don't like to think of ourselves as pushing our books on readers; we like to think that we are curating and recommending a selection of books for our readers," Ms. Siegel said.
Direct Brands spent $51.8 million on advertising in 2008, according to TNS Media Intelligence estimates, excluding internet data -- a decrease of 18% over 2007.
Sunday, May 03, 2009
MediaWeek Report (Vol 2, No 19): Blog Roundup: Week 18 - Schlager, Exact Editions, O'Reilly, Booksquare
For too long–decades now, really–reference publishers have pumped out a cascade of books (and now databases) but done very little to address a fundamental problem: discoverability. Reference books have always required a conduit–the librarian–to be used properly and fully, because their contents don’t show up in any card catalog. A student writing a paper about the Battle of Gettysburg has no idea that the multivolume encyclopedia buried away in a far corner of the library has wonderful information that can tell her everything she needs to know, unless a librarian is there to help her, and unless that librarian himself is familiar with that set. As a result, as studies have shown, print reference sections in all libraries have been gathering dust, day by day, year by year, decade by decade. The familiar library convention discussion group topic–”Is Print Reference Dying?”–is both mordantly funny and also terrifyingly legitimate. The truth is that lots of print reference is still published and bought, but most of the new stuff joins its ancestors–it sits on a shelf, unused.
The situation is only modestly better with electronic reference. Tech-savvy students may indeed be more likely to stumble upon resources that they can use in this setting, but “stumble” is still the operative word. First, they have to navigate a myriad of unique, siloed databases, with inscrutable names and idiosyncratic search interfaces. Then, they have to be careful enough to pick the search results gems from what may be a torrent of hits.
Adam Hodgkin reflects on London Bookfair as well as thoughts about the meaning of the Google Book Agreement (ExactEditions):
If we think of this rather large and hangar-like hall being occupied by the books that are currently the focus of the commercial market for books, we can also imagine a skyscraper of 30 or perhaps 40 stories being built above the Earls Court Stadium. The stacked stories of this skyscraper will each contain another 300,000 mostly older books, but this time all of them ordered, regimented and deployed in total silence and precise obedience with no noisy haggling or discordant trading. Such a skyscraper would be a serious obstacle on the flight path for planes approaching Heathrow, but its towering shadow does give us an idea of the relative scale of the Google Books Search project as set against the current (this year, last year) output of publishers in the English language. The 10,000,000+ books that Google will have in its arsenal when the Google Book Search library goes live in a year or two will completely dwarf the current activity. The 40-odd stories of the Google Books skyscraper will not need the traditional tools and mechanisms of the book trade. The transactions, accessibility, searchability, and reading of these millions of books will all be a matter of database and web-driven activity. Commercial arrangements will be settled by the Books Rights Registry or the publishers' agreements with Google and the commercial transactions and access rules will be executed by Google or its contracted distributors. There will be very little need for human intervention, except at the periphery: when authors, agents and publishers decide to put things into the system, or, at the consumer edge, when readers, searchers, librarians or consumers decide that they wish to have some form of access to the repository. Of course Google will also not need a skyscraper at all. The few hundred terabytes, possibly by then one or two petabytes, that may be needed for the Google nearly-complete library in 2012 will comfortably fit in the confines of the whirling, bladed and racked systems, housed in a single standard freight container. We should add a few more trailers to cope with the bandwidth of a billion users, but it is all fitting nicely in the underground loading bay that they have at Earls Court. The efficiency and reliability of the Google system does not require large physical infrastructure. Push on a couple of years, and by 2014 I think one can be sure that Google will have most of the world's published literature in the Google database. How will new books then be working in relation to the 50, 60, 70, 80 stories high skyscraper of previously published but now completely databased and universally accessible digital books?
Brian O'Leary had some thoughts on curation as a way to cut through the increasing clutter of information (Magellan Partners):
But that’s not quite a business model. As my colleague Mac Slocum noted, “The world definitely needs a clear-headed curation advocate, particularly one that links it directly to revenue (this labor of love stuff only goes so far ...).”
Historically, reviewers got paid by newspapers, who relied on some mix of advertising and subscription revenue to create and deliver a comprehensive product. It’s believed that few people read everything, but the “one size fits all” model was accepted by readers and advertisers alike.
Booksellers are paid for curation when they sell something. Unfortunately, the small, independent bookstore whose title mix reflects a niche, a neighborhood or a community is hard to sustain. This curation question came to a head two weeks ago, when Amazon appeared to delist titles with gay and lesbian content.
Joe Wikert posted some video from the CEO Roundtable at the TOC conference earlier this year. It features Bob Young (Lulu), Mike Hyatt (Thomas Nelson), Tim O'Reilly and Clint Greenleaf (Greenleaf Publishing). (2020 Blog)
Back on the curation theme, Clay Shirky talks about "filter failure" and how collaboration may be the answer to finding the best information and news content (Pub 2.0)
The #swineflu hashtag on Twitter serves as a good point of reference for what Clay Shirky called “filter failure.” The problem is not that there’s a wild abundance of useful information, overloading us with detail, facts, and commentary; the problem is that we don’t have the proper filtering system set up to separate trusted sources and reliable resources from rumors, jokes, misinformation, and ephemera. If those seeking to provide links to reliable information started using a hashtag such as #therealswineflu, it would likely be overtaken — quickly — by tagged content with less value, whatever its source.
So how do we solve filter failure?
We depend on humans to serve as our filters. We do this all the time, when we ask a friend a question, or talk with someone we know who happens to be an expert on a given topic. (I imagine the world’s epidemiologists are fielding a huge number of Facebook messages from old friends this week.)
When it comes to reliable sources for news that breaks on a massive scale, our best sources are likely to be Wikipedia for facts, and journalists for explanation, clarification, context, and meaningful analysis.
Tim O'Reilly discusses a new book project written in (on?) PowerPoint (Radar)
Of course, modularity isn't the only thing that publishers can learn from new media. The web itself, full of links to sources, opposing or supporting points of view, multimedia, and reader commentary, provides countless lessons about how books need to change when they move online. Crowdsourcing likewise.
But I like to remind publishers that they are experts in both linking and in crowdsourcing. After all, any substantial non-fiction work is a masterwork of curated links. It's just that when we turn to ebooks, we haven't realized that we need to turn footnotes and bibliographies into live links. And how many publishers write their own books? Instead, publishers for years have built effective business processes to discover and promote the talents of those they discover in the wider world! (Reminder: Bloomsbury didn't write Harry Potter; it was the work of a welfare mom.) But again, we've failed to update these processes for the 21st century. How do we use the net to find new talent, and once we find it, help to amplify it?
Kassia Krozser expresses some concern over the privacy issues that may arise if the Google Book Settlement goes ahead (Booksquare)
If the settlement is approved, then Google owns lots and lots of readers. We’re locked into the Google service if we want the best possible search results. Yet our concerns were not addressed in the settlement. One such worry is the privacy factor.
Every move we make online is tracked and traceable. Generally, this is not a concern; so much data is being crunched that individuals are rarely singled out for close examination. But this audit trail can be used against us, and I hadn’t really considered the implications of my online activity in light of GBS until I read a recent call from the Electronic Frontier Foundation.
Physical libraries have long held firm against law enforcement seeking to use customer records against individuals (and it’s just one more reason to love librarians!). What we read should remain private to us. However, once we, as a society, move beyond the physical into the digital, new rules seemingly apply. Now is the time to ensure that the GBS includes consumer privacy protections.
MediaWeek (Vol 2, No 18): Amazon, OCLC, Springer, RFID
We are very excited to announce that more than 25 providers, including CardinalCommerce, Miva Merchant, Magento, ShopVisible, Mercantec, and Zoovy will be supporting Amazon Payments as part of their offerings. In addition, Convio, a software and services provider to the nonprofit community, is integrating Amazon Payments into its fundraising platform to enable its client base to accept alternate payment methods for online donations. Plug-ins for widely-used open source e-commerce platforms, such as osCommerce and ZenCart, are also now available for quick and easy integration with Amazon Payments.

Informa moves domicile for tax reasons and also looks to raise cash via rights issue (TimesOnline):
Informa, the publisher behind Lloyd's List, is to desert Britain for Switzerland to avoid “double taxation” controversially introduced by Alistair Darling for profits earned overseas.
The move, announced as the debt-laden company also said that it would tap shareholders in a £242 million rights issue, will save Informa an extra charge of about £10 million a year, based on last year's profits.
Discussion over RFID use in libraries. Some commentary on placing bibliographic data on the RFID chip, which seems to me to be a disastrous idea. RFID Blog:
The question of “what goes on the tag” has been occupying the list quite a bit this week. Prompted by an enquiry from Helen Jarvis at the University of Kent I wrote a short reply to try and explain my assertion that adding bibliographic data to tags was not necessarily a good idea. My invitation for someone to “tell me I’m an idiot” was enthusiastically accepted by Ivar Thyssen, Export Manager of PV Supa, who suggests that placing any bibliographic data on tags is, in fact, illegal. I must confess that this came as something of a surprise to me but not as much of a surprise as it will be to those libraries that have already begun adding bibliographic data to tags. We’ll have to see how Ivar’s assertions stand up under scrutiny, since he has been invited to provide backing for this claim by Brian Green, Executive Director of the ISBN agency, but if he’s right the rules have just changed again.

OCLC announces a report, Online Catalogs: What Users and Librarians Want. Selected key research findings:
- The end user’s experience of the delivery of wanted items is as important, if not more important, than his or her discovery experience.
- End users rely on and expect enhanced content including summaries/abstracts and tables of contents.
- An advanced search option (supporting fielded searching) and facets help end users refine searches, navigate, browse and manage large result sets.
- Important differences exist between the catalog data quality priorities of end users and those who work in libraries.
- Librarians and library staff, like end users, approach catalogs and catalog data purposefully. End users generally want to find and obtain needed information; librarians and library staff generally have work responsibilities to carry out. The work roles of librarians and staff influence their data quality preferences.
- Librarians’ choice of data quality enhancements reflects their understanding of the importance of accurate, structured data in the catalog.
Just as predictably, the leaders in cloud computing are absent from the list of supporters: Amazon, an online retailer that has successfully branched out into computing services; Google, which is not only a huge cloud unto itself but has built a cloud-computing platform for use by others; Salesforce.com, the biggest provider of software-as-a-service; and Microsoft. Indeed, it was an executive at the world’s biggest software firm, Steven Martin, who first leaked the manifesto, complaining that it had been drawn up in secret. “It appears to us that one company, or just a few companies, would prefer to control the evolution of cloud computing,” he wrote in a blog.

Speculation on which Private Equity firms may be interested in an investment in Springer (FT):
Leading private equity groups are competing to inject about €400m ($530m) of equity into Springer Science and Business Media, the German academic publisher, which is looking to sell a stake of as much as 49 per cent.
Blackstone, CVC Capital Partners and TPG are all considering submitting first-round bids, due by next month.
Other groups mulling over a bid are: Kohlberg Kravis Roberts, Hellman & Friedman, Carlyle, EQT and Providence Equity Partners.
Friday, May 01, 2009
Truthiness a Casualty in Reality
Additionally, there was no significant difference between the groups in thinking Colbert was funny, but conservatives were more likely to report that Colbert only pretends to be joking and genuinely meant what he said while liberals were more likely to report that Colbert used satire and was not serious when offering political statements. Conservatism also significantly predicted perceptions that Colbert disliked liberalism.

See, that's what makes these people so dangerous.
Tuesday, April 28, 2009
Amazon Stanza: This Changes Nothing
- E-Pub - whether e-pub is the correct standard is less important than everyone adopting a common standard. It is also crucial that they adopt the common standard without bastardizing it and creating 'their version' of e-pub. At LBF last week, there were many asides such as 'they have their own version' of e-pub. Publishers need to force this issue collectively and aggressively (and this is not a collusion issue) so that the e-Book supply chain can run as smoothly as possible. The industry is wimpy in its policing of a standard for e-Book content and it needs to get stronger (one small example of what automated policing could look like is sketched after this list). The gyrations publishers are going through to prepare and deliver a variety of formats are stupid and have to stop.
- Interoperability - An e-book purchaser has to be able to take all of his or her purchased content from one e-reader to another with no degradation in experience and no added expense. If I buy a book to read on the Kindle, I must be able to read the same book on a Sony or iRex. That is the minimum a reader should expect. In "bookland" we view our marketplace as the center of the universe; however, there is a much larger related issue around Amazon's data services that impacts a much wider segment of business. Earlier this month in The Economist, the newspaper reported on the "Open Cloud Manifesto" which attempts to set common standards for interoperability across the various cloud computing market offerings. Guess who is refusing to play: Amazon and Microsoft. This means that if I decide to use Amazon for the first three years and then strike a deal with another provider (for lower cost or better service) I am going to have a very difficult time making the switch. Publishers need to require that all retailers enable interoperability across e-readers.
- Archiving - As a reader, I do not want to run the risk of losing access to my e-Books because the vendor stops selling the hardware or sells the division to a competitor. There has to be a mechanism to effectively escrow my content so that I can always get to it. Is it too much to expect a replication of the practice of placing a p-book on my shelf for 50 yrs? Maybe, but why limit my expectations? This also applies when I go overseas: It is unfathomable that I would lose access to something I purchased legally.
- Collaboration - Trade publishers should give serious consideration to collective activity in building a trade version of CourseSmart, which is a JV combining many of the top educational publishers in an effort to leverage e-content. CourseSmart is not anti-competitive; rather, it seeks to provide a level playing field for the delivery of all e-content into the educational marketplace. In the p-world, publishers combined their sales, fulfillment and distribution with other publishers (less so in the US but it is a common practice in the UK and Australia), so why not have a similar program for e-Books? If so, it can and should be done in combination with the other ideas above.
- Fight - Hold back e-Content from retailers that refuse to play by the rules. "Fat chance," you say? Well, think about how hard this one will be to consider as an option in 5 or 10 years. If not now then never, and it really is the only credible option. Getting a CourseSmart offering off the ground or actively helping B&N (strange bedfellows) with their expected e-Book store could be critical here.
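On the standards point above, here is a minimal sketch (in Python) of what an automated check against the EPUB container spec (OCF) could look like. The file name is hypothetical and this only tests the container basics (a ZIP whose first entry is an uncompressed 'mimetype' file declaring application/epub+zip, plus a META-INF/container.xml), not full conformance, but it is the kind of routine policing a publisher or retailer could run on every file it hands off or receives:

    import zipfile

    def looks_like_epub(path):
        # Rough EPUB (OCF) container check -- not a full conformance test.
        with zipfile.ZipFile(path) as z:
            names = z.namelist()
            if not names or names[0] != "mimetype":
                return False  # the 'mimetype' file must be the first entry
            if z.getinfo("mimetype").compress_type != zipfile.ZIP_STORED:
                return False  # 'mimetype' must be stored uncompressed
            if z.read("mimetype").strip() != b"application/epub+zip":
                return False  # and must declare the EPUB media type
            return "META-INF/container.xml" in names

    # Hypothetical usage:
    # print(looks_like_epub("some_title.epub"))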
There are possibly more points upon which the publishing community should be more forceful, especially as they relate to Amazon; however, rather than lament the further consolidation of e-Book power around the Kindle/Amazon (which will get nowhere), think about forcing adoption of standards, processes and terms of service that make for a more efficient market. Those are not necessarily in Amazon's interests, but they should be in the publishers'.
Monday, April 27, 2009
London Days of Futures Past
The writer in this case was looking for some new type of reader he could wrap his article around. What he (and the audience) should have been asking is what will we be reading (perhaps “interacting with” would be more accurate) in five years. And therein lies the issue. If you asked most of the attendees at this week’s London Bookfair, they might have answered, “an electronic version of that” while blithely pointing to something on the shelf. A more sophisticated respondent may have noted that e-book sales are only 1% of total revenues and therefore not enough to induce any radical change.
The capabilities of today’s ebook readers and applications surpass the requirements of today’s content and I don’t think that’s a good thing. In the wider world of publishing, there is a lack of ambition when it comes to how content could be presented or even how the novel could be made over. As a case in point, a panel presentation on the publishing future of 2020 offered nothing to the inventive publisher looking for guidance or direction for the future - just a lot of mild blather about workplace diversification (an important topic, certainly, but not in this context), platitudes, generalities and a presentation about workforce training in which the presenter held (in the air!) his ‘props’ (which was charmingly retro but, to my mind, underlined the lack of collective imagination).
This session was followed by the annual ‘state of British publishing’ presented by numerous heads of house in which platitudes were plentiful and expectations were dulled. Gutenberg has been dead a long time but the eBook has cryogenically resuscitated him while, at the same time, causing us to collectively lose our ability to think strategically about the impact of movable type. I mean, if we’re going to continue to bring him up let’s at least document the path from illuminated manuscripts to today’s self-publishing. Mention Gutenberg and we get a free pass: I won’t have to think hard about the significance of the migration to e-text, e-delivery, e-commerce, e-interaction, etc., etc., because Gutenberg is the God of publishing change and everyone will get the significance. These publishers stand to be the victims of change rather than its master: the millennial monk forced out of a job.
Earlier this year, publishers were as much in the dark about the capabilities of the new Kindle reader as the rest of us. What we should really be considering is how we can re-think our current products and involve ourselves in the technical development of e-readers and apps that take advantage of our new ideas. Force developers to adapt the technology or incorporate new applications that enable the delivery of our uniquely developed concepts. Press developers to work for us (not surprise us) and haul back some of the product development initiative. Perhaps then consumers will find themselves less in love with the ‘Kindle’ (machine) and more enamored of the content.
Saturday, April 25, 2009
MediaWeek (Vol 2, No 16): OCLC, Television, NYTimes
This report is a high level summary of proceedings, outcomes and proposed next steps. Participant biographies, agenda, presentations, related reading and upcoming events can also be accessed from this Symposium website. Purpose of the Symposium: Explore current models for creation, distribution and maintenance of publisher supply chain and library metadata: Are they sustainable? What are the common needs? Are they subject to duplication of effort across communities? To what extent are they shared and interoperable? Explore new paradigms for metadata creation, distribution and maintenance that are more easily shared and interoperable, start upstream and allow metadata to evolve over time, and engage multiple communities in the metadata lifecycle.

Also, OCLC has updated the interface for worldcat.org and it looks pretty spiffy. Here is a link to an entry for The Good Soldier by Ford Madox Ford, one of my favorite books. OCLC also took an important step in creating a set of network services dedicated to the library community. Long in the making, OCLC has put some definition around how it sees its revised role in the library world. (This move by them requires a much longer post.)
OCLC's vision is similar to Software as a Service (SaaS) but is distinguished by the cooperative "network effect" of all libraries using the same shared hardware, services and data, rather than the alternative model of hosting hardware and software on behalf of individual libraries. Libraries would subscribe to Web-scale management services that include modular management functionality. Moreover, libraries would benefit from the network-level integration of numerous services that are not currently part of traditional integrated library systems, e.g., Knowledge Base Integration, WorldCat Collection Analysis, WorldCat Selection, WorldCat Local, etc.
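To make the "network effect" point concrete, here is a toy sketch (entirely my own illustration, not OCLC's actual architecture or API): in the shared, Web-scale model every member library reads and writes one catalog, so one library's cataloging work is immediately visible to all the others, whereas in the hosted model each institution maintains its own copy.

```python
# Illustrative sketch only: a toy contrast between one shared catalog
# (the cooperative "network effect") and per-library hosted instances.

class SharedCatalog:
    """One catalog used cooperatively by every member library."""
    def __init__(self):
        self.records = {}  # record number -> record, visible to all members

    def add(self, number, record, contributed_by):
        self.records[number] = {"data": record, "contributed_by": contributed_by}

    def lookup(self, number):
        return self.records.get(number)  # any member benefits from any member's work


class HostedCatalog:
    """A separate catalog instance run on behalf of a single library."""
    def __init__(self, library):
        self.library = library
        self.records = {}  # visible only to this library

    def add(self, number, record):
        self.records[number] = record


shared = SharedCatalog()
shared.add("0000000001", "The Good Soldier / Ford Madox Ford", "Library A")
print(shared.lookup("0000000001"))  # Library B sees Library A's cataloging immediately
```

The same record added to a HostedCatalog instance would have to be re-created (or synced) by every other library, which is exactly the duplication of effort the Web-scale pitch is aimed at.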
NYTimes (via GigaOm) points out the likely long-term failure of international rights in the age of eBooks (well, not really, but that's at the core of this issue). NYT
So we see that Fictionwise is not the only retailer affected by these outdated licensing practices, and that’s exactly what is at play here. Retailers like Fictionwise and Amazon do not own the content they sell; they simply license it for sale, just as you and I license it when we buy e-books. The archaic licensing system means that retailers have to make separate license deals for each country in which they want to sell e-books. This is something that is very difficult to do, even for the bigger players like Fictionwise and Amazon.
BTW - those DVDs that President Obama gave to Gordon Brown a week ago don't work in the UK. (Guardian) In Newsweek, read about changes in television that could presage changes in publishing:
For decades network TV has been about reach. Programmers traditionally chose shows with broad appeal, the better to get millions of viewers and, in turn, persuade national advertisers to buy those eyeballs. That era is essentially over, and the networks are scrambling to adapt to a fragmented landscape where even popular shows are lucky to pull in 10 million viewers. "They have to rethink what they put on the air, how many hours they'll do it, everything in their playbook," says a former top executive who now produces TV shows.
The LA Times Festival of Books included a discussion of the future of books, featuring the always quotable Richard Nash (LAT):
Nash noted that poetry micropresses are flourishing in this new, hectic publishing environment. With what may be the quote of the festival, he added, "Poetry, like porn, is a harbinger of culture."
A private equity investment firm has taken a bath on NYTimes shares and may be looking to offload them. Who would buy? (Reuters)
Interest has grown as Harbinger, which bought the shares as part of a campaign against the Times to change its business, reels from losses in its funds, the Journal reported.
Harbinger bought the Times stake over a period of weeks in 2007 and 2008, eventually pouring more than $500 million into the publisher. Since then, Times shares have fallen along with other newspapers, which are fighting for their lives as advertising revenue slumps. Harbinger's stake is worth less than $160 million now.
Tuesday, April 21, 2009
Overview of Open Access Book Projects
Here is a sample:
2008 wasn’t the first year that academic book publishers published OA monographs or discovered the synergy of OA and POD (print on demand). But in 2008 the OA-POD model moved from the periphery to the mainstream and became a serious alternative more often than an experiment. We saw OA monographs or OA imprints from Amsterdam UP, Athabasca UP, Bauhaus-Universität Weimar, Caltech, Columbia UP, Hamburg UP, Potsdam UP, the Universidad Católica Argentina, the American Veterinary Medical Association, the Forum for Public Health in South Eastern Europe, the Institut français du Proche-Orient, and the Society of Biblical Literature.
He goes on to name many more programs. He also discusses the Google scanning project:
The settlement could mean that fair use will never be a workable rationale for large-scale book scanning projects, even if Google’s original fair-use claim was strong (as I believe it was). Future scanners may have to pay for permission, in part because Google paid and in part because the new commercial opportunities arising from the settlement itself will weigh against fair-use claims. At the same time, it means that users will have vastly improved online access to books under copyright but out of print (20% previews rather than short snippets), free full-text searching for a much larger number of books, free full-text access from selected terminals in libraries, free text-mining of full texts for some institutional users, and easier priced access to full-text digital editions.
Sunday, April 19, 2009
The Google Settlement's Vast Supply of Content
We don't know what the pricing will be to libraries (and Mike Shatzkin and I are attempting to make some estimates), but the methodology for pricing is unlikely to differ substantially from the way existing databases are offered to public and academic libraries. Allowance will be made for institution budgets, school enrollment, population served, etc., and both Google and the Book Rights Registry (AG & AAP) will be interested in maximizing penetration so that their revenues are optimized. There may be built-in protection against extortionate pricing, since both Google and the BRR want to maximize views, which argues for pricing that achieves the widest potential audience for the database. Library penetration will not be 100%, but it will be high, since libraries - particularly academic and large public libraries - will feel compelled to purchase access to this content to support their patrons. In fact, not having it will cause more consternation and deliberation than paying for it will.
Many libraries will see licensing this content as an opportunity to put their research capabilities on par with the top order of academic libraries. After all, this content comes from a who's who of top-flight public and academic institutions. A small agricultural college in West Texas may never have had the resources to purchase a deep repository of content supporting their core curriculum, but here they have the opportunity to do just that.
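Purely as an illustration of how this kind of institutional pricing tends to work in practice (the FTE tiers, per-FTE rates and consortium discount below are invented placeholders, not anything Google or the Book Rights Registry has disclosed), a sketch in Python:

```python
# Hypothetical sketch only: the tiers, per-FTE rates and discount are invented
# placeholders, not actual Google / Book Rights Registry terms.

def estimate_annual_fee(fte: int, base_rate: float = 1.50,
                        consortium_discount: float = 0.0) -> float:
    """Rough annual subscription estimate driven by FTE enrollment.

    Larger institutions get a lower per-FTE rate (tiered pricing);
    consortium members may get a further percentage discount.
    """
    if fte <= 1_000:
        per_fte = base_rate          # small colleges pay the full rate
    elif fte <= 10_000:
        per_fte = base_rate * 0.75   # mid-sized institutions
    else:
        per_fte = base_rate * 0.50   # large universities and public systems
    return round(fte * per_fte * (1 - consortium_discount), 2)

# A small West Texas agricultural college vs. a large state university:
print(estimate_annual_fee(900))                               # -> 1350.0
print(estimate_annual_fee(40_000, consortium_discount=0.15))  # -> 25500.0
```

The same shape works with population served instead of enrollment for public libraries; the open question is where Google and the Registry would set the bands.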
Opposition to this agreement is building in advance of the early May decision and, while I personally support adoption of the agreement, I am troubled that the scanning of this material is now treated as a fait accompli and has thus become the starting point for establishing agreement. Resolution should have addressed the core issues of fair use and copyright, but it has not, and because those issues have not been addressed the settlement leaves Google with a certain (some may say excessive) market power and leaves non-participants to this agreement open to possible future copyright violation.
As has been pointed out (and openly supported by Google), orphan works legislation is not precluded or superseded by the agreement between the AG, AAP and Google. What strikes me as odd, however, is the lack of attention any member of Congress has paid to this particular issue. To my knowledge, no Congressional representative has come out either in support of the Google settlement or become newly interested in orphan works legislation. Given the intensity of the attention paid to this issue in the publishing and library communities, it would seem that if Congress is still not interested in addressing orphan works legislation, it never will be. That's the situation we effectively had before the parties agreed to the settlement (and for many years before that). I hope Congress does take up orphan works legislation, but in the meantime I also hope a lot of students and researchers make extensive use of this vast supply of content.
Saturday, April 18, 2009
MediaWeek (Vol 2, No 15): Kindle, McClatchy, Dawson, Springer
Our sources on Kindle sales have proven extremely reliable in the past. Last August we nailed the number of Kindle 1 devices sold at that time. And we first broke the news of the Kindle 2 and the new large-screen Kindle. Mahaney’s most recent estimates are for more than 1 million Kindle 2s to sell this year, and the current sales numbers absolutely support that, based on how the economy goes. Amazon could easily increase production to satisfy a surge in demand.
In actuality there is no reliability to these sources because nothing has been said publicly. Also, if these are the numbers now, what will the numbers look like when (if) eBooks really take off and constitute more than 5% of revenue?
Publishing Technology (née Vista & Ingenta) has announced trials with several publishers that enable e-book sales directly off publishers' own websites (Telegraph):
Publishing Technology is creating a new service which will allow publishers to sell their 'e-books' directly through their own websites. The company has already been running trials with Random House, HarperCollins and Penguin in a move which could cut out the need for the big internet intermediaries.
Chris Anderson's Booktour, a company which identifies when and where authors have events, has received $350K in seed money.
Dawson Holdings, which distributes into the newsagent marketplace in the UK, continues to lose clients, and it's becoming clear they will have to exit this market (Telegraph):
Dawson Holdings, which distributes magazine and newspaper titles to newsagents, retailers and airlines, said in a trading update today that it expected Telegraph Media Group to "terminate" its deal in the autumn. Last month Associated Newspapers, publisher of the Daily Mail, and Comag, the joint venture between Condé Nast and the National Magazine Company, announced they would not be renewing contracts with Dawson when they expire in 2010. The two deals accounted for £139m in revenue for Dawson last year.
Dawson also has a library supply business that to date appears unaffected.
Qualification from Springer & Candover: the investors are looking for a $500mm partial sale, not a sale of the business. Take that with a grain of salt when viewed in the context of the fact that their entire portfolio is under review. (Hedgeweek)
McClatchy newspapers reported that they expect to make $225mm in digital revenue this year, which translates to 15% of total revenues (Reuters):
At McClatchy, 15% of our advertising revenue today comes from online. McClatchy, a company founded before the advent of electric lights, will generate nearly $200 million in digital revenue this year at a higher profit margin than our print business.
What significance is this?
- Fifteen percent is above the average newspaper publisher’s take from digital
- $200 million would be almost enough to run The New York Times’s newsroom operations for a year. Not bad.
Higher profit margins than print? We know Gary is a big fan of using pop music to highlight his industry presentations, and that he likes the Rolling Stones in particular. Maybe “Time Is on My Side” would be a decent choice for those kinds of numbers.
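(A rough back-of-the-envelope check, my arithmetic rather than McClatchy's: if nearly $200 million is 15% of advertising revenue, total advertising revenue works out to roughly $200M / 0.15, or about $1.3 billion - which gives a sense of the print base those digital dollars are being measured against.)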
Thursday, April 16, 2009
CCC Holds Online Google Settlement Seminar: Recording
Google Windfall
Here is a sample of Mike's post:
We believe it is unfortunate that the attention has been focused there because there are some very real commercial questions that we think need answers to fully appreciate the practical implications of the settlement. We’ve been doing our best to build a model of what revenue will be and where it will go. Trying to do that makes it very clear how much important detail has been omitted from the debate we’ve heard so far (and we’ve both heard a lot of it). Here’s a starter list of questions that need answers to forecast this business, which we hope people more familiar with the terms of the settlement than we are might be able to answer for us.