Sunday, September 27, 2009

Media Week 38: Carver, Google Scholar, Espresso Books, Reader's Digest, BusinessWeek

Per usual, most of the following were on the twitter (@personanondata).

Long feature article on Raymond Carver and his long-time editor Gordon Lish in the Observer.

The pair had worked together for years – Lish, a dashing, influential literary figure once known as Captain Fiction, had published Carver's first stories in Esquire magazine. (They had met in Palo Alto, when Carver was, as his wife later put it, a "practising alcoholic" working at a textbook publisher's.) Lish later became an editor at Knopf and championed many other writers whose styles were unlike Carver's – Don DeLillo, for instance, and Richard Ford. He went on to give writing workshops at which he managed, by all accounts, to be gnomic, crushing and inspiring in relatively equal measure. Lish's own fiction – he wrote stories and novels – is compact, antic and self-reflexive, with titles such as Wouldn't A Title Just Make It Worse?.

Carver was about as far from this world – both in content and style – as it was possible to be. His characters worked in diners and motels; they had amputated limbs and their families had left them, with or without furniture; their working lives, their cropped, half-understood thoughts had not been seen in fiction. Lish had edited Carver's first collection, Will You Please Be Quiet, Please? and together they had composed a taut new voice full of left-field desire and hopeless dread. As Carver put it in the letter of 8 July: "You've given me some degree of immortality already."

Ex-Reed Elsevier CEO Sir Crispin Davis is now the favorite to become ITV Chairman (Guardian):

"The committee has therefore concluded that it would not be in the best interests of the company to appoint Mr Ball as ITV's chief executive," it added.

ITV insiders maintain that Ball expressed an unwillingness to work with the committee's leading candidate to replace Grade, the former Reed Elsevier boss Sir Crispin Davis, and expressed doubts about another potential candidate, the former Channel 4 chairman and founder of BMI, Sir Michael Bishop. After meeting Ball, Crosby is understood to have got the impression that Ball wanted a mere figurehead as chairman.

"The board was close to appointing Ball and told him about some of the chairman candidates and he told them he did not like any of them," said a source involved in the talks. "The board just felt like it could no longer go on dealing with this man."


PD James is interviewed by The Telegraph:

She has a crack at explaining the genre’s appeal in Talking about Detective Fiction, an idiosyncratic and entertaining primer written at the suggestion of the Bodleian Library, which is publishing the book and to which James is donating hardback royalties. It is not a comprehensive history – she does not read much contemporary crime fiction apart from books by Ian Rankin and her old friend Ruth Rendell – but an imaginative response to some of her favourite authors.

The 89-year-old Lady James is trying to recall what first drew the teenage Phyllis, along with millions of other readers in the Thirties, to the so-called Golden Age detective stories.

“Those books suggested we live in a moral, comprehensible universe, at a time when there was a great deal of disruption and violence at home and abroad, and of course the ever-present risk of war. And we live in times of unrest now, so perhaps we may soon enter another Golden Age.”

Peter Jacso, writing in Library Journal, takes a long critical look at Google Scholar (LJ):
Google’s algorithms create phantom authors for millions of papers. They derive false names from options listed on the search menu, such as P Login (for Please Login).

Very often, the real authors are relegated to ghost authors deprived of their authorship along with publication and citation counts. In the scholarly world, this is critical, as the mantra “publish or perish” is changing to “publish, get cited or perish.”

Compounding the problem, the inflated publication and citation counts produced by GS will embarrass those who take the reported numbers at face value, as they discover that many of the publications, randomly scattered in the detailed result lists, are just variant formats of the same paper, and the citations are mismatched.

While GS developers have fixed some of the most egregious problems that I reported in several reviews, columns and conference/workshop presentations since 2004—such as the 910,000 papers attributed to an author named “Password”—other large-scale nonsense remains and new absurdities are produced every day.

On Demand Books' Espresso Book Machine continues its glacial expansion with the addition of the Harvard Book Store (UWire):

The Espresso Book Machine—produced by New York-based firm On Demand Books—has been rolled out to a select few stores to date, but the one at Harvard Book Store will be the first with access to the 2 million public-domain texts digitized by Google, which also announced a deal with On Demand last Thursday.

After the unveiling on Sept. 29, Harvard Book Store customers will be able to order a printed copy of Google’s titles or On Demand’s 1.6 million works—all in the public domain because they were copyrighted before 1923.

Store Marketing Manager Heather Gain said owner Jeffrey Mayersohn ’73 bought the machine in pursuit of a broader vision for the store—which he took over from long-time owner Frank Kramer last October.

“He would like to provide customers with every book ever written,” Gain said.

The Espresso Book Machine will be able to print a 300-page paperback book in four minutes, according to Gain, who added that printed books will be competitively priced and indistinguishable from those sitting on the shelves.

Customers will be able to request a book to be printed online or in the store, after which they can either pick it up in-store within minutes or have the book delivered by bicycle either the same or next day. Books can also be shipped to domestic or overseas locations.

See also The NYTimes.

JISC (UK) has undertaken a market study of the impact of eBooks in higher education, and its initial report indicates some startling and counterintuitive results (JISC):
The current estimate of revenue generated from publishers selling textbooks direct to students in the UK is £200 million. Publishers are therefore extremely cautious about making e-textbooks available, free at the point of use, through the university library, in case it cannibalises print sales. During the course of the project, the impact on the sales of the print equivalents of the 36 e-books licensed and made freely available to all UK higher education institutions has been monitored. The data we have suggests that the availability of the e-versions has no impact on the print sales and that, certainly at the moment, e-textbooks are a back-up to the print and will co-exist. JISC Collections is encouraging publishers to think of e-textbooks not as a threat, but as a new and different market.
Reader's Digest announced it is consolidating its international web presence onto one digital platform (RD):

The Reader's Digest Association is launching a major new global initiative to bring its flagship iconic brand into the international digital arena, it was announced today by Eva Dillon, President, Reader's Digest Community. The company is rolling out a new Global Web Platform in over 40 international markets, including China, to debut live this week. The launch is a key part of a wider digital monetization strategy that will see Reader's Digest leverage its branded content on a variety of platforms.

Dillon said, "As one of the world's largest producers of original content, Reader's Digest continues its transformation in creating a global brand experience online. This new platform allows each of our international markets to focus on driving digital revenue via advertising sales and e-commerce, and creates a compelling online experience for new and existing customers."

Content re-packaging is a key component of the new Global Web Platform and the company is looking to leverage its existing material, as well as developing Web-exclusive content going forward.

Pressure builds on the commercial activities of the BBC, particularly with respect to the company's purchase of Lonely Planet (Bookseller):

A report published on Wednesday (23rd September) by the Commons' Culture, Media and Sport Committee branded BBC Worldwide's purchase of Lonely Planet "the most egregious example" of the company's expansion beyond its existing remit. The committee added that if the Trust had been "a more responsible oversight body" more thought would have been given to the impact of the purchase on the sector as a whole.

The acquisition was originally resisted by rival publishers, who called for a review by the Office of Fair Trading at the time. Time Out guides m.d. Peter Fiennes said that this report had really "upped the ante". He added: "It's really significant that they have singled out the Lonely Planet acquisition. Before it was just one of a number of things . . . they've said it's quite clearly wrong." Fiennes said that there is now "so much more pressure" on the Trust to do something about the acquisition.

Adam Hodgkin offers some suggestions for Bloomberg as it thinks over the acquisition of BusinessWeek, and he concludes (EE):

Many of these recommendations amount to saying "Make Business Week more like The Economist". One can be sure that The Economist does feature in a competitive analysis of what has gone wrong with BW, but The Economist also has not yet figured out how to deliver a solid audience of digital subscribers. BW will have some advantages in getting this right first. This sale is a break with the past. So much has not been working out well for BW in its digital initiatives that it is time that some sacred cows were sacrificed and some simple steps taken. Building digital subscriptions is the obvious path that needs to be developed.
Great to see Little Dorrit do so well at the Emmys (Guardian):

Little Dorrit, starring Matthew Macfadyen and Sir Tom Courtenay, was named best mini-series and won a brace of awards for writing, directing, art direction and costumes, chalking up more prizes than any other programme. But many of the most prestigious Emmys went to familiar US favourites. The sitcom 30 Rock, starring Alec Baldwin as an egotistical television executive, was named top comedy for the third consecutive year while Mad Men, a critically acclaimed depiction of politically incorrect 1960s advertising executives, won best drama for the second year in a row.

Friday, September 25, 2009

Document Cloud: Back-up in the Cloud

In a twist on the idea of massive data sets - the idea that the data supporting research is as valuable as the conclusions and should be made available as part of research findings - a new start-up is applying the concept to news gathering. DocumentCloud already has the participation of numerous large media companies, which will be able to 'deposit' the background material and primary research supporting their investigative and news-gathering work. Funded initially by grants, DocumentCloud expects to create a revenue model over the next two years as it expands. From their website:
DocumentCloud will be software, a Web site, and a set of open standards that will make original source documents easy to find, share, read and collaborate on, anywhere on the Web. Users will be able to search for documents by date, topic, person, location, etc. and will be able to do "document dives," collaboratively examining large sets of documents. Organizations will be able to do all this while keeping the documents--and readers--on their own sites. Think of it as a card catalog for primary source documents.
Enabling access to this material will, the company says, make it easier for researchers, journalists or bloggers to research, investigate and report on a wide range of material. And you may be saying to yourself, 'I can do this at Scribd, so what's different?' The answer is that you can deposit your documents in Scribd and other similar sites and also list them with DocumentCloud; however, DocumentCloud will be scrupulous about which submissions it allows into its listings. From their site:

How will you guarantee authenticity? How will you fight copyright infringement? How will you keep the collection free from spam and inappropriate material?

It will be of the utmost importance to us that the collection remain of the highest integrity, so we're planning to limit the right to list documents, at least initially, to individuals and organizations involved in original reporting. Contributors will agree to a set of guidelines, and will have to vouch for the authenticity of the documents they upload.
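The "card catalog" model DocumentCloud describes - documents stay on the contributor's own site, while a central index holds the metadata and enforces who may list - can be sketched in a few lines. This is purely illustrative; every class and method name here is hypothetical and is not DocumentCloud's actual API.

```python
# A minimal sketch of a card-catalog-style document index: the catalog stores
# only metadata (date, topics, people, locations) plus a pointer back to the
# hosting site, and restricts listing to vetted contributors.
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    title: str
    url: str                 # the document itself stays on the contributor's site
    date: str
    topics: list = field(default_factory=list)
    people: list = field(default_factory=list)
    locations: list = field(default_factory=list)

class CardCatalog:
    def __init__(self):
        self.entries = []

    def list_document(self, entry, contributor_vetted=False):
        # Per the stated policy, only contributors involved in original
        # reporting (vetted ones) may list documents.
        if not contributor_vetted:
            raise PermissionError("listing limited to vetted contributors")
        self.entries.append(entry)

    def search(self, topic=None, person=None, location=None):
        # Faceted search: filter the catalog by any combination of fields.
        results = self.entries
        if topic:
            results = [e for e in results if topic in e.topics]
        if person:
            results = [e for e in results if person in e.people]
        if location:
            results = [e for e in results if location in e.locations]
        return results

catalog = CardCatalog()
catalog.list_document(
    CatalogEntry("City budget FY2009", "https://example.org/budget.pdf",
                 "2009-09-01", topics=["budget"], locations=["Springfield"]),
    contributor_vetted=True,
)
print(len(catalog.search(topic="budget")))
```

A "document dive" in this sketch would simply be many users running `search` over the same large set of entries, with the documents themselves served from each organization's own site.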

More information on their website.

Seth Godin: Rethinking the Publishing Industry

The following was written by Eugene Schwartz who sat next to me at yesterday's meet-up and took better notes than I.

Publishers need to develop their own “tribal” networks to reach the readers to whom they will be selling books in the new marketing environment. It is a concept that applies to authors, agents and anyone in the business who wants to create a market for their work efficiently. The old way of waiting for the publisher to promote the work is becoming ineffectual.

This was Seth Godin’s message at a Brown Bag lunch sponsored by the Digital Publishing Group (founded by Susan Danziger of DailyLit) and held at the Random House building in New York. Godin is the author of ten best-selling books including Permission Marketing and Purple Cow, the founder of the interest-community “lens” builder Squidoo, and a former vice president of direct marketing at Yahoo.

Preaching revolution in the master’s den, so to speak, Godin advised the more than 150 largely mainstream publishing-house staffers that if they want to advance into the future and their employers don’t see the light, they should put in the sweat labor in their off hours to demonstrate to their employers the efficacy of building social-network followings centered around themes and/or authors. And if this doesn’t do the trick, there would be something to be said for leaving and finding – or starting – another venture that understands where the future lies.

Publishers need to recognize that many of the production, marketing and distribution skill sets which authors relied on them for in the past are easily available to the authors themselves as well as to startup publishers by other means. The publisher’s value proposition needs to be reinvented in that light. Godin made the comparison to the music industry, “Music hasn’t gone away, but the old music industry has.”

According to Godin there is a five-year window of opportunity for the industry to reshape itself to the new realities: readers can find advance information about any book online before they buy it, and they will respond to free previews – or even free whole books – by buying more printed works or eBooks. It is in the next five years, he believes, that tribal franchises will be defined and won.

You build this tribe and the right to promote new books to them by gaining their permission through the prior interest you have generated by generous access to content and experiences that draw them to your site and your mailing list.

“If you have people’s attention, you can make money,” Godin declared. You start promoting your new book well before it is written by using the internet through blogging, hosting content-driven sites, and social networking to accumulate a tribal following. When to start promoting? “Five years ahead of time,” he suggested, underscoring the point.

The major error being made by established publishers (and agents and authors, I would add) using conventional business models, Godin says, is to see new technology and the internet as a way to make old business models work better, instead of as an opportunity to destroy (no sentimentality here) and reinvent the old. Strong medicine, imho – but true. Hard to conceive of at a meeting on the 44th floor of the Random House building – although we can take comfort that at least the building will survive in its present form.

-- Eugene G. Schwartz

Gene Schwartz is currently launching a new web service for authors named WorthyShorts.

Thursday, September 24, 2009

SharedBook Announces Several New Initiatives

Many readers will know I have been a fan of the SharedBook model for several years now and enjoy keeping track of their new initiatives. Here is a recent update on several new announcements from the company:
Today, Congressman John Culberson (R-Texas) placed a link on his site that allows his constituents to read and comment on the House Healthcare Bill, using SharedBook’s annotation platform. In the Congressman’s words, “this new website will give you, my constituents in the Seventh District, a choice in the health care debate. You now have an opportunity to read and comment on the bill. I look forward to reading your comments and restoring public trust in the government by raising the level of openness, order and discourse.” We applaud the Congressman for taking Transparency to a new level by allowing his constituents to give him feedback on healthcare reform in a very granular and detailed way.

Meanwhile, Google has informally announced on its blog that it now has an affiliation with SharedBook’s Blog2Print product, to enable users of the Blogger platform to easily turn their blog into a physical book or PDF download. Since Blog2Print was introduced in July of ‘07, tens of thousands of bloggers have created a permanent record of their posts and photos.

On September 14th, Hachette, through its Twelve Books imprint, with authors Po Bronson and Ashley Merryman, launched three chapters of their best seller “NurtureShock” into the SharedBook annotation platform, to allow readers to discuss their controversial findings on child rearing. The New York Times and others covered this experiment in social media as applied to published works and we are excited to watch the discussion progress.

And Woman’s Day is using SharedBook’s Smart Button to allow consumers of its web content to create their own cookbooks from Woman’s Day recipes. With a couple of clicks, users can compile a cookbook of their choice, add more personal content if they so desire, and save it to their hard drive or create a hard- or soft-bound copy.

Finally, we’ve added another 50 titles from assorted publishers to the site and are looking forward to the Holiday season, when we are told that some major magazines and media outlets will choose these personalized books as a featured Holiday gift item. A home page redesign will be launched to usher in the 4th quarter.
More posts on SharedBook.

Wednesday, September 23, 2009

Is All of Springer Science+Business Media Now in Play?

Bloomberg is reporting that bids for the 'up to' 49% stake in Springer have been disappointing, and the PE owners may be considering a sale of the entire business or of a majority stake. Earlier this week bids from a short list of PE firms were noted in the press, but these appear to be lower than expected. From the report (Bloomberg):

The Berlin-based publisher may draw better offers with the sale of the whole company or even majority control, the people said. Springer Science announced in April it planned to sell as much as 49 percent of the company to lower debt and fund acquisitions. Offers haven’t met the company’s targets, the people said.

The initial stake sale was intended to raise as much as 500 million euros ($740 million), Eric Merkel-Sobotta, a Springer Science spokesman, had said.

EQT Partners AB, a Swedish private-equity firm partly owned by the Wallenberg family, TPG Inc., Apax Partners LLP and a combination of Providence Equity Partners Ltd. and the Carlyle Group were among the bidders for that stake. The bidders learned Sept. 18 that EQT was the frontrunner, the people said. TPG Inc., Apax and the combination of Providence Equity and Carlyle remain in the running.

Tuesday, September 22, 2009

Justice Prevails: The Deal is Done.

Last Friday, the Justice Department (DoJ) effectively ended the debate on the Google Book Settlement (GBS). In an exceptionally well-thought-out, rational and practical submission, the DoJ established for everyone the parameters of the argument and the terms under which the GBS should be approved by the NY court. Google, AG and AAP don't have to agree to all of the suggested changes; there are degrees and negotiated offsets that will give the plaintiffs some flexibility in authoring the final, revised document. But what Google, AG and AAP will do is exhale a sigh of relief, incorporate many of the suggestions noted by the DoJ and, as a consequence, expect the court to approve the revised agreement. It is unlikely Judge Chin will preside over the final decision -- there simply isn't time before he (presumably) begins the approval process for his elevation to the Appeals Court. It's possible he will approve the agreement as is, with some oversight using specific guidelines to address imposed changes, but that would be unlikely given the 'importance' of this agreement to copyright law. More likely, this case will be passed on to a second judge. As a result, it could be another six months before the final revised version is approved by the court.

Encouragingly, the DoJ was balanced in its opinion, specifically noting the wide public interest that this content database will support. It is this argument that - in part - balanced some of its important concerns. Opponents took particular heart in the DoJ's emphatic statement that the agreement should not be approved in its present form. It is important to remember that the DoJ and Judge Chin's court represent two separate arms of government, and Chin is not obliged to accept the DoJ's statement out of hand. The DoJ's statement is wider in scope than the law upon which Judge Chin is to adjudicate: specifically, the potential impacts on competition that may result from this agreement. But in the context of a well-balanced, nuanced and implementable statement, the judge can use the DoJ statement to 'encourage' the plaintiffs to address some expansive concerns that, in the true interpretation of his remit, would otherwise be outside his immediate concern.

It is the opposition to this agreement that is, ironically, left out in the cold. Congratulations are in order for the success of the intense opposition to the agreement (or parts of it); however, the DoJ statement has established the future parameters of the arguments against it. With the DoJ imprimatur now ranking the arguments, it would seem unlikely that any new argument, or variation on an old one, will gain support: in its way, the DoJ has validated all the legitimate arguments, and anything outside of those will not now have 'legitimacy'. The plaintiffs benefit also because they no longer have a moving target, nor a need to 'read the tea leaves' from the Chin court: the requirements are now specific and actionable.

All this tells me that -- unless AG, AAP and Google want to snatch defeat from the jaws of victory and ignore the DoJ statement (and they have already returned to the negotiating table, so this is unlikely) -- this agreement will be approved with many of the changes the DoJ has specified. Justice prevails for both sides of this argument.

Monday, September 21, 2009

Conference on Google Book Settlement

NYU Law School is hosting a conference on the Google Book Settlement in two weeks. I am on a panel of publishing industry people and will be discussing my estimate of the Orphans population. Here are details (D is for Digitize):

Everything about the Google Book Search project is larger than life, from Google's audacious plan to digitize every book ever published to the gigantic class action settlement now awaiting court approval. The groundbreaking proposed settlement in the Google Book Search case is so complex that controversy has outpaced conversation and questions have outnumbered answers.

We aim to help close these gaps.

D Is For Digitize will give this complex lawsuit the sustained attention it deserves. An interdisciplinary lineup of academics and practitioners will examine the settlement through the lenses of copyright, civil procedure, antitrust, information policy, literary culture, and the publishing industry.

The conference is timed to coincide with the rescheduled fairness hearing in the Google Book Search case, which will be held on Wednesday, October 7 in New York City, just five blocks from the Law School. The days immediately after the hearing (October 8th - October 10th) will provide a forum for addressing the numerous issues that have emerged and are most relevant to society at large.

The conference schedule and speaker list have been posted and will continue to be updated. For more information about the settlement, visit our site to study and discuss the proposed Google Book Search settlement. There you can browse and annotate the proposed settlement, section by section.


Sunday, September 20, 2009

Media Week 37: Google, Newspapers, Open Access, Australia, Kindle

On Friday the Justice Department submitted a "statement of interest" to the NY District Court - Southern District, which under Judge Denny Chin is adjudicating the Google Book Settlement agreement. ResourceShelf may well have the most comprehensive list of commentary on the Justice opinion, but I found Danny Sullivan's reading of the opinion to be most useful. In short, Justice said there are areas of specific concern and that it wants Judge Chin to instruct the parties to revisit these issues and amend the agreement. In some cases, Justice made specific suggestions as to the changes it sought; in other cases it raised the issue of concern and effectively left it to the parties to resolve. From Danny's post:

Finally, the Department Of Justice had two additional thoughts on the settlement.

First, that there be full access to those visually impaired:

In the Proposed Settlement, Google has committed to providing accessible formats and comparable user experience to individuals with print disabilities – and if these goals are not realized within five years of the agreement, Google will be required to locate an alternative provider who can accomplish these accommodations. Along with many in the disability community, the United States strongly supports such provisions.

Second, that the data be “open” for use in a variety of ways:

Second, given the nature of the digital library the Proposed Settlement seeks to create, the United States believes that, if the settlement is ultimately approved, data provided should be available in multiple, standard, open formats supported by a wide variety of different applications, devices, and screens. Once these books are digitized, the format in which they are made available should not be a bottleneck for innovation.

And the conclusion:

This Court should reject the Proposed Settlement in its current form and encourage the parties to continue negotiations to modify it so as to comply with Rule 23 and the copyright and antitrust laws.

Lastly, there is some speculation that Judge Chin will indeed defer his decision because of the manner in which he is handling the significant number of submissions to the court. Since Judge Chin is to be promoted, the speculation is that he wants to leave as clean a case as possible for the next presiding judge, which means he is keeping his trap shut.

The Obama administration (FCC) is expected to make a potentially far reaching statement on net neutrality next week (NYTimes):

In 2005, the commission adopted four broad principles relating to the idea of network neutrality as part of a move to deregulate the Internet services provided by telephone companies. Those principles declared that consumers had the right to use the content, applications, services and devices of their choice using the Internet. They also promoted competition between Internet providers.

In a speech Monday at the Brookings Institution, Mr. Genachowski is expected to outline a proposal to add a fifth principle that will prevent Internet providers from discriminating against certain services or applications. Consumer advocates are concerned that Internet providers might ban or degrade services that compete with their own offerings, like television shows delivered over the Web.

He is also expected to propose that the rules explicitly apply to any Internet service, even if delivered over wireless networks — something that has been unclear until now.
Five major American universities commit to support open access journals (Press Release):
Cornell, Dartmouth, Harvard, the Massachusetts Institute of Technology, and the University of California at Berkeley today announced their joint commitment to a compact for open-access publication.

Open-access scholarly journals have arisen as an alternative to traditional publications that are founded on subscription and/or licensing fees. Open-access journals make their articles available freely to anyone, while providing the same services common to all scholarly journals, such as management of the peer-review process, filtering, production, and distribution.

According to Thomas C. Leonard, University Librarian at UC/Berkeley, "Publishers and researchers know that it has never been easier to share the best work they produce with the world. But they also know that their traditional business model is creating new walls around discoveries. Universities can really help take down these walls and the open-access compact is a highly significant tool for the job."

The economic downturn underscores the significance of open-access publications. With library resources strained by budget cuts, subscription and licensing fees for journals have come under increasing scrutiny, and alternative means for providing access to vital intellectual content are identified. Open-access journals provide a natural alternative.

Google launched a new newsreader named 'Fast Flip' and Adam Hodgkin added his thoughts (Exact Ed):
The newspaper and the magazine as a digital experience have to offer sufficient value that a reader is prepared to become a subscriber to the magazine or newspaper. Subscription services -- especially of the digital edition and its archive -- will generate much more for most publishers than a Fast Flip of streamed content, which will catch a trickle of AdSense revenues. Of course, there are changes and more will be needed. Search within a publication is very important. Internal navigation is very important. The possibility of citation and book-marking is essential. External navigation is especially important when it is relevant to the reading experience. But, Fast Flipping? That may be about as much use as Shuffling the news.
In the course of discussions about this new effort from Google, I was made aware of a prototype 'viewer' from the NYTimes which I like. (Prototype). (In true 'this is a demo' fashion it refuses to load at the moment).

A pissing match has developed in Australia over the economics behind the Productivity Commission's findings that lower prices would result if importation rules were lifted on the sale of books in Australia. Arguments in support of maintaining these rules have focused on the effective 'subsidization' of the local publishing community by publishers who benefit from the (partially) closed Australian book market (The Australian):

The commission's defence of its findings comes as federal cabinet prepares to make a decision on the issue, with Competition Policy Minister Craig Emerson supporting the commission recommendation, but many of his colleagues, including Industry Minister Kim Carr, Arts Minister Peter Garrett, Attorney-General Robert McClelland, Regional Development Minister Anthony Albanese and Immigration Minister Chris Evans strongly opposed to it.

Some government sources suggested a compromise could be considered, along the lines of the commission's draft report, which recommended the import restrictions apply for 12 months after the release of a book.

In The Atlantic, Kevin Maney addresses "The Kindle Problem": how long can the Kindle survive when it can't match the functionality of print ("the book disappears") and can't compete with the technical functionality of increasingly convergent devices:

All in all, the Kindle ended up caught in a no-man’s land: it has a number of nifty features and convenient aspects – but also significant drawbacks and a high price tag. All of which leaves many consumers unconvinced that they really need to buy the thing.

Meanwhile, competitors have spotted an opening and are taking the opportunity to try to elbow the Kindle aside. As of this year, Google has made 1 million public domain books available for free on the Sony Reader, which is priced at $100 less than the Kindle. By thus joining forces, Google and Sony just might out-convenience Kindle. And in September, Asus, maker of the bargain-basement Eee PC netbooks, said it, too, will make a super-cheap e-book reader.

What should Amazon do? Given the device’s inherent limitations, which make it impossible for the Kindle to ever outdo the appeal of the traditional book in every way, Amazon would probably do best to concentrate on the convenience angle. Bezos already has the right approach, with his goal of making every last book available to readers within 60 seconds. If he can achieve that goal, the Kindle will surely be the most convenient bookstore ever. But cost is a facet of convenience, too. And that suggests that the Kindle needs to dramatically drop in price.

Humor with several grains of salt on publishing and self-publishing from Joe Quirk at SFGate (SFGate):
Your publisher has no clothes: Exclusive print-on-demand publishers are knocking traditional books off the bestsellers list and paying authors three times as much money per sale. Robber baron publishing is toppling, and the only things propping it up are myths -- misconceptions that authors themselves cling to. Let's kick out each of these myths in short order, and watch the robber barons fall.

Big New York publishers will give me an advance! Your advance is a LOAN with your career as collateral. If borrowing money from your credit card at 8% interest to support your writing is a bad idea, then borrowing money from a big New York publisher against books you haven't sold yet is a catastrophic idea. Bankruptcy ends after 7 years. The Red Mark next to your name is forever.

Big New York publishers will get me publicity! If you can't pay for your own publicity, why would you let your publisher borrow profits from books you haven't sold yet to make spending decisions over which you have no control? It works for the robber barons to pay extravagantly for ten long-shots if one book pays off extravagantly. It is catastrophic for you to be one of the nine long-shots they waste money on.
In Sports, another humiliation for the England cricket team (ABC)

Saturday, September 19, 2009

Boeing Boeing Gone

My family did a lot of traveling when I was young. We moved clockwise around the world starting in 1968, when we moved to Thailand, and with the grandparents back in the UK we traveled home every few years. Dad worked for a company owned by Pan Am, which made travel free, and since our travel experiences coincided with the launch of the 747 we spent many hours flying in these fantastic aircraft.

I've been scanning some old photos and came across this one of the Pan Am 747 Clipper Intrepid (N749PA). The photo was taken in Honolulu in December 1980, and out of curiosity I tried to track the aircraft down. Since you can find anything on the internet, it turns out the aircraft was renamed Clipper Dashing Waves and was finally taken out of service in 1991. Sadly, its last resting place was far less glamorous than Honolulu. (Photo)

Wednesday, September 16, 2009

ISBN Webinar: Slide Presentation

A number of people I emailed about the Webinar asked whether the slides from Mark Bide's presentation would be available, and sure enough, here they are.

Monday, September 14, 2009

ISBN Webinar

Don't forget to sign up for the free BISG seminar on the future of ISBN and identifiers with Mark Bide as your host.

Mark Bide of Editeur is hosting a BISG Webcast on the future of ISBN (BISG):

The book industry has had the ISBN for nearly 40 years; there has been little cause for excitement. Now, suddenly the whole subject of "identifiers" has become a hot topic, particularly when it comes to digital books and other online resources. This BISG Webcast will explore why the book industry has standard identifiers, and consider the future of the ISBN (International Standard Book Number), as well as the role of newer identification standards like ISTC (International Standard Text Code) and ISNI (International Standard Name Identifier). What do you need to know to make informed decisions about how -- and whether -- to use them? Register today to find out.
Register here: It is even FREE!

Read my post The ISBN is Dead.

Sunday, September 13, 2009

Media Week 36: Lexis, Google, Copyright, Peer Review, Blackboard

Also on the Twitter highlight reel: Link

Information World Review looks at how legal publishers are incorporating workflow solutions into their products and documents the path Lexis has taken from a collection of public records to a suite of content and services. Their conclusion (IWR):

Meanwhile, Brewer fears that in the medium term the difference in currency and quality between free and paid-for will inevitably narrow. “The paid-for sector will increasingly need to focus on ensuring the benefits of paying for information are not only in the quality of the information, but also in the additional value that can be created by providing it in a form that best suits how the subscriber works, and what they want to do with the information when they receive it. The consumer of legal information can’t really lose: free information is another source to turn to on occasion and its availability will ensure that information providers continue to raise their game.

“We have to deliver our content across different media in order to best serve the needs of professionals, be it in print, online, CD-ROM, RSS feeds, email, etc. This means we need a best-of-breed publishing system. Having an XML-based publishing repository means that most content is now held in a media-neutral format for delivery across multiple media, making it as versatile as possible to suit the digital needs of lawyers and accounting professionals.”

Marybeth Peters, the Register of Copyrights, in prepared testimony before the Committee on the Judiciary makes some strong statements regarding the Google Book Settlement. The statement, coupled with her comments at Columbia University last year, where she noted Congress had shown only limited interest in orphan works legislation, seemed to me an admonishment to her bosses that they should pay more attention. By some accounts there was a general shrug of the shoulders from the Committee.

Here is a sample:
In the view of the Copyright Office, the settlement proposed by the parties would encroach on responsibility for copyright policy that traditionally has been the domain of Congress. The settlement is not merely a compromise of existing claims, or an agreement to compensate past copying and snippet display. Rather, it could affect the exclusive rights of millions of copyright owners, in the United States and abroad, with respect to their abilities to control new products and new markets, for years and years to come. We are greatly concerned by the parties’ end run around legislative process and prerogatives, and we submit that this Committee should be equally concerned.
She summarizes:
It is our view that the proposed settlement inappropriately creates something similar to a compulsory license for works, unfairly alters the property interests of millions of rights holders of out-of-print works without any Congressional oversight, and has the capacity to create diplomatic stress for the United States. As always, we stand ready to assist you as the Committee considers the issues that are the subject of this hearing.
Copyright legal expert William Patry has a new book (Moral Panics and the Copyright Wars) and is interviewed by Publishers Weekly. (PW):

PW: J.D. Salinger's lawyers are attempting to stop a novel they claim is “an unauthorized sequel” to “The Catcher in the Rye,” and I can't help thinking that for most of his life Salinger never dreamed he'd be fighting this copyright fight in 2009, because the book was supposed to enter the public domain by now. Can you give us your perspective on copyright term extensions?

WP: That's a wonderful example of copyright gone awry. If you look historically at the terms of copyright, they used to be relatively short. From 1909 to December 31, 1977, you had a 28-year original term and a 28-year renewal term, with renewal being conditioned upon filling out an application. The book industry had a shockingly low rate of renewal, around 10%. Book publishers had staffs that were quite capable of filing renewals. It wasn't a burden for them to do, and it was cheap. Yet publishers didn't renew the vast majority of their copyrights. Why not? Because, economically, it didn't mean anything to them. Most books make their money in a very short period of time. It differs by industry, but for most books, 28 years is enough. With the last extension in 1998, copyright became totally unmoored to its purpose of providing incentive to create new works. The public got nothing from that, and no author in history has ever said, “Life plus 50 is just not enough. I will not create that work unless the copyright exists for my life and 70 years.” That's absurd.

Steve Jobs suggests that the Kindle will only be short-lived unless it offers more functionality than just reading content (NYTimes):

But in the interview, Mr. Jobs said general-purpose devices are more appealing than specialized devices like Amazon’s Kindle e-book reader.

“I think people just probably aren’t willing to pay for a dedicated device,” he said. “You notice Amazon never says how much they sell; usually if they sell a lot of something, you want to tell everybody.”
Someone forgot to mention the iPod is a standalone device.

The Times notes the results of a widely distributed research study on Peer Review (TimesOnline):
The first conclusion worth mentioning is that while few people think peer-review is perfect, the scientific community seems broadly content with it. Only 32 per cent of respondents thought that it was as good as it could be, but 69 per cent said they were satisfied. I don't think anybody should find this particularly surprising. Among the scientists I speak to, the general consensus on the process seems usually to be the Churchillian one. More eye-catching was the finding that 81 per cent think that peer-review should be capable of identifying plagiarism, and 79 per cent think it should catch fraud. I find this interesting because, with the best will in the world, it's hard to see peer-review as it stands reliably accomplishing either goal.

Blackboard is periodically noted as an acquisition target and here Inside Higher Ed gives us five reasons Microsoft will buy the company (IHEd):
Is Blackboard too small a company to take advantage of the opportunities they have created by rolling up the for-profit CMS space? Is Blackboard an outlier in a world of consolidation within the technology industry? Is Microsoft the right company to purchase Blackboard? Would this be a good or bad thing for higher education? What do you think the odds are that I'm correct that we will see a Microsoft purchase of Blackboard by the end of 2010?
I think you could come up with five good reasons why Google will buy Blackboard.

Remember when Steve McQueen, Jimmy Garner and Dickie Attenborough were digging holes under the barbed wire? Not so the Welsh imprisoned at Stalag IVb, near Mühlberg in Germany, between July 1943 and December 1944. They went in for publishing (BBC):
But some Welsh prisoners of war overcame adversity with a remarkable series of morale-boosting magazines about their homeland called Cymro (Welshman). They stole medicine to make ink, while their meagre rations were used to stick illustrations onto pages from school exercise books. It featured snippets of news from home taken from letters sent by loved ones, and was handwritten in English and Welsh from inside Stalag IVb, near Mühlberg in Germany, between July 1943 and December 1944. Now, as the 70th anniversary of the start of the war is commemorated, the National Library of Wales in Aberystwyth has published its collection of the magazines online.

In sport, England: an outstanding display to record an eighth successive win of a flawless qualifying campaign. What a relief. (Link) Then there's Wayne Rooney.

Thursday, September 10, 2009

Senator Al Franken draws map of USA

Once you get to Nebraska it gets easier. Asked to place the US on a world map I've seen some place it upside down: this is almost a party trick.

Wednesday, September 09, 2009

580,388 Orphan Works – Give or Take

Clearly one of the most contentious issues (if not the most contentious) regarding the Google Book Settlement (GBS) centers on the nebulous community of “orphans and orphan titles”. And yet, through the entirety of the discussion since the settlement agreement was announced, no one has attempted to define how many orphans there really are. Allow me: 580,388. How do I know? Well, I admit, I did my share of guesswork to get to this estimate, but I believe my analysis is based on key facts from which I have extrapolated a conclusion. Interestingly, I completed this analysis starting from two very different points, and the first results were separated by only 3,000 works (before I made some minor adjustments).

Before I delve into my analysis, it might be useful to make some observations about the current discussion on the number of orphans. First, when commentators discuss this issue, they refer to the ‘millions’ of orphan titles. This is both deliberate obfuscation and lazy reporting: most notably, the real issue is not titles but the number of works. My analysis attempts to identify the number of ‘works’; titles are a multiple of works. A work will often have multiple manifestations or derivations (paperback, library version, large print, etc.) and thus, while the statement that there may be ‘millions of orphan titles’ may be partially correct, it is entirely misleading when the true measure applicable to the GBS discussion is how many orphan works exist. It is the owner (or parent) of the work we want to find.

To many reporters and commentators, suggesting there are millions of orphans makes sense because of the sheer number of books scanned by Google but, again, this is laziness. Because Google has scanned 7-10 million titles, so the logic goes, there must be ‘millions of orphans’. However, as a 2005 report by OCLC (which I understand they are updating) noted, all types of disclaimers should be applied to this universe of titles, such as titles in foreign languages, titles distributed in the US and titles published in the UK, to name a few. Accounting for these disclaimers significantly reduces the population of titles at the core of this orphan discussion. These points were made in the 2005 OCLC report (although they were not looking specifically at orphans) when they looked at the overlap in title holdings among the first five Google libraries. (And if you like this stuff, this was pretty interesting.) Prognosticators unfamiliar with the industry may also believe there are millions and millions of published titles since, well, there are just lots and lots in their local B&N and town library.

The two methods I chose to try to estimate the population of orphans relied, firstly, on data from Bowker’s BooksinPrint and OCLC’s Worldcat databases and, secondly, on industry data published by Bowker since 1880 on title output. I accessed BooksinPrint via NYPL (Bowker cut off my sub) and Worldcat is free via the web. The Bowker title data has been published and referred to numerous times over the years and I found this data via Google Book Search; I also purchased an old copy of The Bowker Annual from Alibris.

In using these databases, my goal was to determine whether there are consistencies across the two databases that I could then apply to the Google title counts. In addition to the ‘raw data’ I extracted from the databases, OCLC (Dempsey) also noted some specific numbers of ‘books’ in their database (91mm), titles from the US (13mm) and non-corporate ‘Authors’ (4mm). Against the title counts from both sets of data, I attributed percentages which I then applied to the Google universe of titles (7mm). (My analysis also ‘limits’ these numbers to print books, excluding, for example, dissertations.)

In order to complete the analysis to determine a specific orphan population, I reduced my raw results based on best guess estimates for non-books in the count, public domain titles and titles where the copyright status is known. These final calculations result in a potential orphan population of 600,000 works. I also stress-tested this calculation by manipulating my percentages resulting in a possible universe of 1.6mm orphan works. This latter estimate is (in my view) illogical as I will show in my second analysis.

An important point should be made here. I am calculating the potential orphan population, not the number of orphans. These numbers represent a total before any effort is made to find the copyright holders. These efforts are already underway and will get easier once money collected by the Book Rights Registry begins to be distributed.

My second approach emanated from my desire to validate the first approach. If I could determine how many works had been published each year since 1924 then I could attribute percentages to this annual output based on my estimate of how likely it was that the copyright status would be in doubt. Simply put, my supposition was that the older the work, the more likely it was that it could be an orphan.

Bowker has consistently calculated the number of works published in the US since 1880 (give or take) and the methodology for these calculations remained consistent through the mid-1990s. According to their numbers, approximately 2mm works were published between 1920 and 2000. Unsurprisingly, a look at the distribution of these numbers confirms that the bulk of those works were published recently. If there were (only) 2mm works published since the 1920s, it is impossible to conclude there are millions of orphan works.

To complete this analysis, I aggressively estimated the percentage of works published each decade since 1920 which could be orphan works. The analysis suggests a total of 580K potential orphan works which, as a subset of the approximately 2mm works published in the US during this period, seems a reasonable estimate. This meets my objective of ‘validating’ the first approach (using the OCLC and BIP data): both approaches, using different methodologies, reach similar conclusions.
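The decade-based estimate above can be sketched in a few lines of Python. To be clear: the per-decade counts and orphan-likelihood percentages below are purely illustrative stand-ins, not the Bowker figures I actually used; they are chosen only to mirror the shape of the data (roughly 2mm works published 1920-2000, output accelerating late in the century, older works more likely to be orphans).

```python
# Illustrative sketch of the decade-based orphan estimate.
# All numbers are hypothetical stand-ins for the Bowker data.

works_by_decade = {
    1920: 80_000, 1930: 90_000, 1940: 95_000, 1950: 130_000,
    1960: 230_000, 1970: 330_000, 1980: 420_000, 1990: 625_000,
}

# Aggressive, decreasing orphan-likelihood rates: the older the
# work, the more likely its copyright status is in doubt.
orphan_rate_by_decade = {
    1920: 0.70, 1930: 0.65, 1940: 0.60, 1950: 0.50,
    1960: 0.40, 1970: 0.30, 1980: 0.20, 1990: 0.08,
}

potential_orphans = sum(
    count * orphan_rate_by_decade[decade]
    for decade, count in works_by_decade.items()
)

print(f"Total works published: {sum(works_by_decade.values()):,}")
print(f"Potential orphan works: {potential_orphans:,.0f}")
```

With these stand-in numbers the total lands in the mid-500Ks; tightening or loosening the percentages moves the result up or down, which is exactly the kind of stress test described above.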

There are several conclusions that can be drawn from this analysis. Firstly, since the universe of works is finite, beyond a certain point the Google scanning operation will begin to find ‘new’ orphans at a decreasing rate. I don’t know if this point comes at 5mm scanned titles or 12mm, but my estimate is 7mm because, according to Worldcat, there are 3mm authors to 12mm titles. If you apply this ratio to the Bowker estimate of the total number of works published, the number is around 7-8mm titles. Secondly, publishing output accelerated in the latter part of the 20th century, which means that, while my percentage estimates of the number of latter-day orphans were comparably lower than the percentages applied in the early part of the century, the base number of published titles is much higher and therefore the number of possible orphans is higher. Common sense dictates that it will be far easier to find the parents of these later ‘orphans’.

In the aggregate, the 600K potential orphans may still seem high against a “work” population of 2.2mm (25%). I disagree, given the distribution of the ‘orphan’ works (above paragraph) and because I have assumed no estimate of the BRR’s effort to find and identify the parents. In my view, true orphans will be a much lower number than 600,000, which leads me to my final point. Money collected on behalf of unidentified orphan owners will eventually be disbursed to cover costs of BRR or to other publishers. There has been some controversy on this point and it derives, again, from the idea that there are millions of orphans and thus the pool of undisbursed revenues will be huge. The true numbers don’t support this conclusion. There will not be a huge pool of royalty revenues to be ultimately disbursed to publishers who don’t ‘deserve’ this windfall because there won’t be very many true orphans. The other point here is that royalty revenues will be calculated on usage and, almost by definition, true orphan titles for the most part are not going to be popular titles and therefore will not generate significant revenues in comparison with all other titles.

This analysis is not definitive, it is directional. Until someone else can present an argument that examines the true numbers and works in more detail, I think this analysis is more useful to the Google Settlement discussion than referring by rote to the ‘millions of orphans’. The prevailing approach is lazy, misleading and inaccurate.

(Thanks to Mike Shatzkin who encouraged me to think about analysis and helped me conclude it. Grateful thanks to others who also helped review the post).

Monday, September 07, 2009

Dear Bank of America (and Ken Lewis): Here's my Problem

Mr. Kenneth D. Lewis
CEO & President
Bank of America Corporation
100 North Tryon Street
Charlotte, NC 28202

August 31, 2009


Ref: Making a Checking Deposit

Dear Mr. Lewis,

I wouldn’t say I love the Bank of America brand so much as grossly respect it for all its red, white and blue effrontery. That ‘in your face we’re bigger and better than all of you’ attitude is hard to resist, which is why I continue to use your bank despite a spate of blunders on your part. This last incident stole the show and, Kenneth, I know I shouldn’t use the word ‘steal’ with respect to any financial institution (let alone yours), but for the two weeks my cash sat in monetary purgatory I began to believe stealing it back was my only option.

We can both agree that technology is a powerful and seductive mistress. Imagine - if you can - how it might feel to become the victim of a crime so seductive that you hardly know it has occurred. (I bet a lot of American taxpayers feel that way today – am I right, Ken?). How was I to know that, at the moment your auto-mistress sucked my check from my hand (giving me just that little electric tingle of self-satisfaction that I was working on the veritable cusp of technology), that this act would set in motion a series of draconian events no one from your fine corporation could ever hope to explain?

Kenneth, what happens when a check is deposited and clears the payee’s account? Yes, it is made available to the depositor (not a trick question)! I am sorry if this is elemental for you, yet in my recent experience, a perfectly good check deposited via the same cash machine – identical in amount, payee and issuer to 15 previous checks deposited at virtually the same time each month for the past 15 months – was summarily rejected by Bank of America. Now Mr. Kenneth, I’m sure you are thinking (just like I was) ‘how could this happen at Bank of America to one of our long-time customers?’ I’ll get to that last bit in a minute. But I really hope you know the answer because no one else at your bank has a clue.

I guess (and why shouldn’t I? That’s what your staff do when they don’t know the solution), the real answer lies in your use of technology: Bank of America has become a programming experiment and, as a result, the staff is now as clueless as the customer in explaining how simple tasks - like depositing a check or transferring money from one account to another - can go inexplicably wrong. And, Kenneth, it’s not the staff’s fault - you have placed them in this intolerable situation. But maybe now I begin to understand your inspired and strategic leadership in closing down the retail operations. I mean, getting rid of staff that’s uninformed and lacking in effective training must be better than leaving them defenseless on the front line of customer service. By the way, have you spent any time in one of your local branches recently, Kenneth? Was it like bobbing rudderless in a sea of ineptitude?

Kenneth, for fifteen days a significant amount of money was neither in my account nor in the account of my employer. It was, however, in your account. Fifteen days, Kenneth! For a check no different than one deposited a month earlier and one deposited a month later (which cleared in the usual day or two). Why, Kenneth? Where’s the explanation? (And Ken, please note that I’m looking for an explanation that actually makes sense).

Which reminds me, Kenneth, that I did want to come back to what I thought was my long-standing relationship with your bank. Since my account has been open for over 20 years, I believe I have banked with Bank of America for over 20 years. Far be it from me to be so tactless to note the value of my deposits over those years but it is your business and it is a lot. Really – a lot, but that seems to be utterly meaningless to your staff: “Not with this bank” one of your staffers was quick to assert, only because I have been one of a multitude of accounts swallowed up by the bank that couldn’t say no (that would be yours). Truth be told, I guess I’m really a Nat West customer and I sometimes look back longingly on those days. Far be it from the government to talk Nat West into a bank merger, Kenneth! (Raising the issue of your deal making might not seem relevant but if acquired customers aren’t ‘real’ customers, then what are they?) And the shareholders, Kenneth, imagine how they feel when they realize you’ve prioritized deal making over their interests. What kind of executive management is it that folds in the face of such government cajoling? I’m sure Merrill Lynch will eventually come good for you, though.

So where does this leave us, Ken? I wish I could say I want to stay with Bank of America because you are the best around. That’s not the case. I’m stuck with you. Just this week, we realized that in ten years you’ve never reduced our overdraft interest rate (in spite of the fact that the prime lending rate has collapsed over that period), but you are still as inefficient as ever in crediting our account with deposits, which, of course, causes us to use the credit line. Slowness pays dividends (and bonuses too, I suppose. Am I right Kenneth?). Just last Sunday, your technology placed a hold on my cash card for some spurious and inconvenient reason. (Well, probably – who really knows?) Kenneth, you are getting worse not better.

I’m leery of your technology and despite your retail close-down I’m now looking for more human contact. In fact, for all my deposits, check cashing and payments I now go to the teller window. Sure, it’s less efficient and costs you more, but one of the nonsensical explanations for my problem(s) was that had I done it at the teller window, I wouldn’t have had a problem. Mr. Ken, I’ve tested this out and it seems to be the case!

So, along those lines, I am enclosing a check for $7.83 which I was hoping you could deposit for me. I’ve included a deposit slip.

I look forward to your apology.

Best regards,

Michael Cairns
A lifetime customer of Nat West.

NOTE AND UPDATE (SEPTEMBER 15th): I received a friendly call from Melanie in Mr. Lewis' office. She called to discuss my issue and to tell me that my letter had been forwarded to Mr. Lewis. She also indicated that my deposit check had been forwarded to the deposit by mail department for deposit.

Wednesday, September 02, 2009

I'll Be Back: With Free Textbooks

All educational publishers know the holy trinity of textbook publishing: California, Florida and Texas. Winning or losing any one of these three states in an adoption can tip the economic balance of a program. If California goes free, not only will the economics for education publishing companies radically shift, but Florida, Texas and many other states will likely follow California's lead in sourcing free educational content. Most immediately, California's migration toward the provision of free textbooks has been driven by the state's precarious financial situation, and there is an effective moratorium on new textbook purchases that is expected to last until 2014. While California's approach may seem drastic (or innovative, depending on your perspective), California is actually following a movement toward free textbooks that has been gaining steam over the past several years (GeorgiaTech). That said, California appears to be the first state to specifically identify free electronic texts that may be used in the classroom.

In May, Governor Schwarzenegger established a "Free Digital Textbook Initiative" to review free digital high school textbooks and determine which met the state's established academic standards. State education officials asked content developers to submit content, and the California Learning Resource Network (CLRN) was asked to facilitate the review of the submitted content. The results were not to be considered an endorsement by the state (even though most of the free textbooks scored highly); however, even as a 'dry run' or experiment, this effort is likely to encourage both other suppliers of free content and local decision makers to consider adopting free content as part of their curriculum. Which is the intention.

In this first step, the initiative asked for textbooks in math and science, and nine suppliers submitted 16 titles. The suppliers included both individual educators and publishers; Pearson was the only 'traditional' publisher that chose to submit content. Embarrassingly, Pearson recorded one of the lowest scores against the 'content standards met' criteria. (Why they were there at all is perhaps a more interesting discussion point.) The full report is located here.

In addition to the direction from the state level to evaluate digital content, other agencies have also joined in to support this initiative. Notable among these has been the California Educational Technology Professionals Association (CETPA) which recently organized a seminar showing participants how digital content could be integrated into the HS curriculum. The textbook content reviewed by CLRN will be available in classrooms in the fall.

The Governor's office made the following announcement:
Since these digital books are downloadable and may be projected on a screen, viewed on a computer, printed chapter by chapter, or bound for use in the classroom, schools can take advantage of these free, standards-aligned resources using existing hardware - even in classrooms without computers or laptops for every student.

To showcase the multiple ways in which digital textbooks can be used, the California Educational Technology Professionals Association (CETPA) today hosted 200 educators, technology professionals and content providers for a digital textbook symposium at the Orange County Department of Education. Teachers led students through lesson plans using digital textbooks in four mock classrooms, demonstrating the materials’ interactive potential. CETPA also moderated panel discussions about the future of digital education and potential next steps in this innovative effort.

Secretary of Education Glen Thomas spoke at the symposium and added, “I applaud the Governor for his leadership and vision in launching this groundbreaking initiative. This represents an important first step toward ubiquitous instruction that will help ensure all California students have access to the first-rate education they deserve.”
As this program develops, it will be interesting to see how the concept of a textbook begins to change. One of the criteria listed in the 'parameters' for review of the digital content is that the material must be 'stable for two years': changes to the content are not allowed. For some subjects this parameter should be no problem but, as the state evaluates social science and other more dynamic subjects, it will begin to look quaint, limiting the advantages digital content - free or paid - can deliver over print formats. In turn, as the parameters change, so will the process of vetting and approving titles for use in high schools. This initiative, viewed skeptically when it was announced earlier this year, has not only delivered tangible results to California educators but also represents a significant strategic issue for all traditional publishers as they navigate their digital frontier.

Tuesday, September 01, 2009

In Support of the Google Book Settlement

It isn't unusual here at PND HQ to hear from financial analysts offering all manner of crazy predictions about (in particular) eBook take-up and how the Amazon Kindle is going to take over the universe, so it was particularly welcome this morning to read Jeffrey Lindsay's (Analyst, Global Internet at Bernstein) defense of the Google Book Agreement.

From Lindsay:
Google has just done something rather wonderful. It is on the verge of an astonishing achievement that will benefit the U.S. for generations, bridging a major part of the digital divide and giving the country a global lead in a key area – scholarship. Its reward: a lawsuit, public criticism from the hastily reconstituted and Orwellian-named “Open Book Alliance” (Microsoft, Yahoo! and Amazon) and scrutiny by the Justice Department. Imagine what might have happened had they tried to destroy a competitor’s business model by bundling its product into an operating system or attempted to corner the e-book market by making a proprietary closed system to force users to buy online books only from them.
I like the irony. Lindsay alerts us to Microsoft's (in particular) aborted effort to implement its own digitization program - one which in my view never really got off the ground. With a little bit of a dig, he seems to suggest that Microsoft didn't really have consumers' or publishers' interests in mind when it unceremoniously canceled the Live Search Books program after its late and halfhearted approach last year. Under those circumstances, is Microsoft a viable challenger to this agreement when it chose to abandon its own effort?
Only Google stayed the course and so now only Google has the world’s largest digital book archive. So what is it going to do that is so terrible now that it has this archive? According to Google it is simply going to let people search it for free and if they want to buy the books direct them to a range of other sellers – hardly cornering much of the value of book digitization.
Lindsay does address three important objections - competition, BRR representation and privacy - and introduces them as follows:
Ignoring the competitively-motivated hyperbole there are some grounds for concern with the Google Book Rights Registry agreement. No legal agreement is perfect and given the way events have picked up speed since Google reached agreement with the Authors Guild, some of these concerns do merit serious consideration. The usual Google "trust us, we do no evil" line may be well intended but the company has already had a couple of near misses on privacy; the Viacom-YouTube lawsuit for example (where Viacom subpoenaed and received full records of all videos seen on YouTube). Moreover Google already caved on censorship in China – clearly as a corporate entity it is susceptible to arm-twisting to a greater degree than the small but well documented number of brave librarians and book sellers in the U.S. who have turned down user reading list requests from the police and FBI.
Moreover even assuming Google's current management team is well intentioned and trustworthy who can give guarantees about the actions of future generations of management? Considered objections from academics and public watchdog institutions such as the Center for Democracy and Technology fall into three broad categories: (1) lack of competition; (2) limited representativeness of the BRR and its potential for self-interested behavior; and (3) privacy.
With respect to pricing, he notes that critics of the GBS use the pricing models of academic journal publishers as proof that Google will act with similar disregard for universal access and fairness; he counters that Google's behavior to date has been more 'altruistic' than the behavior exhibited by those same publishers. Even so, he concludes that some type of regulatory oversight might be called for once the agreement is approved.

On the Book Rights Registry he comments,
The BRR in principle has no incentive to drive down the costs of knowledge and given its privileged position could actually act in self-interested ways – analogies to the Olympics venue selection committees have been made. In addition parallels have been drawn with the BRR's unique gatekeeper position relative to the fragmented base of book users prompting comparisons with the cable industry and health insurers.
Again the solution seems to be some sort of regulatory oversight to counter-balance market failure. The EFF position on privacy, which I noted a few weeks ago, is also referenced as an important issue in not only the debate over the settlement but the wider implications of how Google charts everything we see and do. Books of course hold a particularly sacrosanct position in terms of privacy and librarianship, and if nothing else many would want Google to follow the stand taken by many librarians in the face of subpoenas and the FBI.

Lindsay closes with a desire to see the settlement approved by the court, noting that the access to knowledge afforded by the agreement exceeds any negative aspects of the deal, especially if supervision is also prescribed. He concludes,
With good regulation this repository of human knowledge and ideas could be kept accessible to millions at low or zero cost while ensuring the rights to knowledge and privacy set out in the Bill of Rights could be preserved for generations. What is the alternative? Forcing Google to destroy this database may delight a small number of extremely rich individuals in the Pacific Northwest, but would be one of the greatest acts of Luddite vandalism of modern time. We hope the regulators will be enlightened and bold in the upcoming hearings on October 7th.