Wednesday, September 23, 2009

Is all of Springer Science+Business Media now in Play?

Bloomberg is reporting that bids for the 'up to' 49% share of Springer have been disappointing and the PE owners may be considering a sale of the whole business or of a majority stake. Earlier this week, bids from a short list of PE firms were noted in the press, but these appear to be lower than expected. From the report (Bloomberg):

The Berlin-based publisher may draw better offers with the sale of the whole company or even majority control, the people said. Springer Science announced in April it planned to sell as much as 49 percent of the company to lower debt and fund acquisitions. Offers haven’t met the company’s targets, the people said.

The initial stake sale was intended to raise as much as 500 million euros ($740 million), Eric Merkel-Sobotta, a Springer Science spokesman, had said.

EQT Partners AB, a Swedish private-equity firm partly owned by the Wallenberg family, TPG Inc., Apax Partners LLP and a combination of Providence Equity Partners Ltd. and the Carlyle Group were among the bidders for that stake. The bidders learned Sept. 18 that EQT was the frontrunner, the people said. TPG Inc., Apax and the combination of Providence Equity and Carlyle remain in the running.

Tuesday, September 22, 2009

Justice Prevails: The Deal is Done.

Last Friday, the Justice Department (DoJ) effectively ended the debate on the Google Book Settlement (GBS). In an exceptionally well-thought-out, rational and practical submission, the DoJ established for everyone the parameters of the argument and the terms under which the GBS should be approved by the NY court. Google, the AG and the AAP don't have to agree to all of the suggested changes; there are both degrees and some negotiated offsets that will give the parties some flexibility in authoring the final, revised document. But what Google, the AG and the AAP will do is exhale a sigh of relief, incorporate many of the suggestions noted by the DoJ and, as a consequence, expect the court to approve the revised agreement. It is unlikely Judge Chin will preside over the final decision: there simply isn't time before he (presumably) begins the approval process for his elevation to the Appeals Court. It's possible he will approve the agreement as is, with some oversight using specific guidelines to address imposed changes, but that would be unlikely given the 'importance' of this agreement to copyright law. More likely, this case will be passed on to a second judge. As a result, it could be another six months before the final revised version is approved by the court.

Encouragingly, the DoJ was balanced in its opinion, specifically noting the wide public interest that this content database will support. It is this argument that, in part, balanced some of its important concerns. Opponents took particular heart in the DoJ's emphatic statement that the agreement should not be approved in its present form. It is important to remember that the DoJ and Judge Chin's court represent two separate arms of government, and Chin is not obliged to accept the DoJ's statement out of hand. The DoJ's statement is wider in scope than the law upon which Judge Chin is to adjudicate: specifically, the potential impacts on competition that may result from this agreement. But in the context of a well-balanced, nuanced and implementable statement, the judge can use the DoJ's filing to 'encourage' the parties to address some expansive concerns that, in the true interpretation of his remit, would otherwise be outside his immediate purview.

It is the opposition to this agreement that is, ironically, left out in the cold. Congratulations are in order for the success of the intense opposition to the agreement (or parts of it); however, the DoJ statement has established the future parameters of the arguments against it. With the DoJ imprimatur now ranking the arguments, it seems unlikely that any new argument, or variation on an old one, will gain support: in its way, the DoJ has validated all the legitimate arguments, and anything outside of those will not now have 'legitimacy'. The parties benefit also because they no longer have a moving target, nor a need to 'read the tea leaves' from the Chin court: the requirements are now specific and actionable.

All this tells me that--unless the AG, the AAP and Google want to snatch defeat from the jaws of victory and ignore the DoJ statement (and they have already returned to the negotiating table, so this is unlikely)--this agreement will be approved with many of the changes the DoJ has specified. Justice prevails for both sides of this argument.

Monday, September 21, 2009

Conference on Google Book Settlement

NYU Law School is hosting a conference on the Google Book Settlement in two weeks. I am on a panel of publishing industry people and will be discussing my estimate of the Orphans population. Here are details (D is for Digitize):

Everything about the Google Book Search project is larger than life, from Google's audacious plan to digitize every book ever published to the gigantic class action settlement now awaiting court approval. The groundbreaking proposed settlement in the Google Book Search case is so complex that controversy has outpaced conversation and questions have outnumbered answers.

We aim to help close these gaps.

D Is For Digitize will give this complex lawsuit the sustained attention it deserves. An interdisciplinary lineup of academics and practitioners will examine the settlement through the lenses of copyright, civil procedure, antitrust, information policy, literary culture, and the publishing industry.

The conference is timed to coincide with the rescheduled fairness hearing in the Google Book Search case, which will be held on Wednesday, October 7 in New York City, just five blocks from the Law School. The days following the hearing (October 8th to October 10th) will provide a forum for addressing the numerous issues that have emerged and are most relevant to society at large.

The conference schedule and speaker list have been posted, and will continue to be updated. For more information about the settlement, visit thepublicindex.org, our site to study and discuss the proposed Google Book Search settlement. There you can browse and annotate the proposed settlement, section-by-section.

Registration

Sunday, September 20, 2009

MediaWeek (Vol 2, No 37): Google, Newspapers, Open Access, Australia, Kindle

On Friday the Justice Department submitted a "statement of interest" to the US District Court for the Southern District of New York, which under Judge Denny Chin is adjudicating the Google Book Settlement agreement. ResourceShelf may well have the most comprehensive list of commentary on the Justice opinion, but I found Danny Sullivan's reading of the opinion to be most useful. In short, Justice said there are areas of specific concern, and it wants Judge Chin to instruct the parties to revisit these issues and amend the agreement. In some cases, Justice made specific suggestions as to the changes it sought; in other cases it raised the issue of concern and effectively left it to the parties to resolve. From Danny's post:

Finally, the Department Of Justice had two additional thoughts on the settlement.

First, that there be full access for the visually impaired:

In the Proposed Settlement, Google has committed to providing accessible formats and comparable user experience to individuals with print disabilities – and if these goals are not realized within five years of the agreement, Google will be required to locate an alternative provider who can accomplish these accommodations. Along with many in the disability community, the United States strongly supports such provisions.

Second, that the data be “open” for use in a variety of ways:

Second, given the nature of the digital library the Proposed Settlement seeks to create, the United States believes that, if the settlement is ultimately approved, data provided should be available in multiple, standard, open formats supported by a wide variety of different applications, devices, and screens. Once these books are digitized, the format in which they are made available should not be a bottleneck for innovation.

And the conclusion:

This Court should reject the Proposed Settlement in its current form and encourage the parties to continue negotiations to modify it so as to comply with Rule 23 and the copyright and antitrust laws.

Lastly, there is some speculation that Judge Chin will indeed defer his decision because of the manner in which he is handling the significant volume of submissions to the court. Since Judge Chin is to be promoted, the speculation is that he wants to leave as clean a case as possible for the next presiding judge, which means he is keeping his trap shut.

The Obama administration (FCC) is expected to make a potentially far-reaching statement on net neutrality next week (NYTimes):

In 2005, the commission adopted four broad principles relating to the idea of network neutrality as part of a move to deregulate the Internet services provided by telephone companies. Those principles declared that consumers had the right to use the content, applications, services and devices of their choice using the Internet. They also promoted competition between Internet providers.

In a speech Monday at the Brookings Institution, Mr. Genachowski is expected to outline a proposal to add a fifth principle that will prevent Internet providers from discriminating against certain services or applications. Consumer advocates are concerned that Internet providers might ban or degrade services that compete with their own offerings, like television shows delivered over the Web.

He is also expected to propose that the rules explicitly apply to any Internet service, even if delivered over wireless networks — something that has been unclear until now.

Five major American universities commit to support open-access journals (Press Release):
Cornell, Dartmouth, Harvard, the Massachusetts Institute of Technology, and the University of California at Berkeley—today announced their joint commitment to a compact for open-access publication.

Open-access scholarly journals have arisen as an alternative to traditional publications that are founded on subscription and/or licensing fees. Open-access journals make their articles available freely to anyone, while providing the same services common to all scholarly journals, such as management of the peer-review process, filtering, production, and distribution.

According to Thomas C. Leonard, University Librarian at UC/Berkeley, "Publishers and researchers know that it has never been easier to share the best work they produce with the world. But they also know that their traditional business model is creating new walls around discoveries. Universities can really help take down these walls and the open-access compact is a highly significant tool for the job."

The economic downturn underscores the significance of open-access publications. With library resources strained by budget cuts, subscription and licensing fees for journals have come under increasing scrutiny, and alternative means of providing access to vital intellectual content are being sought. Open-access journals provide a natural alternative.

Google launched a new news reader named 'Fast Flip', and Adam Hodgkin added his thoughts (Exact Ed):

The newspaper and the magazine as a digital experience have to offer sufficient value that a reader is prepared to become a subscriber to the magazine or newspaper. Subscription services -- especially of the digital edition and its archive -- will generate much more for most publishers than a Fast Flip of streamed content, which will catch a trickle of AdSense revenues. Of course, there are changes and more will be needed. Search within a publication is very important. Internal navigation is very important. The possibility of citation and book-marking is essential. External navigation is especially important when it is relevant to the reading experience. But, Fast Flipping? That may be about as much use as Shuffling the news.
In the course of discussions about this new effort from Google, I was made aware of a prototype 'viewer' from the NYTimes which I like (Prototype). (In true 'this is a demo' fashion, it refuses to load at the moment.)

A pissing match has developed in Australia over the economics behind the Productivity Commission's finding that lower prices would result if importation rules on the sale of books in Australia were lifted. Arguments in support of maintaining these rules have focused on the effective 'subsidization' of the local publishing community by publishers who benefit from the (partially) closed Australian book market (The Australian):

The commission's defence of its findings comes as federal cabinet prepares to make a decision on the issue, with Competition Policy Minister Craig Emerson supporting the commission recommendation, but many of his colleagues, including Industry Minister Kim Carr, Arts Minister Peter Garrett, Attorney-General Robert McClelland, Regional Development Minister Anthony Albanese and Immigration Minister Chris Evans strongly opposed to it.

Some government sources suggested a compromise could be considered, along the lines of the commission's draft report, which recommended the import restrictions apply for 12 months after the release of a book.

In The Atlantic, Kevin Maney addresses "The Kindle Problem": how long can the Kindle survive when it can't match the virtues of print (where "the book disappears"), nor compete with the functionality of increasingly converged devices:

All in all, the Kindle ended up caught in a no-man’s land: it has a number of nifty features and convenient aspects – but also significant drawbacks and a high price tag. All of which leaves many consumers unconvinced that they really need to buy the thing.

Meanwhile, competitors have spotted an opening and are taking the opportunity to try to elbow the Kindle aside. As of this year, Google has made 1 million public domain books available for free on the Sony Reader, which is priced at $100 less than the Kindle. By thus joining forces, Google and Sony just might out-convenience Kindle. And in September, Asus, maker of the bargain-basement EeePc netbooks, said it, too, will make a super-cheap e-book reader.

What should Amazon do? Given the device’s inherent limitations, which make it impossible for the Kindle to ever outdo the appeal of the traditional book in every way, Amazon would probably do best to concentrate on the convenience angle. Bezos already has the right approach, with his goal of making every last book available to readers within 60 seconds. If he can achieve that goal, the Kindle will surely be the most convenient bookstore ever. But cost is a facet of convenience, too. And that suggests that the Kindle needs to dramatically drop in price.

Humor with several grains of salt on publishing and self-publishing from Joe Quirk at SFGate (SFGate):
Your publisher has no clothes: Exclusive print-on-demand publishers are knocking traditional books off the bestsellers list and paying authors three times as much money per sale. Robber baron publishing is toppling, and the only things propping it up are myths -- misconceptions that authors themselves cling to. Let's kick out each of these myths in short order, and watch the robber barons fall.

Big New York publishers will give me an advance! Your advance is a LOAN with your career as collateral. If borrowing money from your credit card at 8% interest to support your writing is a bad idea, then borrowing money from a big New York publisher against books you haven't sold yet is a catastrophic idea. Bankruptcy ends after 7 years. The Red Mark next to your name is forever.

Big New York publishers will get me publicity! If you can't pay for your own publicity, why would you let your publisher borrow profits from books you haven't sold yet to make spending decisions over which you have no control? It works for the robber barons to pay extravagantly for ten long-shots if one book pays off extravagantly. It is catastrophic for you to be one of the nine long-shots they waste money on.
In Sports, another humiliation for the England cricket team (ABC)

Saturday, September 19, 2009

Boeing Boeing Gone

My family did a lot of traveling when I was young. We moved clockwise around the world, starting in 1968 when we moved to Thailand, and with the grandparents back in the UK we returned there every few years. Dad worked for a company owned by Pan Am, which made travel free, and as our travel experiences coincided with the launch of the 747, we spent many hours flying in these fantastic aircraft.

I've been scanning some old photos and came across this one of the Pan Am 747 Clipper Intrepid (N749PA). This photo was taken in Honolulu in December 1980, and out of curiosity I tried to track the aircraft down. Since you can find anything on the internet, it turns out the aircraft was renamed Clipper Dashing Waves and was finally taken out of service in 1991. Sadly, its last resting place was far less glamorous than Honolulu. (Photo)

Wednesday, September 16, 2009

ISBN Webinar: Slide Presentation

A number of people I emailed about the Webinar asked whether the slides from Mark Bide's presentation would be available, and sure enough, here they are.

Monday, September 14, 2009

ISBN Webinar

Don't forget to sign up for the free BISG seminar on the future of ISBN and identifiers with Mark Bide as your host.

Mark Bide of Editeur is hosting a BISG Webcast on the future of ISBN (BISG):

The book industry has had the ISBN for nearly 40 years; there has been little cause for excitement. Now, suddenly the whole subject of "identifiers" has become a hot topic, particularly when it comes to digital books and other online resources. This BISG Webcast will explore why the book industry has standard identifiers, and consider the future of the ISBN (International Standard Book Number), as well as the role of newer identification standards like ISTC (International Standard Text Code) and ISNI (International Standard Name Identifier). What do you need to know to make informed decisions about how -- and whether -- to use them? Register today to find out.
Register here: It is even FREE!

Read my post The ISBN is Dead.

Sunday, September 13, 2009

MediaWeek (Vol 2, No 36): Lexis, Google, Copyright, Peer Review, Blackboard

Also on the Twitter highlight reel (Link): Information World Review on how legal publishers are incorporating workflow solutions into their products; the article documents the path Lexis has taken from a collection of public records to a suite of content and services. Their conclusion (IWR):

Meanwhile, Brewer fears that in the medium term the difference in currency and quality between free and paid-for will inevitably narrow. “The paid-for sector will increasingly need to focus on ensuring the benefits of paying for information are not only in the quality of the information, but also in the additional value that can be created by providing it in a form that best suits how the subscriber works, and what they want to do with the information when they receive it. The consumer of legal information can’t really lose: free information is another source to turn to on occasion and its availability will ensure that information providers continue to raise their game.

“We have to deliver our content across different media in order to best serve the needs of professionals, be it in print, online, CD-ROM, RSS feeds, email, etc. This means we need a best-of-breed publishing system. Having an XML-based publishing repository means that most content is now held in a media-neutral format for delivery across multiple media, making it as versatile as possible to suit the digital needs of lawyers and accounting professionals.”

Marybeth Peters, the Register of Copyrights, in prepared testimony before the Committee on the Judiciary, makes some strong statements regarding the Google Book Settlement. The statement, coupled with her comments at Columbia University last year where she noted Congress had shown only limited interest in orphan works legislation, seemed to me to be an admonishment to her bosses that they should pay more attention. By some accounts there was a general shrug of the shoulders from the Committee. Here is a sample:
In the view of the Copyright Office, the settlement proposed by the parties would encroach on responsibility for copyright policy that traditionally has been the domain of Congress. The settlement is not merely a compromise of existing claims, or an agreement to compensate past copying and snippet display. Rather, it could affect the exclusive rights of millions of copyright owners, in the United States and abroad, with respect to their abilities to control new products and new markets, for years and years to come. We are greatly concerned by the parties’ end run around legislative process and prerogatives, and we submit that this Committee should be equally concerned.
She summarizes:
It is our view that the proposed settlement inappropriately creates something similar to a compulsory license for works, unfairly alters the property interests of millions of rights holders of out-of-print works without any Congressional oversight, and has the capacity to create diplomatic stress for the United States. As always, we stand ready to assist you as the Committee considers the issues that are the subject of this hearing.
Copyright legal expert William Patry has a new book (Moral Panics and the Copyright Wars) and is interviewed by Publishers Weekly (PW):

PW: J.D. Salinger's lawyers are attempting to stop a novel they claim is "an unauthorized sequel" to "The Catcher in the Rye," and I can't help thinking that for most of his life Salinger never dreamed he'd be fighting this copyright fight in 2009, because the book was supposed to enter the public domain by now. Can you give us your perspective on copyright term extensions?

WP: That's a wonderful example of copyright gone awry. If you look historically at the terms of copyright, they used to be relatively short. From 1909 to December 31, 1977, you had a 28-year original term and a 28-year renewal term, with renewal being conditioned upon filling out an application. The book industry had a shockingly low rate of renewal, around 10%. Book publishers had staffs that were quite capable of filing renewals. It wasn't a burden for them to do, and it was cheap. Yet publishers didn't renew the vast majority of their copyrights. Why not? Because, economically, it didn't mean anything to them. Most books make their money in a very short period of time. It differs by industry, but for most books, 28 years is enough. With the last extension in 1998, copyright became totally unmoored from its purpose of providing an incentive to create new works. The public got nothing from that, and no author in history has ever said, "Life plus 50 is just not enough. I will not create that work unless the copyright exists for my life and 70 years." That's absurd.

Steve Jobs suggests that the Kindle will be short-lived unless it offers more functionality than just reading content (NYTimes):

But in the interview, Mr. Jobs said general-purpose devices are more appealing than specialized devices like Amazon.com’s Kindle e-book reader.

“I think people just probably aren’t willing to pay for a dedicated device,” he said. “You notice Amazon never says how much they sell; usually if they sell a lot of something, you want to tell everybody.”
Someone forgot to mention the iPod is a standalone device.

The Times notes the results of a widely distributed research study on peer review (TimesOnline):
The first conclusion worth mentioning is that while few people think peer-review is perfect, the scientific community seems broadly content with it. Only 32 per cent of respondents thought that it was as good as it could be, but 69 per cent said they were satisfied. I don't think anybody should find this particularly surprising. Among the scientists I speak to, the general consensus on the process seems usually to be the Churchillian one. More eye-catching was the finding that 81 per cent think that peer-review should be capable of identifying plagiarism, and 79 per cent think it should catch fraud. I find this interesting because, with the best will in the world, it's hard to see peer-review as it stands reliably accomplishing either goal.

Blackboard is periodically noted as an acquisition target and here Inside Higher Ed gives us five reasons Microsoft will buy the company (IHEd):
Is Blackboard too small a company to take advantage of the opportunities they have created by rolling up the for-profit CMS space? Is Blackboard an outlier in a world of consolidation within the technology industry? Is Microsoft the right company to purchase Blackboard? Would this be a good or bad thing for higher education? What do you think the odds are that I'm correct that we will see a Microsoft purchase of Blackboard by the end of 2010?
I think you could come up with five good reasons why Google will buy Blackboard.

Remember when Steve McQueen, Jimmy Garner and Dickie Attenborough were digging holes under the barbed wire? Not so the Welsh imprisoned at Stalag IVb, near Mühlberg in Germany, between July 1943 and December 1944. They went in for publishing (BBC):
But some Welsh prisoners of war overcame adversity with a remarkable series of morale-boosting magazines about their homeland called Cymro (Welshman). They stole medicine to make ink, while their meagre rations were used to stick illustrations onto pages from school exercise books. It featured snippets of news from home taken from letters sent by loved ones, and was handwritten in English and Welsh from inside Stalag IVb, near Mühlberg in Germany, between July 1943 and December 1944. Now, as the 70th anniversary of the start of the war is commemorated, the National Library of Wales in Aberystwyth has published its collection of the magazines online.

In sport, England: an outstanding display to record an eighth successive win of a flawless qualifying campaign. What a relief. (Link) Then there's Wayne Rooney.

Thursday, September 10, 2009

Senator Al Franken draws map of USA

Once you get to Nebraska it gets easier. Asked to place the US on a world map, I've seen some people place it upside down: this is almost a party trick.

Wednesday, September 09, 2009

580,388 Orphan Works – Give or Take

Clearly one of the most (if not the most) contentious issues regarding the Google Book Settlement (GBS) centers on the nebulous community of "orphans and orphan titles". And yet, through the entirety of the discussion since the Google Book Settlement agreement was announced, no one has attempted to define how many orphans there really are. Allow me: 580,388. How do I know? Well, I admit, I did my share of guesswork to get to this estimate, but I believe my analysis is based on key facts from which I have extrapolated a conclusion. Interestingly, I completed this analysis starting from two very different points, and the first results were separated by only 3,000 works (before I made some minor adjustments).

Before I delve into my analysis, it might be useful to make some observations about the current discussion of the number of orphans. First, when commentators discuss this issue, they refer to the 'millions' of orphan titles. This is both deliberate obfuscation and lazy reporting: most notably, the real issue is not titles but the number of works. My analysis attempts to identify the number of 'works'; titles are a multiple of works. A work will often have multiple manifestations or derivations (paperback, library version, large print, etc.) and thus, while the statement that there may be 'millions of orphan titles' may be partially correct, it is entirely misleading when the true measure applicable to the GBS discussion is how many orphan works exist. It is the owner (or parent) of the work we want to find.

To many reporters and commentators, suggesting there are millions of orphans makes sense because of the sheer number of books scanned by Google but, again, this is laziness. Because Google has scanned 7-10 million titles, so the logic goes, there must be 'millions of orphans'. However, as a 2005 report by OCLC noted (which I understand they are updating), all types of qualifiers should be applied to this universe of titles: titles in foreign languages, titles distributed in the US, titles published in the UK, to name a few. Accounting for these qualifiers significantly reduces the population of titles at the core of this orphan discussion. These points were made in the 2005 OCLC report (although they were not looking specifically at orphans) when they examined the overlap in title holdings among the first five Google libraries. (And if you like this stuff, this was pretty interesting.) Prognosticators unfamiliar with the industry may also believe there are millions and millions of published titles since, well, there are just lots and lots in their local B&N and town library.

The two methods I chose to try to estimate the population of orphans relied, firstly, on data from Bowker’s BooksinPrint and OCLC’s Worldcat databases and, secondly, on industry data published by Bowker since 1880 on title output. I accessed BooksinPrint via NYPL (Bowker cut off my sub) and Worldcat is free via the web. The Bowker title data has been published and referred to numerous times over the years and I found this data via Google Book Search; I also purchased an old copy of The Bowker Annual from Alibris.

In using these databases, my goal was to determine whether there are consistencies across the two databases that I could then apply to the Google title counts. In addition to the 'raw data' I extracted from the databases, OCLC (Dempsey) also noted some specific numbers of 'books' in their database (91mm), titles from the US (13mm) and non-corporate 'authors' (4mm). Against the title counts from both sets of data, I attributed percentages which I then applied to the Google universe of titles (7mm). (My analysis also 'limits' these numbers to print books, excluding, for example, dissertations.)


In order to complete the analysis to determine a specific orphan population, I reduced my raw results based on best guess estimates for non-books in the count, public domain titles and titles where the copyright status is known. These final calculations result in a potential orphan population of 600,000 works. I also stress-tested this calculation by manipulating my percentages resulting in a possible universe of 1.6mm orphan works. This latter estimate is (in my view) illogical as I will show in my second analysis.
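The mechanics of this first method amount to a chain of successive reductions applied to the Google universe. The filter fractions in the sketch below are my own illustrative assumptions (the post does not publish the exact percentages derived from BooksInPrint and WorldCat); the point is the arithmetic, not the specific factors.

```python
# A minimal sketch of the database-driven estimate. All fractions here
# are assumed for illustration, not the actual BooksInPrint/WorldCat figures.

google_titles = 7_000_000  # titles scanned by Google, per the post

# Fraction of the universe surviving each successive reduction (assumed).
filters = [
    ("US-published print books (excl. foreign, non-book)", 0.40),
    ("still in copyright (excl. public domain)", 0.55),
    ("copyright status unknown", 0.39),
]

population = float(google_titles)
for label, fraction in filters:
    population *= fraction
    print(f"after '{label}': {population:,.0f}")
```

With these assumed fractions the potential orphan pool lands near the post's ~600,000 figure; stress-testing the fractions upward is how a much larger bound like 1.6mm can be produced from the same mechanics.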

An important point should be made here. I am calculating the potential orphan population, not the number of orphans. These numbers represent a total before any effort is made to find the copyright holder. These efforts are already underway and will get easier once money collected by the Book Rights Registry begins to be distributed.

My second approach emanated from my desire to validate the first approach. If I could determine how many works had been published each year since 1924 then I could attribute percentages to this annual output based on my estimate of how likely it was that the copyright status would be in doubt. Simply put, my supposition was that the older the work, the more likely it was that it could be an orphan.

Bowker has consistently calculated the number of works published in the US since 1880 (give or take) and the methodology for these calculations remained consistent through the mid-1990s. According to their numbers, approximately 2mm works were published between 1920 and 2000. Unsurprisingly, a look at the distribution of these numbers confirms that the bulk of those works were published recently. If there were (only) 2mm works published since the 1920s, it is impossible to conclude there are millions of orphan works.

To complete this analysis, I aggressively estimated the percentage of works published in each decade since 1920 which could be orphan works. The analysis suggests a total of 580K potential orphan works which, as a subset of the approximately 2mm works published in the US during this period, seems a reasonable estimate. This meets my objective of validating the first approach (using OCLC and BIP data): the two approaches, using different methodologies, reach similar conclusions.
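The decade-by-decade method can also be sketched concretely. Both the per-decade output figures and the orphan percentages below are hypothetical placeholders (constructed so the outputs sum to Bowker's ~2mm total and the percentages decline with recency); they illustrate the mechanics, not the actual estimates.

```python
# Sketch of the second approach: per-decade US output (hypothetical,
# summing to ~2mm) times a declining hypothetical orphan percentage.
output_by_decade = {
    "1920s": 70_000, "1930s": 80_000, "1940s": 90_000, "1950s": 120_000,
    "1960s": 200_000, "1970s": 350_000, "1980s": 450_000, "1990s": 640_000,
}
orphan_pct_by_decade = {
    "1920s": 0.70, "1930s": 0.65, "1940s": 0.60, "1950s": 0.50,
    "1960s": 0.40, "1970s": 0.30, "1980s": 0.20, "1990s": 0.12,
}

total_potential = sum(output_by_decade[d] * orphan_pct_by_decade[d]
                      for d in output_by_decade)
print(total_potential)  # with these placeholder inputs, roughly 570K
```

Even with aggressive percentages for the early decades, a finite output of ~2mm works caps the potential orphan pool in the hundreds of thousands, not the millions.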

There are several conclusions to be drawn from this analysis. First, since the universe of works is finite, beyond a certain point the Google scanning operation will begin to find ‘new’ orphans at a decreasing rate. I don’t know if that point is 5mm scanned titles or 12mm, but my estimate is 7mm because, according to Worldcat, there are 3mm authors to 12mm titles; if you apply this ratio to the Bowker estimate of total works published, the number is around 7-8mm titles. Second, publishing output accelerated in the latter part of the 20th century. While my percentage estimates of latter-day orphans were comparably lower than the percentages applied to the early part of the century, the base number of published titles is much higher, and therefore so is the number of possible orphans. Common sense dictates that it will be far easier to find the parents of these later ‘orphans’.

In the aggregate, the 600K potential orphans may still seem high against a “work” population of 2.2mm (25%). I disagree, given the distribution of the ‘orphan’ works (above paragraph) and because I have assumed nothing about the BRR’s efforts to find and identify the parents. In my view, true orphans will number far fewer than 600,000, which leads me to my final point. Money collected on behalf of unidentified orphan owners will eventually be disbursed to cover the costs of the BRR or go to other publishers. There has been some controversy on this point, and it derives, again, from the idea that there are millions of orphans and thus that the pool of undisbursed revenues will be huge. The true numbers don’t support this conclusion: there will not be a huge pool of royalty revenues ultimately disbursed to publishers who don’t ‘deserve’ the windfall, because there won’t be very many true orphans. Moreover, royalty revenues will be calculated on usage and, almost by definition, true orphan titles are for the most part not going to be popular titles; they will not generate significant revenues in comparison with all other titles.

This analysis is not definitive; it is directional. Until someone presents an argument that examines the true numbers and works in more detail, I think this analysis is more useful to the Google Settlement discussion than referring by rote to the ‘millions of orphans’. The prevailing approach is lazy, misleading and inaccurate.



(Thanks to Mike Shatzkin who encouraged me to think about analysis and helped me conclude it. Grateful thanks to others who also helped review the post).

Monday, September 07, 2009

Dear Bank of America (and Ken Lewis): Here's my Problem

Mr. Kenneth D. Lewis
CEO & President
Bank of America Corporation
100 North Tryon Street
Charlotte, NC 28202

August 31, 2009

VIA FEDEX

Ref: Making a Checking Deposit

Dear Mr. Lewis,

I wouldn’t say I love the Bank of America brand so much as grossly respect it for all its red, white and blue effrontery. That ‘in your face, we’re bigger and better than all of you’ attitude is hard to resist, which is why I continue to use your bank despite a spate of blunders on your part. This last incident stole the show and, Kenneth, I know I shouldn’t use the word ‘steal’ with respect to any financial institution (let alone yours), but for the two weeks my cash sat in monetary purgatory I began to believe stealing it back was my only option.

We can both agree that technology is a powerful and seductive mistress. Imagine - if you can - how it might feel to become the victim of a crime so seductive that you hardly know it has occurred. (I bet a lot of American taxpayers feel that way today – am I right, Ken?). How was I to know that, at the moment your auto-mistress sucked my check from my hand (giving me just that little electric tingle of self-satisfaction that I was working on the veritable cusp of technology), this act would set in motion a series of draconian events no one from your fine corporation could ever hope to explain?

Kenneth, what happens when a check is deposited and clears the payer’s account? Yes, it is made available to the depositor (not a trick question)! I am sorry if this is elemental for you yet, in my recent experience, a perfectly good check deposited via the same cash machine – identical in amount, payee and issuer to 15 previous checks deposited at virtually the same time each month for the past 15 months - was summarily rejected by Bank of America. Now Mr. Kenneth, I’m sure you are thinking (just like I was) ‘how could this happen at Bank of America to one of our long-time customers?’ I’ll get to that last bit in a minute. But I really hope you know the answer because no one else at your bank has a clue.

I guess (and why shouldn’t I? That’s what your staff do when they don’t know the solution) the real answer lies in your use of technology: Bank of America has become a programming experiment and, as a result, the staff is now as clueless as the customer to explain how simple tasks - like depositing a check or transferring money from one account to another - can go inexplicably wrong. And, Kenneth, it’s not the staff’s fault - you have placed them in this intolerable situation. But maybe now I begin to understand your inspired and strategic leadership in closing down the retail operations. I mean, getting rid of staff that’s uninformed and lacking in effective training must be better than leaving them defenseless on the front line of customer service. By the way, have you spent any time in one of your local branches recently, Kenneth? Was it like bobbing rudderless in a sea of ineptitude?

Kenneth, for fifteen days a significant amount of money was neither in my account nor in the account of my employer. It was, however, in your account. Fifteen days, Kenneth! For a check no different than one deposited a month earlier and one deposited a month later (which cleared in the usual day or two). Why, Kenneth? Where’s the explanation? (And Ken, please note that I’m looking for an explanation that actually makes sense).

Which reminds me, Kenneth, that I did want to come back to what I thought was my long-standing relationship with your bank. Since my account has been open for over 20 years, I believe I have banked with Bank of America for over 20 years. Far be it from me to be so tactless as to note the value of my deposits over those years but it is your business and it is a lot. Really – a lot, but that seems to be utterly meaningless to your staff: “Not with this bank” one of your staffers was quick to assert, only because I have been one of a multitude of accounts swallowed up by the bank that couldn’t say no (that would be yours). Truth be told, I guess I’m really a Nat West customer and I sometimes look back longingly on those days. Far be it from the government to talk Nat West into a bank merger, Kenneth! (Raising the issue of your deal making might not seem relevant but if acquired customers aren’t ‘real’ customers, then what are they?) And the shareholders, Kenneth, imagine how they feel when they realize you’ve prioritized deal making over their interests. What kind of executive management is it that folds in the face of such government cajoling? I’m sure Merrill Lynch will eventually come good for you, though.

So where does this leave us, Ken? I wish I could say I want to stay with Bank of America because you are the best around. That’s not the case. I’m stuck with you. Just this week, we realized that in ten years you’ve never reduced our overdraft interest rate (in spite of the fact that the prime lending rate has collapsed over that period), but you are still as inefficient as ever in crediting our account with deposits, which, of course, causes us to use the credit line. Slowness pays dividends (and bonuses too, I suppose. Am I right Kenneth?). Just last Sunday, your technology placed a hold on my cash card for some spurious and inconvenient reason. (Well, probably – who really knows?) Kenneth, you are getting worse not better.

I’m leery of your technology and despite your retail close-down I’m now looking for more human contact. In fact, for all my deposits, check cashing and payments I now go to the teller window. Sure, it’s less efficient and costs you more, but one of the nonsensical explanations for my problem(s) was that had I done it at the teller window, I wouldn’t have had a problem. Mr. Ken, I’ve tested this out and it seems to be the case!

So, along those lines, I am enclosing a check for $7.83 which I was hoping you could deposit for me. I’ve included a deposit slip.

I look forward to your apology.

Best regards,

Michael Cairns
A lifetime customer of Nat West.

NOTE AND UPDATE (SEPTEMBER 15th): I received a friendly call from Melanie in Mr. Lewis' office. She called to discuss my issue and to tell me that my letter had been forwarded to Mr. Lewis. She also indicated that my check had been forwarded to the deposit-by-mail department for processing.

Wednesday, September 02, 2009

I'll Be Back: With Free Textbooks

All educational publishers know the holy trinity of textbook publishing: California, Florida and Texas. Winning or losing any one of these three states in an adoption can tip the economic balance of a program. If California goes free, not only will the economics of educational publishing radically shift, but Florida, Texas and many other states are likely to follow California's lead in sourcing free educational content. Most immediately, California's migration toward free textbooks has been driven by the state's precarious financial situation: an effective moratorium on new textbook purchases is expected to last until 2014. While California's approach may seem drastic (or innovative, depending on your perspective), the state is actually following a movement toward free textbooks that has been gaining steam over the past several years (GeorgiaTech). That said, California appears to be the first state to specifically identify free electronic texts that may be used in the classroom.

In May, Governor Schwarzenegger established a "Free Digital Textbook Initiative" to review free digital high school textbooks and determine which met the state's established academic standards. State education officials asked content developers to submit materials, and the California Learning Resource Network (CLRN) was asked to facilitate the review. The results were not to be considered an endorsement by the state (even though most of the free textbooks scored highly); however, even as a 'dry run' or experiment, this effort is likely to encourage both other suppliers of free content and local decision makers to consider adopting free content as part of their curriculum. Which is the intention.

In this first step, the initiative asked for textbooks in math and science, and nine suppliers submitted 16 titles. The suppliers included both individual educators and publishers; Pearson was the only 'traditional' publisher that chose to submit content. Embarrassingly, Pearson received one of the lowest scores against the 'content standards met' criteria. (Why they were there at all is perhaps a more interesting discussion point.) The full report is located here.

In addition to the direction from the state level to evaluate digital content, other agencies have also joined in to support this initiative. Notable among these has been the California Educational Technology Professionals Association (CETPA) which recently organized a seminar showing participants how digital content could be integrated into the HS curriculum. The textbook content reviewed by CLRN will be available in classrooms in the fall.

The Governor's office made the following announcement:
Since these digital books are downloadable and may be projected on a screen, viewed on a computer, printed chapter by chapter, or bound for use in the classroom, schools can take advantage of these free, standards-aligned resources using existing hardware - even in classrooms without computers or laptops for every student.

To showcase the multiple ways in which digital textbooks can be used, the California Educational Technology Professionals Association (CETPA) today hosted 200 educators, technology professionals and content providers for a digital textbook symposium at the Orange County Department of Education. Teachers led students through lesson plans using digital textbooks in four mock classrooms, demonstrating the materials’ interactive potential. CETPA also moderated panel discussions about the future of digital education and potential next steps in this innovative effort.

Secretary of Education Glen Thomas spoke at the symposium and added, “I applaud the Governor for his leadership and vision in launching this groundbreaking initiative. This represents an important first step toward ubiquitous instruction that will help ensure all California students have access to the first-rate education they deserve.”
As this program develops, it will be interesting to see how the concept of a textbook begins to change. One of the criteria listed in the 'parameters' for review of the digital content is that the material must be 'stable for two years': changes to the content are not allowed. For some subjects this should be no problem but, as the state evaluates social science and other (dynamic) subjects, this parameter will begin to look quaint, limiting the advantages digital content - free or paid - can deliver over print formats. In turn, as the parameters change, so will the process of vetting and approving titles for use in high schools. This initiative, viewed skeptically when it was announced earlier this year, has not only delivered tangible results to California educators but also represents a significant strategic issue for all traditional publishers as they navigate the digital frontier.