Friday, March 16, 2007
Speculation about Pearson
Reuters is reporting increased speculation (again) about a potential private equity bid for Pearson plc. This is surely getting a little old. Speculation has been evident for months, and since the middle of last year the company's market cap has increased by well over £1.1 billion. Today the share price was up as much as 8p before falling back slightly.
Wednesday, March 14, 2007
Deals: Wolters Kluwer
Wednesday was the day that offers were due to be submitted for the Wolters Kluwer educational assets. According to Reuters, the interested parties include Pearson Education, Wendel Investissement (a French investment firm) and a number of private equity firms. Reuters went on to suggest that the business could fetch close to $1 billion.
It is hard to see which way this one will go. My guess is one of the operators rather than private equity. The unit is small relative to the other companies on the market, and it has some fairly specialized publishing programs, which could limit effective cost restructuring and limit the potential for closer integration with a second, follow-on acquisition. Integration with an existing publisher such as Pearson, on the other hand, could resemble a list acquisition, meaning the buyer could effectively eliminate all expenses other than those directly attributable to content development.
Graf Spee and Gone with the Wind
I was in Orlando earlier this week for the Spring meeting of ASIDIC, my first attendance at this event. It was a small meeting, but there were some high-quality presentations and I will attempt to attend subsequent meetings. Among the speakers was Joe Wikert, who wrote a post about the meeting. Dinner on the Monday evening was at the Orange County Regional History Center in downtown Orlando, where most people wouldn't ordinarily go. The exhibits were actually quite interesting, including an extensive Civil War exhibit, but the item that caught my eye was in the smaller exhibit celebrating Gone with the Wind. On this newspaper front page, a significant amount of space is given over to the premiere of the movie, but in the lower left is a story about the Graf Spee, a German pocket battleship that had been chased around the Atlantic by the British and had holed up in Uruguay. The juxtaposition of these two events - the movie and the war - is very interesting. It shows that while the war was a very real thing in Europe, perhaps it wasn't quite so important at the time in the US. The British were frustrated in their victory in this skirmish, of course, since the Graf Spee was scuttled. On the other hand, Vivien Leigh did get best actress. (Sorry for the bad photo).
Informa Post Big Gain
Informa plc, the events and publishing company formed by the merger of Taylor & Francis and Informa several years ago, posted a 50% gain in 2006 profit. The company also announced that Chairman Richard Hooper will retire, with his position to be taken by current CEO Peter Rigby. Informa produces over 10,000 events, 40,000 book titles and 2,000 subscription products.
The company purchased a large events business (IIR) in 2005, and 2006 was the first year in which revenues both included the full-year impact of all recent acquisitions and did not include any material partial-year acquisition revenue. Full-year 2006 revenue was up over 40%. The company also said it is off to a fast start in 2007 but has ruled out any immediate additional acquisitions.
Here are their bullet point headlines:
- Revenue up 42% to over £1 billion
- Adjusted operating profit 49% higher at £219 million
- Total dividend increases 40%
- Strong trading across all three divisions (Academic & Scientific, Professional and Commercial) and all three business streams (Publishing, Performance Improvement (PI) and Events)
- Return on IIR acquisition exceeds cost of capital
- Adjusted operating margin rises above 21%
- Cash conversion more than 100% of adjusted operating profit
- Confident of 2007 outlook
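A quick sanity check on these headline figures (a sketch only; the release says just "over £1 billion", so the exact revenue figure below is an illustrative assumption):

```python
# Rough consistency check on Informa's 2006 headline numbers.
# revenue_m is assumed (the release says only "over £1 billion").
revenue_m = 1_039                # £ millions, illustrative assumption
adj_operating_profit_m = 219     # £ millions, from the release

margin = adj_operating_profit_m / revenue_m
print(f"Adjusted operating margin: {margin:.1%}")  # just above 21%, as stated
```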
Tuesday, March 13, 2007
Textbooks Are Too Expensive?
Maketextbooksaffordable is a web site set up by 19 student groups located at colleges and universities across the country. The insurgency against the 'high' cost of higher-ed materials seems to be developing into a more coordinated approach to challenging publishing companies' pricing policies. The site also hosts a report published by MASSPIRG (Public Interest Research Group) titled Exposing the Textbook Industry: How Publishers’ Pricing Tactics Drive Up the Cost of College Textbooks, which was published in February 2007. Among the findings/recommendations are the following:
- Textbooks should be produced and priced to be as inexpensive as possible without sacrificing educational value.
- New textbook editions should be produced only when educationally necessary; each book should be kept on the market as long as possible, with preference given to paper or online supplements over a whole new edition.
- Faculty should have the option to purchase textbooks unbundled; whenever a textbook is sold with additional materials, it also should be available without the extra materials.
- Publishers should provide faculty with more information on each book’s price, intended length of time on the market and substantive content differences from previous editions. Faculty want, and have the right, to know how their textbook choices will affect their students. They should have easy access to information about all of the publisher’s products, low-cost formats, options for bundling, and corresponding price information, voluntarily provided at the start of any sales transaction and on desk copies provided by the publisher.
- All textbooks should be available in a genuine low cost edition that contains comparable content in a low cost format. Information about these options should be easily available.
- Faculty should give preference to least cost options when choosing their books.
- There should be many avenues for students to access used books including rental programs, online bookswaps and bookstore buy-back.
There are more reports on this site, and I suspect that this report and others will be circulated and placed on web sites similar to maketextbooksaffordable. Perhaps more worrying is that some state legislators are jumping on the bandwagon and proposing legislation to limit the ability of publishers to act in their commercial interest. In Minnesota, for example, legislators are discussing legislation to regulate publishers and make faculty more accountable: "This is the hidden cost to higher education," said Democratic Rep. Frank Moe, the Minnesota bill's sponsor, who also teaches at Bemidji State University. "Reasonable profit makes sense. But the margins they are making on these textbooks is just absurd." (Lansing State Journal) Governments have looked at this issue before, but there are currently more than 12 legislatures taking it up.
The legislation suggested in Alabama is typical:
The legislation would regulate the use of textbooks in state schools with the goal of keeping the cost of students’ required reading down. It would require that college classes use the same version of a textbook for at least two years “unless there are substantial differences between the old and new edition," he said. It would also prohibit state college instructors and book-buyers from getting any kind of compensation for their choices, meaning it would become illegal for them to accept incentives and promotional gifts from book publishers. (Tuscaloosa)
On top of these state-led efforts, there is also a federal advisory board in the process of collecting information and feedback in a series of meetings around the country. From the Gainesville Sun:
The idea is to prod professors to take advantage of articles, lecture notes, study guides and other materials available for free on the Internet. That suggestion, and several others, were aired during a 3 1/2-hour meeting in Santa Clarita, Calif., where the Advisory Committee on Student Financial Assistance heard from college administrators, textbook publishers and other higher education leaders and advocates. The meeting - the second of three field hearings the panel will hold around the country before it delivers a congressionally requested report in May - didn't produce any consensus. And neither did it answer a question that seemed to be a subtext of the proceedings: Who is to blame for textbooks that cost more than $100?
This recent effort to manage textbook prices goes back to a report from the GAO two years ago that, despite its flaws, is now regarded as the bible for all those addressing this issue. Unfortunately, the publishing industry was not well represented in that report and continues to be on the defensive. This will be monitored closely by all of us in the business.
Monday, March 12, 2007
Knovel New Appointments
On the back of the announcement that Dave Shaffer was joining the company from Thomson, Knovel has announced some executive changes. It is not clear whether the two are related.
Robert E. Smith comes to Knovel as the Vice President of Information Technology, John Dooley has joined Knovel as the Vice President of Sales, and Delores Meglio has been promoted to Vice President of Content Management.
Here is the press release.
Digital Preservation
Some of you will have seen the long article on digital preservation that appeared in Sunday's New York Times. It is quite interesting and certainly identifies some of the issues regarding the amount of, and cost related to, preservation and digitization of historic materials. The worrying thing to me, however, was the underlying suggestion that if materials are not digitized they will somehow become lost - that users, seekers and researchers will never learn of some obscure piece of research unless it has been digitized. That is untrue, of course, because the material will be catalogued and the catalog record will always be available on a network (such as WorldCat). So the slight hysteria of the article should be taken with a grain of salt; what is important is the cost and effort involved in digitizing the archives.
This is a very real issue and the article does a good job of reflecting it. I am reminded of something I read years ago in relation to content digitization, along the lines of 'only 10% of the content will be used in electronic form; the problem is you never know which 10%'.
Friday, March 09, 2007
Barnes & Noble To Go Private?
That is the suggestion of David Scheck of Stifel Nicolaus, on the basis that the publishing retail business really isn't suited to the expectations of Wall Street. Scheck went on to suggest that the fundamentals of the company are strong, with good cash flow, significant retail presence and strong store productivity. My initial reaction to the general news coverage was (a) things can't be that bad and (b) are they trying to tamp down expectations and the share price on purpose? Scheck's firm views the news of depressed gross margin - due to Harry Potter and the impact of the membership program - as a worst-case scenario. By my calculation, they expect a $50mm decrease in net income, which represents a 30% decrease compared to 2006. Stifel Nicolaus believes the company could perform better than management is suggesting and could be a PE target - strictly speaking an MBO, with management currently owning 20+% of the stock. Scheck's target price for the BN stock is in the low to mid $40s. After a dip early in the week, the stock is trading today at just under $38.00, with a market cap of $2.5 billion.
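The "by my calculation" step can be reconstructed from the figures in the post (a sketch; the implied 2006 base is derived, not reported):

```python
# Back out the implied 2006 net income: a projected $50mm decrease
# that represents a 30% decline implies a base of 50 / 0.30.
decrease_mm = 50
decline = 0.30

implied_2006_net_income_mm = decrease_mm / decline
print(f"Implied 2006 net income: ~${implied_2006_net_income_mm:.0f}mm")  # ~$167mm
```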
With the overbearing requirements of financial reporting - evidenced by an ongoing investigation at BN - the Riggios may decide to bail out and indeed go private. While there are peaks and valleys in book retail, BN has been able to remain fairly consistent in delivering top-line growth, strong operations and good net income. Additionally, the capital requirements for this business wouldn't be onerous, so it would make a safe place to put PE dollars for a decent return. Certainly one to keep an eye on.
BN Press Release
On a less realistic note, George Gutowski on SeekingAlpha suggests that Jeff Bezos get out his check book and buy BN to consolidate the industry. He makes some interesting claims:
- Barnes & Noble has announced that its best customers are its worst financial problem. Most other industries make the most profit on their best customers, but not in this case.
- The book industry has not been able to sell itself other than through price competition. They all seem to offer the same book by the same author as their competitor (either retail or internet) and therefore have not been able to develop additional value added propositions.
- Acquisition allows the combination of best of breed in both the internet and retail categories. Barnes & Noble can entirely close down poorly performing internet distribution.
Thursday, March 08, 2007
Ingram In Living Color
Perhaps it's just me, but when I saw this story earlier this week I yawned. Ingram's Lightning Source has been a pioneer in POD and deserves huge credit for sticking with it through some tough times early on, but this new initiative to go into color printing to support the photo market left me a little flat. What tipped the balance was that the news was reported in PW Daily and Publishers Lunch today (possibly because it was on Joe Wikert's blog the day before...?) with nary a comment about the fact that there are many newcomers already doing color photo books.
I have mentioned my own experience with Blurb.com, but there are others, including Shutterfly, Picaboo, Sharedink, Ipagz and Ourstory, who are probably just as good. Each of these has established relationships with printers other than Ingram and is delivering thousands of units per week to happy customers. They are PUBLISHING. More importantly, they have already tapped into a massive (apparently $1.0 billion) market, which might not be readily apparent from the way the Ingram news was reported. The curious aspect of the Ingram publicity is that we seem to focus on the fact that it is Ingram and that this must somehow be revolutionary. Wags might ask why color wasn't already something Ingram was doing, and why the company wants to draw attention to the fact that it wasn't. But that wouldn't be fair (or nice), since Ingram is printing over 1 million books a month in b/w, which is just a phenomenal number.
Publishing is changing (if that isn't obvious), and I recently suggested to one of the traditional organs of the industry that they conduct a case study using several of the 'photo-book' printers (in quotes because they expand their capabilities all the time) and report on the experience in the magazine (oops, that gives it away). Evidently, advice not taken. Like I said, perhaps it's just me.
Proquest Guidance
Clearly as a result of the garage-sale-style sell-off, ProQuest is undergoing a significant restructuring, and this morning the company released an update. The story so far:
The company had a lot of debt, began selling off assets, was hit with accounting irregularities and became embroiled in a subsequent SEC investigation. Before the sale to CIG closed, the company announced that the Chairman (Aldworth) was leaving, and some pundits said they sold the wrong business. This is where we pick up the story.
- The company is moving all operations to Dallas, where the Education division is located; the move should be completed by the end of 2007. Most staff functions in Ann Arbor will be eliminated, with the exception of transition staff and (presumably) accounting staff needed to deal with legacy issues (like SEC reporting and the investigation).
- Corporate functions transferred to the education unit are expected to cost $4-5mm per year.
- The company has two buildings with long-term leases in Ann Arbor which will soon be surplus to its needs. It is assumed that it will attempt to buy out the remaining lease term, but that will be discussed in a subsequent update.
- The company had a significant capital gain on the sale of the business solutions segment to Snap-on Tools, resulting in capital gains taxes of $60-65mm.
- The company had a significant capital loss on the sale of the information and learning segment to CIG; it will carry back this loss against the 2005 gain and expects $40-45mm to come back as a refund in 2008.
- There is class-action litigation under way related to the accounting issues.
- ProQuest Education includes Voyager Expanded Learning (acquired 1/31/05), Explore Learning (acquired 2/25/05) and LearningPage (acquired in 2004). Partial-year 2005 revenues for the segment are expected to be $91 million, with EBIT of $7.4mm and EBITDA of $27.7mm.
- Education segment results for 2006 are projected to be Revenues of $117.3mm, EBIT of $12.0mm and EBITDA of $34.5mm.
- Guidance for 2007: Education segment revenues of $116-124mm, EBIT of $10-13mm and EBITDA of $32-35mm.
- 2007 Corporate expenses are expected to be between $30-32mm excluding taxes and interest. They note interest income of $4-5mm but don't project taxes and interest expense.
- There is no note about any subsequent debt repayments, write-downs, or other mitigating factors (like rent buy-outs) that could influence the 2007 results.
- The company still has a lot of financial reporting to do and risks being delisted if they miss an April 2, 2007 due date.
That's the update so far. Stay tuned for more.
Wednesday, March 07, 2007
Google Books Experienced
Via Lorcan Dempsey, and as he did, I'll leave this best told by the author, Peter Brantley of the California Digital Library (correction: he is with the Digital Library Federation). Astounding writing. Link.
Google Print: A Numbers Game
The following post is written by Andrew Grabois who worked with me at Bowker and has (among other things) compiled bibliographic stats out of the Books In Print database for a number of years. His contact details are at the bottom of this article.
On February 6th, Google announced that the Princeton University library system agreed to participate in their Book Search Library Project. According to the announcement, Princeton and Google will identify one million works in the public domain for digitization. This follows the January 19th announcement that the University of Texas libraries, the fifth largest library in the U.S., also climbed on board the Library Project. Very quietly, the number of major research libraries participating in the project has more than doubled to twelve in the last two years. The seven new libraries will add millions of printed items to the tens of millions already held by the original five, and more fuel to the legal fire surrounding Google’s plan to scan library holdings and make the full texts searchable on the web.
The public discussion has been mostly one-sided, with Google supporters trying to hold the high moral ground. Their basic argument goes something like this: The universe of published works in the U.S. consists of some 32 million books. They argue that while 80 percent of these books were published after 1923, and, therefore, potentially protected by copyright, only 3 million of them are still in-print and available for sale. As a result, mountains of books have been unnecessarily consigned to obscurity.
No one has yet challenged the basic assumptions supporting this argument. Perhaps they’ve been scared off by Google’s reputation for creating clever algorithms that “organize the world’s information”. This one, though, doesn’t stand up to serious scrutiny.
The figures used by supporters of the Library Project come from a 2005 study undertaken by the Online Computer Library Center (OCLC), the largest consortium of libraries in the U.S. According to the OCLC study, its 20,000 member libraries hold 31,923,000 print books; the original five research libraries participating in the Google library scanning project hold over 18 million.
OCLC did not actually count physical books. They searched their massive database of one billion library holdings and isolated 55 million catalog records describing “language-based monographs”. This was further refined (eliminating duplicates) to 32 million “unique manifestations”, not including government publications, theses and dissertations. The reality of library classification, however, is such that “monographs” often include things like pamphlets, unbound documents, reports, manuals, and ephemera that we don’t usually think of as commercially published books.
The notion that 32 million U.S.-published books languish on library shelves is absurd. Just do the math. That works out to more than 80,000 new books published every year since the first English settlement in Jamestown in 1607. Historical book production figures clearly show that the 80,000 threshold was not crossed until the 1980s, after hovering around 10,000 for the fifty years from 1910 to 1958. The OCLC study showed, moreover, that member libraries added a staggering 17 million items (half of all print collections) since 1980. That averages out to 680,000 new print items acquired every year for 25 years, or more than the combined national outputs of the U.S., U.K., China, and Japan in 2004.
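The "just do the math" checks in the paragraph above are easy to reproduce (a sketch using only figures quoted in the post):

```python
# 32 million books spread over the 400 years since Jamestown (1607-2007)
total_books = 32_000_000
years_since_jamestown = 2007 - 1607
print(total_books / years_since_jamestown)   # 80000.0 books per year

# 17 million items added by member libraries over 25 years (1980-2005)
items_since_1980 = 17_000_000
print(items_since_1980 / 25)                 # 680000.0 items per year
```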
Not only will Google have to sift through printed collections to identify books, and then determine if they are in the public domain, but they will also have to separate out those published in the U.S. (assuming that their priority is scanning U.S.-based English-language books) from the sea of books published elsewhere. The OCLC study clearly showed that most printed materials held by U.S. libraries were not published in the U.S. The study counted more than 400 languages system-wide, and more than 3 million print materials published in French and German alone in the original Google Five. English-language print materials accounted for only 52% of holdings system-wide, and 49% in the Google Five. Since more than a few works were probably published in the United Kingdom, the total number of English-language books published in the U.S. will constitute less than half of all print collections, both system-wide and in Google libraries.
So how many U.S.-published books are there in our libraries? Annual book production figures show that some 4 million books have been published in the 125 years since figures were regularly compiled in 1880. If, very conservatively, we add an additional 1.5 million books to cover the pre-1880 years, and another 1.5 million to cover books published after 1880 that might have been missed, we get a much more realistic total of 7 million.
Using the lower baseline for published books tells a very different story than the dark one (that the universe of books consists of works that are out of print, in the public domain, or “orphaned” in copyright limbo) told by Google and their supporters. With some 3 million U.S. books in print, the inconvenient truth here is that 40% of all books ever published in the U.S. could still be protected by copyright. That would appear to jibe with the OCLC finding that 75% of print items held by U.S. libraries were published after 1945, and 50% after 1974.
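The underlying ratio is simple (a sketch; both figures come from the post):

```python
# ~3 million U.S. books in print against a realistic baseline of
# ~7 million U.S. books ever published.
ever_published = 7_000_000
in_print = 3_000_000
print(f"{in_print / ever_published:.0%} still in print")  # 43%, i.e. roughly the 40% cited
```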
If we’re going to have a debate that may end up rewriting copyright law, let’s have one based on facts, not wishful thinking.
Andrew Grabois is a consultant to the publishing industry. He has compiled U.S. book production statistics since 1999. He can be reached at the following email address: agrabois@yahoo.com
Clarification update from Andrew: My post is not intended to be a criticism of the OCLC study ("Anatomy of Aggregate Collections: The Example of Google Print for Libraries") by Brian Lavoie et al, which is a valuable and timely look at print collections held by OCLC member libraries. What I am attempting to do here is point out how friends of the Google library project have misinterpreted the paper and cherry-picked findings and conclusions out of context to support their arguments.
Related articles:
Google Book Project (3/6/07)
Qualified Metadata (2/22/07)