The Gilbane Group Survey for Book Publishing Professionals: Take it Today!
The Gilbane Group Web-based survey of book publishing professionals is now active! This survey is one of the research mechanisms for our upcoming study, "A Blueprint for Book Publishing Transformation: Seven Essential Processes to Re-Invent Publishing." The study will be published in June 2010, and all participants in this survey will have access to the full-length study posted on The Gilbane Group website.
This survey, which will take most participants between 10 and 15 minutes to complete, seeks to gain a clearer picture of ebook and related digital publishing efforts underway across the full spectrum of book publishers. The analyst team at The Gilbane Group also seeks to identify the "pain points" or barriers encountered by book publishers in developing or expanding digital publishing programs, including areas such as royalties, digital format choices, and distribution problems.
Wednesday, April 28, 2010
Tuesday, April 27, 2010
BISG's Making Information Pay is a publishing industry conference for senior executives in operations, sales and marketing who are responsible for increasing sales and reducing costs. For the past six years, book industry professionals have attended Making Information Pay to learn the best practices driving the success of industry leaders today. Mike Shatzkin has a full rundown of the presentations here.
Register today to discover the technological "points of no return" currently facing our industry and identify the new technologies dictating advanced ways in which books -- both digital and physical -- are being acquired, produced, distributed, marketed and sold.
Sunday, April 25, 2010
For this reason, their technology expertise notwithstanding, it is not a given that the professional publishing companies will dominate the media landscape. The organizational foundations of professional publishing companies are expensive to support. Their cost structures require that information also be expensive. However, Thomson Reuters, Reed Elsevier, and Bloomberg have no monopoly on the technology, and the technology to create new information and news delivery platforms is not expensive. In this respect, the professional publishers have returned to being merely ordinary companies. In the future, they will compete with new rivals without the structural advantages that have until now fueled their growth and reinforced their market position. This is good news for the American business community, and for the future development of a vital, healthy, and prosperous news and information industry in the United States.

The Financial Post (Canada) provides a friendly venue for an interview with Heather Reisman (FP):
Q: Now that Amazon will be allowed to have its own distribution centre here, do you believe Ottawa should ease foreign ownership legislation in a way that would allow you to enter into international partnerships?

A: Amazon had a full-on distribution centre here before, but it was run by Canada Post. The change is they are going to run it themselves. In my opinion, once the government [in 2002] allowed Amazon to operate 100% as a major bookseller in Canada with no Canadian ownership, they were de facto saying that they believe in this day and age that you do not have to be Canadian to own a book-retailing company [in Canada].

Q: Do you intend to take that issue up with Industry Canada? How is it that Kobo can have international partners with a substantial stake?

A: I don't have any reason to take it up because I am not looking at selling the Indigo business. Our Kobo business is a global business. But I think the government realizes that you cannot put legislation on a digital business - what are you going to do? You just can't.

Jason Pinter in the HuffPo (again) discusses his frustration over reports that men don't read (HuffPo):
Why do I bring this up? Because if you've worked in publishing, you've heard the tired old maxim: Men Don't Read. Try to acquire or sell a book aimed predominantly at men, and odds are you'll be told Men Don't Read. This story is not an isolated incident, but merely a microcosm of a huge problem within the industry. If you keep telling yourself something, regardless of its validity, eventually you'll begin to believe it. So because publishers rarely publish for men and don't market towards men, somehow that equates to our entire gender having given up on reading books. THIS MUST END.
This NPR piece three years ago came to the conclusion that women read more fiction than men by a 4-1 margin. Articles like this madden me because I think they miss the big picture, or perhaps are even ignoring it purposefully. It's like discussing global warming, while completely ignoring the fact that hey, maybe we have something to do with it.
In my opinion, this empty mantra has begotten a vicious cycle. I was hesitant to write this article, mainly because in no way do I want to be perceived as diminishing the talents of many, many brilliant women in publishing. That is not the aim of this piece, nor is it my opinion in any way. This is a critique of the system, not those who work within it.
The comments are quite good as well. (I see a 't-shirt': I'm A Man and I Read! Could even add some swear words in there as well).
Xplana (Missouri Book Service) released a report looking at the potential for digital textbook sales over the next five years:
While digital textbook sales currently represent a small portion of the overall textbook market – approximately 0.5% – year-over-year increases show strong and steady growth.
- CourseSmart, a joint venture of five large college textbook publishers, reported a 400% increase in sales in 2009 over the year before;
- MBS Direct, representing 900 client institutions and 34 academic publishers, showed increases in digital textbook sales of more than 100% in 2009;
- Interviews with representatives from leading textbook publishers reveal year-over-year increases of between 80% and 100% for the past three years, with sales growth in 2009 topping 100%;
- According to “On Campus Research Student Watch 2010,” a 16,000-student survey released by the National Association of College Stores in fall of 2009, about 42 percent of students have either purchased or at least seen an e-textbook. That’s an increase of 24 percentage points from 2007.
These growth numbers for digital textbooks are also consistent with the increase in the general digital trade book market. The International Digital Publishing Forum (IDPF) reported that in 2009, e-books accounted for 3.31% of all trade book sales in the U.S., up from only 1.19% in 2008. Wholesale e-book sales for February 2010 were $28,900,000, a 339.3% increase over February 2009 ($6,600,000). Calendar year-to-date sales are up 292.2% from 2009.
From the twitter:
Alan Sillitoe, author of Saturday Night and Sunday Morning and The Loneliness of the Long Distance Runner, dies at 82 (Guardian)
Stones in Exile documentary: The Stones and the true story of Exile on Main St (Observer, Ars Technica)
Thursday, April 22, 2010
Following is that further analysis. In this report, I have estimated the market opportunity that the Google Book database could represent. I have organized my review around which customers are likely to purchase the product and to what degree, and I also explore how Google might go about selling and marketing it. I have excerpted the management summary section below, and the full report is available in pdf here. (It is approximately 20 pages.)
Anyone interested in discussing this report in more detail is encouraged to set up a conference call or meeting with me; I can review my methodology in more detail and explore the options available to Google as they roll out this product.
Almost five years ago, Google embarked on the most ambitious library development project ever conceived: To create a “Noah’s Ark” of every book ever published and to start by digitizing books held by a rarefied group of five major academic libraries. The immediate response from US publishers was muted, until the implications of the project became clear: That Google proposed no boundaries to the digitization effort and initiated the scanning of books both in and out of copyright and in and out of print. Adding to publishers’ concerns, Google planned to display “snippets” (small selections) of the book’s content in search results. Despite some hurried conversations among publishers, author groups and Google, Google remained convinced that what they were doing represented a social ‘good’ and the partial display of the scanned books was legally within the boundaries of fair use.
From the publisher perspective, this was a make-or-break moment, and the implications were more acutely felt by trade publishers, who saw the potential for their business models to be obliterated by easy and ready access to high-quality content via a Google search over which they would exert little or no control. Even worse was the fear that rampant piracy of content would also develop - a debated and contentious point - given the easy access to a digitized version of a work that could be e-mailed or printed at will. The publishers determined that if Google were to ‘get away with it’ without challenge, then anyone would be able to digitize publisher content and possibly replicate what has been going on in the music and motion picture industries for almost ten years. In mid-2005, prompted by a lawsuit filed by The Authors Guild, the Association of American Publishers (AAP), led by four primary publishers, filed suit against Google in an effort to halt the scanning of in-copyright materials. (The Authors Guild and AAP ultimately combined their filings.)
The initial Google Book Settlement (GBS) agreement, given preliminary approval by a court in October 2008, generated a vast amount of argument both in support of the agreement and in challenge to it. A revised agreement was drafted after the Federal District Court of Southern New York and Judge Chin agreed to delay adjudication; final arguments were heard in late February 2010. To date, Judge Chin has given neither a timetable nor an indication of when and how he will decide the case.
From the perspective of the early leading library participants, Google’s arrival and promise to digitize their purposefully conserved print collections looked like a miracle. Faced with forced declines in the dollars spent on monographs and the ever-rising expense of maintaining over 100 years of print archives, the Google digitization program provided a possible solution to many problems. All libraries believe they hold a social covenant to collect, maintain and preserve the most relevant materials of interest to their communities, but maintaining that covenant becomes a challenge in an environment of increasing expenses while also enduring the challenges of migrating to an on-line world (1).
The library world is typically segmented into public and academic institutions, and while these varied ‘communities’ may differ in their philosophy towards, for example, collection development or preservation, they do share some common practices. Most importantly, all libraries are committed to resource sharing: while materials use has historically been primarily ‘local’ to the library, every institution wants to make its collections available to virtually any patron or institution who requests them. In short, these library collections were always ‘accessible’ to all regardless of geography or copyright. First US Mail, then FedEx, e-mail and the Internet progressively made this sharing easier but, until Google arrived with its digitization program, any sharing beyond the local institution was via physical distribution (2). In effect, it could be argued that the Google scanning program simply makes an existing practice vastly more efficient.
Even though the approval of the Google Book Settlement (GBS) hangs in the balance under review by Judge Chin of the Federal District Court of Southern New York, an Executive Director has been named to head the Book Rights Registry (BRR) (3) and is preparing the groundwork to establish the organization in advance of approval. This report represents an attempt to analyze the market opportunity for Google as it seeks to exploit the Google Book Settlement.
Following are our summary findings which are discussed in more detail in the ensuing pages of this report.
Summary Findings of the Report:
- Libraries will see tremendous advantages – both immediate and over time - from the GBS, although concerns have been voiced (notably from Robert Darnton of Harvard)(4)
- Google’s annual subscription revenue for licensing to libraries could approach $260mm by year three of launch
- Over time, publishers (and content owners) will recognize the GBS service as an effective way to reach the library community and are likely to add titles to the service(5)
- Google will add services and may open the platform for other application providers to enhance and broaden the user experience
- The manner in which the GBS deals with orphan works will provide a roadmap for other communities of ‘orphans’ in photography, arts, and similar content and intellectual property
(2) Resource sharing and improvements in the ‘logistics’ provided by OCLC (WorldCat) or via consortia such as OhioLink have made physical distribution effective and comparatively efficient.
(3) The BRR is the management body tasked with administering the GBS and representing the interests of authors and publishers once approval has been granted by the court.
(4) Robert Darnton, NY Review of Books
(5) The settlement doesn’t provide for adding content prior to 1/5/09; however, we are suggesting that, by mutual consent, additional published content may be added as an expedient method of reaching the library market.
A Database of Riches - Michael Cairns
Wednesday, April 21, 2010
From the article:
The OBO tool is essentially a straightforward, hyperlinked collection of professionally-produced, peer-reviewed bibliographies in different subject areas—sort of a giant, interactive syllabus put together by OUP and teams of scholars in different disciplines. Users can drill down to a specific bibliographic entry, which contains some descriptive text and a list of references that link to either Google Books or to a subscribing library's own catalog entries, by either browsing or searching. Each entry is written by a scholar working in the relevant field and vetted by a peer review process. The idea is to alleviate the twin problems of Google-induced data overload, on the one hand, and Wikipedia-driven GIGO (garbage in, garbage out), on the other.
"We did about 18 months of pretty intensive research with scholars and students and librarians to explore how their research practices were changing with the proliferation of online sources," Damon Zucca, OUP’s Executive Editor, Reference, told Ars. "The one thing we heard over and over again is that people were drowning in scholarly information, and drowning in information in general. So it takes twice as much time for people to begin their research."
To trust OBO's content, you have to trust its selection and vetting process. To that end, OUP is making the list of contributing scholars and editors freely available. Each subject area has an Editor in Chief who's a top scholar in the field, and an editorial board of around 15 to 20 scholars. The EIC and editorial board either write the bibliographic entries themselves, or they select other scholars to do the work.
Tuesday, April 20, 2010
Springer Science+Business Media has expanded its service on the website www.AuthorMapper.com, a free analytical online tool for discerning trends, patterns and subject experts within scientific research. AuthorMapper was launched a year ago offering searchable content from all Springer and BioMed Central journals; the platform now offers eBook data as well. Currently the tool can retrieve useful information across all disciplines from more than three million journal articles and more than 742,000 book chapters drawn from 29,000 eBooks. Adding eBook data brings to books the same analytical benefits the tool has already demonstrated with journal data.
The AuthorMapper tool can provide a variety of analyses, such as keyword tag clouds and "Top 5" bar charts for various important metrics, and includes an interactive world map of the results. AuthorMapper.com’s advanced search function also allows complex queries using keyword, discipline, institution, journal, publisher and author. The results can identify new and historic scientific trends through timeline graphs and bar charts of top statistics, allowing for identification of trends in the literature, discovery of wider scientific relationships, and locating other experts in a field of study.
The trend timeline graph, for example, allows authors to see whether their area of expertise is growing or has already peaked. Users who are only interested in open access content can restrict their searches accordingly, and all search results provide link-outs to content on SpringerLink.

For graduates, post-docs and emerging researchers, AuthorMapper.com shows which institutions are the most prolific in specific research areas and allows for their comparison. AuthorMapper.com can even be useful for members of the general public seeking to identify experts, for example medical specialists, working close to where they are located.
Monday, April 19, 2010
Media Week (Vol 3) 16: Mad Women, Nigella, Kipling, Review Scams, Elsevier's Peer Review, Higher Ed Retention
The violent and feral Bertha Rochester in Jane Eyre, the mysterious Woman in White whose escape from an asylum begins Wilkie Collins's gripping thriller, and the terminally delusional Emma in Madame Bovary.

But were they really mad? Would we today recognise them as mentally ill or were our heroines merely misunderstood, not to mention a tad inconvenient?

For a Radio 4 documentary, Madwomen in the Attic, medical historians, psychiatrists and literary specialists gave their diagnoses of our troubled heroines.

Nigella Lawson is the latest TV chef to release an iPhone cooking app, Nigella's Quick Collection from Random House, which contains 70 recipes, along with videos. The app costs £5 and is reviewed by The Times:

The first generation of cooking apps has been big on numbers (America’s Big Oven boasts of having 170,000 recipes on its database) but not so great on design, but Jamie Oliver’s 20-minute meals app, launched last year, set a new benchmark. With its sexy graphics, slick videos and cheeky chat, it’s not just a list of recipes with synchronised shopping lists but a way of aligning ourselves with a brand.

From this week, we can cosy up to Nigella in our pockets too. The Nigella Quick Collection contains 60 of her speediest, easiest recipes. Here, we exclusively reveal five of the dishes that will have you rustling up supper in no time.

Kipling's Jungle Book is to appear as a new animated series. I'm not sure about the 52 episodes however, since it's hard enough to keep track of The Old House. The Times:

Mr Andrew said: “The world of the jungle is looking glorious in the series and will reintroduce this brand to a generation who might not know this fabulous story”.

Others were not so sure, however. Sharad Keskar, chairman of the Kipling Society, a registered charity that guards the author’s legacy, said it was doubtful that the new series would be faithful to his book.

He said: “We’re used to this kind of thing. The poor man has often been maligned. The Disney one just wasn’t Kipling, it was amusing and light. Although The Jungle Book is ostensibly written for children, it is quite a scholarly book.

“I don’t think anyone is strongly against these adaptations, but there is general light-hearted disapproval. The text isn’t really represented properly,” he added. Tapaas Chakravarti, chief executive of DQ Entertainment, said: “We are thrilled that Mowgli and all these much-loved characters will be returning to the UK screens in the near future.

“Considerable time and effort has been given to produce an animation series worthy of the rich heritage The Jungle Book represents.”

Several authors with too much time on their hands get into bother over online reviews. This isn't the first time this has happened, but why are leading academics so dumb? Telegraph:

The row has sent shock waves through the normally genteel world of academia as claim and counter-claim have been circulated by email to other top writers.

Prof Service, a biographer of Lenin, Stalin and Trotsky; Kate Summerscale, author of The Suspicions of Mr Whicher; and Dr Polonsky were the three writers targeted by Dr Palmer's distinctly unfavourable 'customer reviews'.

Questions were first raised by Dr Polonsky after she read comments on her latest work, Molotov's Magic Lantern, on Amazon's UK site.

Higher Ed looks at how retention can be improved for distance education programs (HigherEd):

A growing body of research has all but obliterated the notion that distance education is inherently less effective than classroom education. But even the most ardent distance-ed evangelists cannot deny persistent evidence suggesting that students are more likely to drop out of online programs than traditional ones. The phenomenon has many explanations, not least the fact that what often makes students choose the flexibility of online learning -- being too busy to enroll in a classroom course -- can also make it harder for them to keep up with their studies.

But Hersh believes there is another major factor driving the gap between retention rates in face-to-face programs and those in the rapidly growing world of distance education: the lack of a human touch.

And unlike the reality of adult students’ busy lives, Hersh says the human-touch problem can be solved. In fact, he thinks he knows how.

Hersh’s solution is to incorporate more video and audio components into the course-delivery mechanism. Most professors who teach online already incorporate short video and audio clips into their courses, according to a 2009 survey by the Campus Computing Project. But it is rarer, Hersh says, for professors to use video of themselves to teach or interact with their online students -- largely because the purveyors of major learning management systems do not orient their platforms to feature that method of delivery.

A new process for peer review is presented by Elsevier (FT):

Indeed, the reliability of peer review is increasingly in doubt. A paper in Nature in 2006 suggested that it was impossible for peer reviewers to detect all fraudulent, falsified or plagiarised research. The writer also noted that feedback from reviewers can be unhelpful, and that ideas from rejected papers can be stolen by editors and used for their own purposes. Many such researchers seem to keep quiet for fear of ruining their prospects by complaining.

Meanwhile, scientific integrity has been called into question on a broader scale. We have had the recent scandal about e-mails between researchers on climate change that appeared to suggest they wanted to alter results. And in the US, an editor of an orthopaedic journal earned $20m in royalties from an implant manufacturer who received favourable press.

From the twitter:
InfoToday's review of the revised WorldCat Record Use Policy changes at OCLC: http://bit.ly/aT62Ud. Also, LJ follows up with some of the comments that have been generated by the community since the policy was released (LJ):
On that official feedback page and elsewhere, many in the bibliographic community agree that the revised policy features much improved language and clarity (though some have questioned whether that clarity breaks down when it comes to specific use cases).
Jennifer Younger, President-Elect, OCLC Global Council and Edward H. Arnold Director of Libraries, University of Notre Dame (and Record Use Policy Council's co-chair), told LJ the Council has been pleased with the commentary they've seen so far "through the community forum on the policy, through individual blogs, tweets, Webinars and e-mails directed to the Council and to individual Council members. We are hearing from a wide range of constituencies and partners in the library community, and we encourage continuing input. Steady input from the community has enabled us to continue to expand the FAQ."
The proposed policy will be discussed with the Global Council next week, she said, adding that the Records Council "will be meeting through mid-May to address questions, concerns and comments that have been coming to us since we posted the policy draft last week.”
The Illustrated London News goes online today. http://bit.ly/d7JQQ9
And in sports, there is slim but slightly more hope this week for MU and the championship. Also the Australian Open 1976 (Flickr).
Friday, April 16, 2010
Well, Elizabeth Barrett Browning may not have put it quite that way, but she might have done if her annual bonus depended on it.
Moves are afoot to revise the way publishing industry stats are computed but, as we all recognize, the industry is changing, so we should be anticipating new benchmarks and methodologies for calculating success in tomorrow’s publishing industry.
In years past, some publishing executives’ annual compensation was partially dependent on how many best sellers they published and the level of sales those titles achieved, or on how their revenues and expenses compared with their competitors’. Historically, those calculations would have been straightforward - just add up the best sellers in the New York Times, or take a look at the annual AAP statistics. It didn’t matter that the “Times methodology” was later called into question because, by then, we had Bookscan, and the industry continued to use the AAP numbers even as the industry became more complex. With all the standard measures in use, there was always “leakage” and, just like that above-ground pool that loses a little more water with each passing season, book industry sales have spread across a wider array of outlets which have not been computed in the industry numbers. Moreover, self-reporting (a component of the AAP’s reports) was also spotty and/or inconsistent as the business grew in complexity. Add the increasing prevalence of used book sales in major retail channels, and defining the real level of sales for today’s publishing industry is very difficult indeed. And, of course, some companies refuse to report at all.
On a discussion list I belong to, a minor scuffle erupted recently over defining the ‘real’ level of publisher revenues for the industry. (See my recent Frankfurt Supply Chain presentation.) Qualifications abound regarding segmentation, used book sales, front list/back list, consumer versus wholesaler - you name it. It is very difficult to pinpoint the real number. To address this issue, BISG will evaluate and define a new methodology for tracking publishing industry sales numbers for Book Industry TRENDS, and the first results will be published in June. This is a laudable project that should be completed, and I think everyone in the industry will be looking forward to hearing about and analyzing these reports when they are published. (AAP continues to publish its own set of data for the industry.)
Attempting to calculate today’s performance metrics will be simple compared to our collective future, which is likely to be far more complex and confusing if we don’t get in front of the issue. We will also need to cooperate if - as we all like to do - we wish to generalize about the size of our market, measures of success and whether one type of content could be considered the “best of” anything. Recently, there have been some harbingers of how complex the future may be. For example, Overdrive released a slew of data indicating how rapidly eBook downloads are growing, and Fictionwise says it has 'served 2 trillion words'. Stats like these are quickly becoming markers in our conscious view of publishing success. How soon will it be before we casually mention that so-and-so had 100,000s of downloads rather than (or in addition to) retail sales? But those references to downloads, pageviews, comments, searches, hits, subscribers (and on and on) will not mean enough unless we, as an industry, have some mechanism of comparison or some degree of standardization in how we reference these data points.
The publishing industry is a relatively small media segment, but the boundaries increasingly blur. To get some indication of how complex the measurements may become, you only have to hear some of the numbers thrown out at this year’s Consumer Electronics Show in Las Vegas. Sony announced that the Playstation network has 17 million registered users and added 2.1 million accounts in December; games on the network have sold in the millions, and users number in the 100,000s for each game. ABC.com stated that it delivered, via its episode player, 500 million episodes and 1 billion ad views (Paidcontent). Hulu.com, the site for NBC and Newscorp content, receives 3.8 million visits per day (24 million uniques in October), and users are streaming over 63 million videos per month. These numbers gloss over the fundamental change in how media counts itself and, as the change in publishing accelerates, it will no longer be enough to count books sold via a cash register.
We will need a wholesale revision of our thinking and our perspective if we are to retain any semblance of cohesive, representative reporting as we move into the 21st century.
Thursday, April 15, 2010
At the heart of Armstrong’s stance is the fundamental belief that, despite the transition from analog to digital culture, the foundation of copyright hasn’t changed. It has only created a greater and more urgent need for expeditious means of licensing the material - or clearing copyright.

Topically, The Economist takes us on a historical tour of copyright law and also makes an argument for radical change (Economist):
“I agree with the statement that everyone is now a publisher,” she says, “and what that means is a tremendous proliferation of material that is copyrighted and can be licensed.” She describes this as the “atomization” of content - books being offered as individual chapters and paragraphs, computer software being parsed into individual lines of code - a phenomenon that is causing exponential growth in the number of “granular” elements that are available to be licensed.
Of course, she adds, “the market is not infinitely elastic” - and notes that there is plenty of information that will be offered for free, or will have to be.
The notion that lengthening copyright increases creativity is questionable, however. Authors and artists do not generally consult the statute books before deciding whether or not to pick up pen or paintbrush. And overlong copyrights often limit, rather than encourage, a work’s dissemination, impact and influence. It can be difficult to locate copyright holders to obtain the rights to reuse old material. As a result, much content ends up in legal limbo (and in the case of old movies and sound recordings, is left to deteriorate—copying them in order to preserve them may constitute an act of infringement). The penalties even for inadvertent infringement are so punishing that creators routinely have to self-censor their work. Nor does the advent of digital technology strengthen the case for extending the period of protection. Copyright protection is needed partly to cover the costs of creating and distributing works in physical form. Digital technology slashes such costs, and thus reduces the argument for protection.
A return to the 28-year copyrights of the Statute of Anne would be in many ways arbitrary, but not unreasonable. If there is a case for longer terms, they should be on a renewal basis, so that content is not locked up automatically.
Forecasting the demise of the paper book is, to some, as straightforward as it was to those who forecast the demise of the CD. Yet reality will be more complicated. While music publishers were slow to react and took more than their fair share of missteps, change and adapt they did and, as a result, they've begun to exert a little direct influence on a market that was in free fall. The physical CD isn't 'back', but the format may now be a managed item in a portfolio of options available to music purchasers. As a result, the CD may have a long life yet.
Many believe the physical book will disappear within the next ten years, yet the example of the music CD suggests the future of the book may be more nuanced. The availability of electronic versions of trade content will approach 100% in less than ten years: in my view, five years for all but the smallest publishers is more likely. Despite this availability, however, electronic content is likely to represent only one of a number of ways consumers will engage with book content. Whether that percentage is 25% or 50% matters less than how publishers will manage the process. Book publishers can avoid -- and are avoiding -- many of the mistakes that music publishers made when they were effectively out of control. Book publishers can 'skip ahead' to the point where they proactively manage the further development of print - as music publishers are now doing with the CD - and, in doing so, publishers will buy time as they adapt to the changes in their business brought about by the migration to electronic content. Rather than disappear, the lowly print book may retain a position of wide distribution (not universal) and become the focal point of a facilitated interaction with numerous content acquisition options for consumers. Maybe the book has stronger legs than suggested.
FULL LINE-UP OF 12 NEW SPEAKERS ANNOUNCED
The next instalment in the “canon tales” series, which sees speakers from across the book industry showcase their stories and projects with rapid visuals, is taking full shape, with the line-up for the fourth “chapter” set to go at London’s Free Word Centre on April 22nd.
James Bridle (Publisher, jamesbridle.com)
Dylan Calder (Director, StarLit festival)
Tram-Anh Din (Paperbacks Editor,
David Godwin (Managing Director, David Godwin Associates)
Ben Hammersley (Budding; Editor at Large, WIRED)
Ramy Habeeb (Director, co-founder Kotobarabia)
Iain Millar (Marketing Manager, Quercus)
Stefanie Posavec (Cover Designer, Penguin and itsbeenreal)
Sophie Rochester (Content Editor, Man Booker Prize)
Ross Sutherland (Poet, Aisle 16)
Kate Wilson (Managing Director, Nosy Crow)
Emma Young (To Hell With Publishing)
There may also be an as-yet-unnamed special guest who could take the stage on the night…
The canon tales series has, in only three events, had an illustrious and energetic range of speakers who have offered an entertaining perspective on their creativity.
Free Word Centre,
Registration at www.thesyp.org.uk/canontales
Doors at 6pm, first speaker 7pm
For more information, contact:
Jon Slack firstname.lastname@example.org, @jonslack
Doug Wallace email@example.com , @twittizenkane
Wednesday, April 14, 2010
Sunday, April 11, 2010
Like many other parts of the media industry, publishing is being radically reshaped by the growth of the internet. Online retailers are already among the biggest distributors of books. Now e-books threaten to undermine sales of the old-fashioned kind. In response, publishers are trying to shore up their conventional business while preparing for a future in which e-books will represent a much bigger chunk of sales.

In the second article the newspaper comments on "The endangered bookstore" and suggests that the sickest part of the book business is the store that supplies them:
Quite how big is the subject of much debate. PricewaterhouseCoopers, a consultancy, reckons e-books will represent about 6% of consumer book sales in North America by 2013, up from 1.5% last year (see chart). Carolyn Reidy, the boss of Simon & Schuster, another big publisher, thinks they could account for 25% of the industry’s sales in America within three to five years.
Indeed, many publishing executives like to argue that the digital revolution could usher in a golden age of reading in which many more people will be exposed to digital texts. They also point out that new technologies such as print on demand, which makes printing short runs of physical books more economical, should help them squeeze more money out of the old-fashioned format. And they insist that the shift away from printed books will be slow, giving them more time to adapt to the brave new digital world.
Perhaps. But there are still plenty of inefficiencies in the supply chain for conventional books that firms such as Amazon and Apple can exploit. Many publishers, for example, still take far too long to get books to market in print or electronic form, missing valuable opportunities. Ms Reidy at Simon & Schuster says she has brought functions such as typesetting in-house to boost efficiency. At Sourcebooks responsibility for making books has even been shifted from the editorial team to the firm’s head of technology, underlining the need to think digitally right from the start of the commissioning process.
Will bookshops disappear completely, as music shops seem to be doing? Most are pinning their hopes on giving people more reasons to come inside. “Consumers will need some entity to help them make sense of the morass,” says William Lynch, the new boss of Barnes & Noble, which plans to put a renewed emphasis on service, including advice on e-books. Many shops have started to offer free internet access to keep customers there longer and to enable them to download e-books. Other survival strategies include hosting book clubs or other community groups and selling a wider variety of goods, such as wrapping paper, jewellery, cards and toys.

In the same issue (obviously an unexplained abundance of attention toward publishing), the paper also takes a look at how the recent economic downturn is changing microeconomics and, therefore, microeconomics textbooks. What they don't point out is how quickly these revisions could be made if the books were subject to electronic updates. In fact, the subject could have served as a casebook example of how the inefficiencies in the development and production of publishing products undercut some of the opportunities publishers have to address variable business conditions. No matter. From the article:
Independent bookshops face a particularly grave threat, because they are unable to match bigger rivals’ prices. Many are branching out by offering new services, such as creative-writing classes. BookPeople, a bookshop in Austin, Texas, runs a literary summer camp for around 450 children. Steve Bercu, the shop’s co-owner, says that independent booksellers can still thrive, provided they “reinvent themselves”.
Revised textbooks will soon find their way into bookshops. Charles Jones of Stanford University has put out an update of his textbook with two new chapters designed to help students think through the crisis, and is now working on incorporating these ideas into the body of the book. A new edition of Mr Mankiw’s book should be out in about a year. And Mr Blinder’s publishers aim to have his revised text on sale by June.

In the UK -- the home of the celebrity "bio" -- there is a new segment of publishing that is doing well: books about celebrity pets. (I wonder if there's one in Charlie?) Independent
Courses in many leading universities are already being amended. Mr Laibson says he has chosen to teach his course without leaning on any standard texts. Francesco Giavazzi of the Massachusetts Institute of Technology is now devoting about two-fifths of the semester’s classes to talking about how things are different during a crisis, and how the effects of policy differ when the economy hits boundaries like zero interest rates. Discussion of the “liquidity trap”, in which standard easing of monetary policy may cease to have any effect, had fallen out of vogue in undergraduate courses but seems to be back with a vengeance. Asset-price bubbles are also gaining more prominence.
Ever since James Lever earned a Booker Prize nomination for the spoof life story Me Cheeta, which was written from the perspective of an ageing silver-screen chimpanzee who starred in Hollywood's Tarzan films, a spate of fake confessionals has followed. They each simultaneously look askance at celebrity culture, while benefiting from the public's appetite for it. Lever's novel has sold more than 50,000 copies since its publication last year. Shortly after it came another spoof memoir. Bubbles: My Secret Diary, From Swaziland to Neverland is a variation on Lever's theme, and is based on the eventful life story of Michael Jackson's pet chimpanzee, organised as a collection of "very personal and honest entries from Bubbles' diary". The book sparked a bidding war in America and Australia, and its publisher John Blake suggested its contents would shine a light on a troubled mind – Bubbles' that is, not Jackson's.

On second thought, I don't want to see a tell-all pet book about the PND home front. Could cause some problems.
OCLC and JISC have collaborated on a report that synthesizes several reports on "The Digital Information Seeker" (JISC)
The Digital Information Seeker: Report of findings from selected OCLC, RIN and JISC user behaviour projects

From the twitter (@personanondata)
There are numerous user studies published in the literature and available on the web. There are studies that specifically address the behaviours of scholars while others identify the behaviours of the general public. Some studies address the information-seeking behaviours of scholars within specific disciplines while others identify the behaviours of scholars of multiple disciplines. There are studies that only address undergraduate, graduate, or post graduate students or compare these individual groups’ information-seeking behaviours to those of scholars. Still other studies address the behaviors of young adults (Screenagers (Rushkoff 1996) and Millennials).
In the interest of analyzing and synthesizing several user behaviour studies conducted in the US and the UK twelve studies were identified. These twelve selected studies were commissioned and/or supported by non- profit organizations and government agencies; therefore, they have little dependence upon the outcomes of the studies. The studies were reviewed by two researchers who analyzed the findings, compared their analyses, and identified the overlapping and contradictory findings. This report is not intended to be the definitive work on user behaviour studies, but rather to provide a synthesized document to make it easier for information professionals to better understand the information-seeking behaviours of the libraries’ intended users and to review the issues associated with the development of information services and systems that will best meet these users’ needs.
Observer: Profile of novelist David Mitchell: The magician of modern fiction

ManUtd's season looks over after a flaccid performance in Germany and a less than United-like loss to Chelski. Well done Phil.
The Age: The ghostwriter who turned to crime fiction Australian crime writer Michael Robotham.
The Observer Lorrie Moore talks about A Gate at the Stairs
NYT: The Godfather of the E-Reader: Bob Brown: “a bloody revolution of the word.”
Telegraph: Wuthering Heights quadruple double thanks to Twilight effect
Library Journal OCLC Proposes New WorldCat Records Policy, Revamping Content and Approach
NYT: Visual Artists to Sue Google Over Vast Library Project
Inside HEd: New Battleground for Publishers: Online tools add to students' ability to learn.
Friday, April 09, 2010
Is it time to revise the manner in which the publishing industry establishes its standards? The pace at which the industry is moving suggests that the model of serial committee meetings staffed by overworked volunteers may no longer be an optimal solution.
A 'standard' will establish itself in a vacuum, and I believe the RFID situation in the library community is just one example. In the absence of a universal approach to RFID tagging in the publishing and library communities, we now have several vendor-specific 'standards' that negate some or all of the benefits of the technology itself. Time to deliberate and debate ad nauseam is a luxury we can't afford when digital content and transaction models are changing rapidly, so I was interested to see the following comment from BISG regarding digital content:
The committee will work to find solutions that will benefit the entire book industry – publishers, retailers, search engines, authors, wholesalers and distributors – by improving the process by which online book content reaches consumers. To expedite standards development at a time when the book industry is moving rapidly forward, the Committee will start its work using a briefing paper, requirements, and draft specification that were developed within the Association of American Publishers (AAP) to serve as frameworks for further work.

It will be interesting to see how this develops; however, just making the old system work faster may not be enough. An alternative approach would be to establish a forward-thinking (anticipatory) approach to new standards development. Importantly, a small 'reconnaissance' team that sits permanently could identify new standards needs and establish a minimalist framework for each new standard. This framework could include the identification of fewer than 10 data elements, with definitions, that would immediately enable standardization at a very basic level. This group would generate standards projects based on submissions from the community as well as on its own initiative.
Once the framework was completed, the newborn standard would be published and passed on to the committee best suited to expand on it and extend its relevance. In some cases, the standard could remain dormant, and/or industry participants could submit their own amendments and additions rather than wait for the committee to define new data elements and requirements.
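To make the "minimalist framework" concrete, here is a rough sketch of what a sub-ten-element starter standard and its conformance check might look like. Every element name and definition below is a hypothetical illustration, not anything proposed by BISG or the AAP:

```python
# A hypothetical minimalist framework standard: fewer than ten named
# data elements, each with a plain-language definition. All element
# names and definitions here are illustrative assumptions only.
FRAMEWORK_ELEMENTS = {
    "identifier": "A unique identifier for the content item (e.g. ISBN)",
    "title": "The title of the content item",
    "contributor": "A person or organization responsible for the content",
    "format": "The digital format in which the content is delivered",
    "rights": "A statement of the rights attached to the content",
    "date": "The date the content item was made available",
}

def conforms(record: dict) -> bool:
    """A record conforms to the framework if it supplies a non-empty
    value for every named element."""
    return all(record.get(name) for name in FRAMEWORK_ELEMENTS)

record = {
    "identifier": "978-0-000-00000-0",
    "title": "Example Title",
    "contributor": "Example Author",
    "format": "EPUB",
    "rights": "All rights reserved",
    "date": "2010-04-09",
}
print(conforms(record))  # True
```

The point of keeping the element set this small is exactly the one made above: a reconnaissance team could publish something like this quickly, enabling basic standardization immediately, while a full committee later extends it with indicators, controlled vocabularies and additional elements.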
Wednesday, April 07, 2010
From their press release:
Since March 8th over 15,000 people have visited to nominate more than five hundred blogs and microblogs written by authors. The resulting shortlist has been created from the web feeds which received the most nominations, with the last few weeks seeing tech-savvy authors mobilising their fans and followers across many social networks.
The result is a diverse mix of authors blogging from very different perspectives. It includes superstar authors such as Neil Gaiman, newcomers to the publishing world such as Gavin James Bower, and a number of yet-to-be-published bloggers such as Jane Alexander. Please see the full shortlist below.
The awards now enter the voting phase where the public are invited to take a look at the shortlisted blogs and vote for their favourite. The Author Blog Awards aim to recognise and highlight the writers who use their blogs to connect with readers in the most imaginative, engaging and inspiring ways. Winners will be selected from the shortlist and announced at the London Book Fair official Tweetup on 21st April.
The Author Blog Awards are organised by CompletelyNovel.com and Jon Slack, in partnership with publishers including The Random House Group, Simon & Schuster UK, Quartet Books, Penguin, Bloomsbury, Allison & Busby, Faber & Faber, Mills & Boon and Headline.
These publishers are offering hundreds of books as prizes for the people voting for a blog on the shortlist. More information about the Awards and prizes can be found on the Author Blog Awards website at http://www.authorblogawards.com.
I'm a full-time writer and freelance literature development worker. That means I write books, teach creative writing and blogging workshops, organise literature events and projects and edit manuscripts.
The tweetings of Neil Gaiman, author of ‘The Graveyard Book’ and many more.
The blog follows the progress of my books as I attempt to write, paint and bring up two children, balancing life and work in a strange pattern where I often find that life mirrors art mirrors life. Centred around my studio, the blog wanders off onto beaches and cliffs, seeking inspiration.
Thoughts on writing, tips & advice, and general rambling nonsense from children’s horror author, Barry Hutchison. Follow his journey from unpublished hopeful, through the publication of his first series, INVISIBLE FIENDS, and beyond...
A blog about a dad in a mum's world, Bringing up Charlie charts the day-to-day life of stay-at-home dad and author Tim Atkinson, as his wife returns to work - leaving him holding the baby and changing the nappies!
This is the blog of the books The Blue Cabin and Still On The Sound: snapshots of life on the otherwise uninhabited island of Islandmore, Strangford Lough, Northern Ireland.
Alice Griffin is a writer living on a boat in England. She also describes herself as a wife, mother, traveller, daydreamer and sometime crafty girl; hopefully her blog reflects all. Author of ~ Tales from a Travelling Mum ~ Alice’s second travel book will be published in November 2010.
I work in my Mum's chandelier shop where customers come in for therapy and the occasional light bulb. My blog has been published as a book 'Shop Girl Diaries' and is coming shortly as a film...
Your official invitation into the african american section of the bookstore! A sometimes serious, sometimes light-hearted plea for everybody to give a black writer a try.
The wiki and journal of Cleolinda Jones, author of 'Movies in Fifteen Minutes'.
Slightly eccentric hints and tips on writing, latest news on my books and where I'll be talking about them, as well as stuff that's going on in the wider children's book world.
This blog is about the fictional character, Jade del Cameron (www.suzannearruda.com), and the historical time period in which she lives.
In September 2005, two weeks before I was due to start a PhD in Linguistics, I watched an interview on Richard & Judy where they referred to someone as a ‘nearly woman’. I can’t remember who that person was, but it was the moment when everything in my life started to jigsaw into place...
Diary of a Desperate Exmoor Woman. Juggling work, life, motherhood and marriage - and frequently dropping the balls.
Nicola Morgan is proud to be the first Google result for "crabbit old bat" and offers crabbitly honest expertise to writers with talent and a burning need to be published.
My life as a gardener of words. Visit Planting Words to read about cats, cake, the things I learn, Buddhism, other people’s poems, the things I get wrong, and occasionally I even remember to write about being a writer.
Gavin James Bower
I’m a writer, a Northerner and, for now at least, a Londoner. My first novel, Dazed & Aroused, was published in 2009 and I’ve recently finished my second, Made in Britain.
Blog of an award-winning romance author.
The Portuguese and English tweets of the mighty Paulo Coelho, author of ‘The Alchemist’ and many more.
Writer. Latest books: We Need to Talk About Kelvin, Afterglow of Creation & Felicity Frobisher and the Three-Headed Aldebaran Dust Devil.
Richard Jay Parker
Dark thriller STOP ME by Richard Jay Parker just published by Allison and Busby.
Blog of a murder mystery writer.
Lynn Flewelling Muses on Writing, Living, and Shameless Self Promotion.
My journal is informally known as Sam's Cafe and is read by people of many religions, political beliefs, and ethnic backgrounds. Come in, sit down, and have a pastry. I made them myself.
This blog was set up in 2006 as a resource for parents of multiple-birth children. But it has moved on to include journalism, fiction, media requests and advice.
Nikesh Shukla/Yam Boy is an author, film-maker and poet caught between the cityscapes of Bombay and the low-swinging chariots of London.
Little is known about the origins of Michell as they are shrouded (or at least covered with a moth-eaten towel) by the mists of time. What is known is largely obscure and often contradictory. Oh and he sometimes speaks about himself in the third person.
Twitter account of the president of New Marketing Labs and social media extraordinaire.
CompletelyNovel.com, founded in 2008 by Oliver Brooks and Anna Lewis, is a social reading and publishing platform. CompletelyNovel links writers to online publishing tools and print-on-demand, to offer a slick and affordable self-publishing service. Readers can read thousands of books for free online, build up their own online library and support new writers by offering feedback and buying their books.
Tuesday, April 06, 2010
This time, both public and academic libraries in the Cologne, Germany area are offering cataloging data.
From the Announcement
Cologne-based libraries and the Library Centre of Rhineland-Palatinate (LBZ), in cooperation with the North Rhine-Westphalian Library Service Center (hbz), are the first German libraries to adopt the idea of Open Access for bibliographic data by publishing their catalog data for free public use. The University and Public Library of Cologne (USB), the Library of the Academy of Media Arts Cologne, the University Library of the University of Applied Science of Cologne and the LBZ are taking the lead by releasing their data. The Public Library of Cologne has announced that it will follow shortly. The release of bibliographic data forms a basis for linking that data with data from other domains in the Semantic Web.
Libraries have been involved with the Open Access movement for a long time. The objective of this movement is to provide everybody with free access to knowledge via the internet. Until now, only a few libraries have done so with their own data. Rolf Thiele, deputy director of the USB Cologne, states: “Libraries appreciate the Open Access movement because they themselves feel obliged to provide access to knowledge without barriers. Providing this kind of access for bibliographic data, thus applying the idea of Open Access to their own products, has been disregarded until now. Up to this point, it was not possible to download library catalogues as a whole. This will now be possible. We are taking a first step towards worldwide visibility of library holdings on the internet.” The library of the European Organization for Nuclear Research (CERN) already published its data under a public domain license in January.
The North Rhine-Westphalian Library Service Center has recently begun evaluating how to transform data from library catalogs so that it can become part of the emerging Semantic Web. The liberalization of bibliographic data provides the legal basis for performing this transformation in a cooperative, open, and transparent way. Currently there are discussions with other member libraries of the hbz library network about publishing their data. Moreover, “Open Data” and the “Semantic Web” are topics that are gaining attention in the international library world.
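As a rough sketch of what linking catalog data into the Semantic Web involves, a flat bibliographic record can be recast as subject-predicate-object triples. The record and the subject URI below are invented for illustration; the Dublin Core terms namespace is a real vocabulary commonly used for such data, but nothing here reflects the hbz's actual data model:

```python
# Minimal sketch: turning one flat bibliographic record into
# RDF-style (subject, predicate, object) triples.
DC = "http://purl.org/dc/terms/"  # Dublin Core terms namespace

def record_to_triples(subject_uri: str, record: dict) -> list:
    """Map each field of a flat catalog record to one triple whose
    predicate is the matching Dublin Core term."""
    return [(subject_uri, DC + field, value) for field, value in record.items()]

triples = record_to_triples(
    "http://example.org/catalog/12345",  # invented subject URI
    {"title": "Beispielbuch", "creator": "Mustermann, Max", "issued": "1999"},
)
for s, p, o in triples:
    print(s, p, o)
```

Once records are expressed this way, "linking with data from other domains" simply means reusing shared URIs, so that a book's author, for example, can point at the same identifier a biographical database uses.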
Additional information in English
Additional Info (in German) and Links to Access Data
Monday, April 05, 2010
Texas education is dominated by centralized planning that, in recent weeks, has looked Stalinist in its apparatchik-like ability to re-write history. In one example, and with little or no debate, one ignorant school board member was able to effectively rewrite Latin American social history simply because she hadn’t heard of a key participant. Other board members might, perhaps, have pointed out that that’s the point of teaching history but, alas, they did not. In Dallas recently, the school board there decided to “go rogue”--disregarding both the evidence and the testimony of experts and parents-- and select materials for their schools that were characterized by the Dallas Morning News as being ‘riddled with errors’.
Texas seems to revel in its gargantuan-market-sized ability to influence what publishers place in their textbooks. In the words of full-time dentist and part-time Texas Board of Education Chairman Dr. Don McLeroy, board members like him are there to correct the ‘liberal bias of experts’ in the creation of educational texts. In so doing, Texas educators conspire in an almost narcissistic endeavor to create a mélange of fuzzy math, pseudo-science and revisionist materials for their schools. Despite the headlines from Dallas in recent weeks and the resultant slow awakening of faculty, students and parents, the situation is unlikely to change appreciably. Especially when you consider that Dr. McLeroy is from Austin, arguably the most liberal locale in Texas.
Today (April 2) is the day the NGA is closing the comment period for their draft Core Standards document. This set of guidelines for math and English language arts represents an attempt by the states (not the federal government) to ensure consistency across the US for students preparing for higher education. From their press release:
These standards define the knowledge and skills students should have within their K-12 education careers so that they will graduate high school able to succeed in entry-level, credit-bearing academic college courses and in workforce training programs. The standards are:
• Aligned with college and work expectations;
• Clear, understandable and consistent;
• Include rigorous content and application of knowledge through high-order skills;
• Build upon strengths and lessons of current state standards;
• Informed by other top performing countries, so that all students are prepared to succeed in our global economy and society; and
• Evidence- and research-based.
No doubt that last one caused consternation in Texas but, if you read the guidelines as is, they are not revolutionary in scope. Where they do differ from prior practice is that the states have decided to determine their own destinies and not be forced to accept federal dictates on educational reform. In the No Child Left Behind programs (which set assessment and evaluation criteria and then rewarded achievement with money), the states played a limited role in setting the standards. No Child Left Behind is now widely viewed as a very expensive failure, and the Obama administration has determined that education policy must change to improve students’ ability to reach college (with a uniform understanding of certain key topics) and to enable America to compete with other countries.
The proactive steps taken by the NGA should be actively supported by all who see education policy as a shared responsibility between the states and the federal government. With luck, individual states like Texas and Alaska will then no longer be able to short-change their students' futures by imposing a flat world view on education.
Note: How the Texas Board Works and What it Does (Video)
Sunday, April 04, 2010
The Economist wonders why Nordic detectives are so successful (Economist):
Larsson and Mr Mankell are the best-known Nordic crime writers outside the region. But several others are also beginning to gain recognition abroad, including K.O. Dahl and Karin Fossum from Norway and Ake Edwardson and Hakan Nesser of Sweden. Iceland, a Nordic country that is not strictly part of Scandinavia, boasts an award winner too. Arnaldur Indridason’s “Silence of the Grave” won the British Crime Writers’ Association Gold Dagger award in 2005. “The Devil’s Star” by a Norwegian, Jo Nesbo, is published in America this month at the same time as a more recent novel, “The Snowman”, is coming out in Britain. A previous work, “Nemesis”, was nominated for the prestigious Edgar Allan Poe crime-writing award, a prize generally dominated by American authors.

In the UK, one of the recommendations to improve library service could allow patrons to order any book (Independent):
Three factors underpin the success of Nordic crime fiction: language, heroes and setting. Niclas Salomonsson, a literary agent who represents almost all the up and coming Scandinavian crime writers, reckons it is the style of the books, “realistic, simple and precise…and stripped of unnecessary words”, that has a lot to do with it. The plain, direct writing, devoid of metaphor, suits the genre well.
The Nordic detective is often careworn and rumpled. Mr Mankell’s Wallander is gloomy, troubled and ambivalent about his father. Mr Indridason’s Inspector Erlendur lives alone after a failed marriage, haunted by the death of his younger brother many years before in a blizzard that he survived. Mr Nesbo’s leading man, Inspector Harry Hole—often horribly drunk—is defiant of his superiors yet loyal to his favoured colleagues.
Library-goers should have the right to order any book – including out-of-print editions – and free access to e-books under a new plan for the future of the library service. Free internet use and membership of all libraries in England are also recommended under proposals outlined by Culture Minister Margaret Hodge. The public library modernisation review policy statement sets out a series of "core" features which would ensure the service meets the challenges of the 21st century. It says that the right to borrow books free of charge must remain at the heart of the library service. And the paper sets out ways in which libraries tackle a decline in use of current services while grasping the opportunities of the digital world. The statement says all libraries should be "digitally inclusive" with easier, free access to the internet. And the document proposes local authorities set out their own "local offer" including commitments on their stock of books, events and extra services such as CD and DVD loans. The Government wants library authorities to have these in place by the end of this year.

From the twitter (@personanondata)
The NYTimes' ethicist says it's OK to illegally download a book if you've legitimately purchased a copy already (NYTimes): E-Book Dodge: When it's OK to illegally download.
An Op-Ed in The NYTimes argues that mash-ups require a re-evaluation of permissions and copyright in The End of History (Books) - (NYTimes)
OCLC publishes a report on the future of MARC, and it's not very bright (OCLC)
Jordan Edmiston reports that media M&A is back on the rise (MinOnline)
Friday, April 02, 2010
Libraries rely on MARC data for library inventory control, but users do their discovery elsewhere.
• MARC is a niche data communication format approaching the end of its life cycle. Delivery of the inventory from the library will likely be mitigated by the availability of digitized works, especially for those in the public domain.
The RLG Partnership MARC Tag Usage Working Group’s view on MARC’s future:
• Future systems, if they are to be able to meet users’ needs in the ways documented in the Functional Requirements for Bibliographic Records and to take advantage of linked data as envisioned by the new Resource Description and Access standard, will need a more relational approach to data storage. MARC is not the solution.
• Future encoding schemas will need to have a robust MARC crosswalk to ingest the millions of legacy records we now have.
• Ask ourselves: How would we create, capture, structure, store, search, retrieve, and display objects and metadata if we didn’t have to use MARC and if we weren’t limited by MARC-centric library systems?
• Consider how best to take advantage of linked data and avoid creating the same redundant metadata in individual records. Consider sources outside the traditional library environment.
• Rather than enhancing MARC and MARC-based systems, let’s give priority to interoperability with other encoding schemas and systems. We need to meet the demands that have arisen from the rest of the information universe.
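The "robust MARC crosswalk" the working group calls for is, at bottom, a mapping from MARC tags to another schema's elements. A toy sketch follows, using a few well-known MARC fields (245 Title Statement, 100 Main Entry - Personal Name, 020 ISBN, 260 Publication) mapped to Dublin Core-style names. Real MARC records carry indicators and subfields, which are flattened to single strings here for illustration:

```python
# Toy MARC-to-Dublin-Core crosswalk. The tag meanings are standard
# MARC 21; the flat {tag: string} record shape is a simplifying
# assumption, not a real MARC parser.
CROSSWALK = {
    "245": "title",       # Title Statement
    "100": "creator",     # Main Entry - Personal Name
    "020": "identifier",  # International Standard Book Number
    "260": "publisher",   # Publication, Distribution, etc.
}

def marc_to_dc(marc_fields: dict) -> dict:
    """Keep only the fields the crosswalk knows and rename them;
    unmapped (e.g. local) tags are dropped."""
    return {
        CROSSWALK[tag]: value
        for tag, value in marc_fields.items()
        if tag in CROSSWALK
    }

dc = marc_to_dc({"245": "Moby-Dick", "100": "Melville, Herman", "999": "local"})
print(dc)  # {'title': 'Moby-Dick', 'creator': 'Melville, Herman'}
```

The hard part of a production crosswalk is everything this sketch omits: subfield semantics, repeatable fields, and the loss of detail when a rich MARC record is squeezed into a smaller target schema, which is exactly why the working group wants crosswalks designed in from the start rather than bolted on.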
Thursday, April 01, 2010
Bibliographic data is in the midst of a major transition. With the emergence of the Semantic Web, the very purpose of this data is evolving from an entity meant for interpretation by humans to one meant for interpretation by machines.
Karen Coyle, digital library consultant and bibliographic data expert, will discuss the future of Metadata and its role in bibliographic data and the semantic web. With major transformations in the use and structure of data already occurring, Karen will discuss what these changes mean for libraries, and what librarians can do to prepare, adapt, and take advantage of new possibilities that emerge.
Topics will include:
- Defining metadata
- Bibliographic data and the semantic web
- Future directions of library data
Attendees will be given the opportunity to participate through Q and A and discussion.
Please join us on Thursday, April 1st at 3pm Central (4pm Eastern, 1pm Pacific) for this exciting event!
Register Here: Meeting