Friday, August 13, 2010

Repost: Book Insurance

Originally posted July 14, 2009.

Few in the book world can see an end to DRM on book content, even as glimmers of a new dawn in the music world suggest there may be a different future on the horizon from the one that book publishers are trying so desperately to avoid. Rampant file sharing and ineffectual (even legal) efforts to halt copyright infringement represent the atomic winter that consumer book publishers fear, and they believe the only way to preclude that future is to impose severe restrictions on a consumer's ability to use the electronic content they have purchased.

It's not news to anyone paying attention that the 'rights' a buyer has when purchasing a physical book are proactively eliminated in the eBook world. For example, in the eBook world it becomes difficult to lend my book to someone else to read (friend or family member) or to sell the book on the second-hand market. I don't really own it in the traditional sense. Things can become even trickier if I buy from multiple eBook providers, change 'platforms' or even (strangely) lose my credit card, since some vendors attribute your purchases to a specific credit card.

In an environment where DRM limits interoperability, moving from one platform to another while keeping your library of books becomes difficult. If you liked the Sony but something better comes along, you may have to keep the Sony ready to go for years, even though you and the technology have moved on. It would be a far better experience for users if the book were the constant, not the technology. As long as booksellers and publishers maintain this cabal over DRM, there are no easy solutions for buyers who find themselves tethered to the technology rather than the content.

Perhaps an unlikely solution would be to provide a type of digital insurance. Some (new) third party would offer this service to content buyers as a type of insurance or escrow policy. On purchasing content, I would register the purchase as part of my profile. Obviously, I would have to provide some proof that I had made the purchase, but the transaction would sit in this profile as long as I paid my monthly premiums. The amount paid by the consumer wouldn't have to be much, because only a small portion of the 'members' would ever make use of the insurance. (It becomes an actuarial exercise.)
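The actuarial exercise is easy to sketch. Here is a back-of-the-envelope illustration with entirely made-up figures for membership, claim frequency and claim size; the function and its expense load are my own, purely for illustration:

```python
# Back-of-the-envelope premium pricing for the proposed "book insurance".
# All figures are hypothetical, chosen only to illustrate the arithmetic.

def monthly_premium(members, annual_claim_rate, avg_claim_cost, load=1.2):
    """Break-even monthly premium per member, with a 20% expense/profit load."""
    expected_annual_payout = members * annual_claim_rate * avg_claim_cost
    return (expected_annual_payout * load) / (members * 12)

# 100,000 members; 2% make a claim each year; average claim of $150
premium = monthly_premium(100_000, 0.02, 150.00)
print(f"${premium:.2f} per member per month")  # → $0.30 per member per month
```

Spread over a large membership, the premium really could be trivial, which is the point: the product works precisely because most members never claim.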

Circumstances in which a user would draw on the insurance could be anything from passing your library on to a family member to simply moving to a new platform. Depending on how the insurance company was set up (possibly as a pseudo-retailer), it wouldn't host this content, but it would allow the consumer to 're-purchase' the content and then submit a 'claim' for the purchase. For each 're-purchase' the consumer would get a refund, just as with a traditional insurance company. (And maybe the following year your premiums go up as well.) There may be other benefits to this solution, including the return of the right of first sale: as registered owners, maybe we build a secondary market for eBooks.

Yes, even I think this is a pretty wacky idea. But with eBook content still less than 10% of total revenues, with publishers exhibiting limited apparent interest in pushing growth faster, and with formats and technology likely to remain fluid for a long time, consumers will become increasingly dissatisfied and disgruntled over the limitations (and mistrust) that publishers and retailers are imposing on their purchases. There could be a better solution.

Insurance anyone?

Thursday, August 12, 2010

Ironwork Escape: New York 1993

Ironwork Escape, New York 1993
A weekly image from my archive. Click on the image to make it larger.

The ironwork hangs on many of these Lower East Side buildings like decorative ornament. Oddly, there aren't any air conditioners in any of the windows.

Wednesday, August 11, 2010

PND Technology: twilio

This is week two of my recap of some of the interesting technology I've heard about at the tech meet-ups I've been attending (NY Tech Meetup).

Over the years, I've had the dubious distinction of being responsible for several office moves and, aside from the bickering over who gets the bigger office and what type of furniture we buy, some of the more problematic issues involved dealing with the old telephone PBX. Twilio can't help with the baser issues, but they have eliminated the hardware problems inherent in the old phone systems and pushed a powerful, easy-to-use set of applications to the cloud that can manage the most sophisticated phone applications.

Here's how they explain how their system works:
We're always building web applications, and sometimes we want those apps to be able to interact with phone callers. Maybe we want a customer to be able to call in and get information, or maybe we need to coordinate our employees more efficiently. Before Twilio, you would have had to learn some foreign telecom programming languages, or set up an entire stack of PBX software to do this. At which point, you'd say "aw, forget it!" Twilio lets you use your existing web development skills, existing code, existing servers, existing databases and existing karma to solve these problems quickly and reliably. We provide the infrastructure, you provide the business logic via HTTP, and together we rule the world.
The demos at the NY Tech meet-up are only five minutes long; however, in a demonstration of how easy their tool is to use, they wrote a simple script that created a dial-in conference call, selected (purchased) a specific phone number, and then created an invitation asking the whole audience to dial in. Programmed in simple XML, this took two minutes of fast typing. The system naturally collects all the dial-in numbers, and as a follow-up demo they used the application to call back everyone in the room who had dialed in to the conference number.
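The "simple XML" in question is TwiML, the markup a web application returns to Twilio over HTTP when a call comes in. A minimal sketch of what the conference demo might have returned: the `<Response>`, `<Say>`, `<Dial>` and `<Conference>` elements are real TwiML, but the room name and the helper function here are my own illustration, not Twilio's code.

```python
# Sketch of the TwiML a web app could return to drop callers into a
# conference room. <Response>, <Say>, <Dial> and <Conference> are real
# TwiML elements; the room name and function are made up for illustration.
from xml.sax.saxutils import escape

def conference_twiml(room_name, greeting="Joining the conference now."):
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        "<Response>"
        f"<Say>{escape(greeting)}</Say>"
        f"<Dial><Conference>{escape(room_name)}</Conference></Dial>"
        "</Response>"
    )

print(conference_twiml("nytm-demo"))
```

When someone dials the purchased number, Twilio fetches a URL you configure, executes whatever TwiML comes back, and every caller returning the same room name ends up on the same bridge, which is why ordinary web-development skills are all that's required.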

There are all kinds of business applications that can be created almost on the fly, and certainly directed specifically at a business issue or situation. Some examples include polling, status updates such as weather problems or power outages, reminders such as appointments, as well as the typical voicemail transcription and SMS functionality.

There are many more practical examples noted on their blog including:
MedTaker takes advantage of the ubiquity of the phone to help people remember to take their medication while at the same time periodically checking in on their wellbeing.
Life is full of so many little details that need attending to all the time. Would you rather be coding up your next prize winner Twilio app, or assembling IKEA furniture? What about getting a ride to the airport, or grocery shopping. Fortunately, with TaskRabbit you can delegate these tasks to "runners" who you pay by the hour to help you get things done.

Diner Connection is a complete online solution for restaurants. You can contact your customers via text messaging, collect patron visit information and connect with your patrons more effectively using Diner Connection.

DropConf is an on-demand conference calling application -- you pay per conference call. The idea is that small business or freelancers for example might only need one or two conference calls a month. Some months they might not need any conference calls. All the other paid options out there have monthly fees -- so people are paying for a service they don't need.
And many, many more.

Tuesday, August 10, 2010

Curating Research Data at Elsevier

Elsevier announced a partnership with PANGAEA, a 'data library' that links primary research data with journal articles in earth and environmental science. As I mentioned last week, information and academic publishers like Elsevier have long organized themselves around content areas but are now 'widening' their content 'silos' to accommodate tools, techniques and proprietary data provided by third parties. This is a good example of how the Elsevier 'platform' can be, and is being, leveraged beyond what may have originally been envisioned as a closed system.

From their press release, they note that this initiative extends one announced in February:

This next step follows the introduction, last February, of 'reciprocal linking' - automatically linking research data sets deposited at PANGAEA to corresponding articles in Elsevier journals on its electronic platform ScienceDirect and vice versa. The new feature adds a map to every ScienceDirect article that has associated research data at PANGAEA; it displays all geographical locations for which such data is available. A single click then brings the user from the ScienceDirect article to the research data set at PANGAEA.

"With an increasing interest in the preservation of research data, it is very important to make those data clearly visible in the context of the formal research publications," commented Jan Aalbersberg, Vice President Content Innovation at Elsevier. "Elsevier is committed to advance science by investing in such collaborations with data set repositories. This new feature will allow readers to easily go beyond the content of an article, and drill down to the research data sets."

As the press release goes on to say, we are starting to see how the web, the use of APIs and other methods are eliminating the inefficiencies in sharing research data and analysis that academics have had to navigate around for many years. Ironically, while these large information companies may 'open' their platforms to produce much more utility for their subscribers, they may also be strengthening their positions as the clear leaders in providing information, analysis tools and other key functionality for their users. Their strategy continues to reflect the curation model I've discussed before, although the evidence of this now extends far beyond the concentration of topic-based content.

Prior post on Massive Data Sets
Posts on Content Curation

Monday, August 09, 2010

Investment in the library leads to grant funding

Elsevier conducted a research study to determine the value of an academic library to its institution and concluded that there is a strong correlation between library investment and the ability to generate grant income (press release):

Of the 8 institutions participating from around the globe, 6 demonstrated a greater than one-to-one (1:1) return in grant funding, with results ranging from 15.54:1 to 0.64:1. Equally significant is the result that 2 institutions showed a significant positive correlation between an increase in library investment over time and an increase in grant funding to the university.

Dr. Carol Tenopir, Director of the Center for Information and Communication Studies at the University of Tennessee (http://www.cci.utk.edu/cics/), led a team of investigators over a 16-month period. "Libraries bring value and returns on institutional investments in many ways," explains Dr. Tenopir. "Although the exact monetary amount of the returns in grants varies with the mission of the institution, our research shows that the collections and services of all university libraries help faculty write better grant proposals and articles and help them do better research."

The results of the study, funded by Elsevier, are available through a newly published Elsevier Connect white paper, University Investment in the Library, Phase II: An International Study of the Library's Value to the Grants Process (http://libraryconnect.elsevier.com/whitepapers/roi2/lcwp021001.html) (http://libraryconnect.elsevier.com).

"The results reinforce the contribution of libraries and information to the research enterprise," notes Chrysanne Lowe, Elsevier's Vice President of Customer Development and Engagement. "Universities have always known this, but it's useful to see value articulated in terms of grant income ROI as well."

Saturday, August 07, 2010

MediaWeek (Vol 3, No 32): End of Print, Education, Don DeLillo, Quercus, Changing Libraries

Contemplating the end of print books in Newsweek:
Paperbacks and public libraries made books cheap or free but certainly available to millions who might otherwise not have been able to afford them, and all that happened long before I was born. Nevertheless, I was brought up by people who had been taught—and who taught me—that books were valuable things, things to be cared for and cherished, and I have owned some volumes for close to half a century (almost none of them, I should point out, qualify as “collectible” or valuable to an antiquarian book collector; owning a rare book makes me nervous. I like books I can hold, read, and even—here my mother is spinning in her grave—write in). I come from a generation for whom the books and records on the shelf signaled, in some way, who you were (starting with the fact that you were a person who owned books or records or CDs). If you visited a friend, you took the first chance you had to surreptitiously scan that friend’s shelves to get a handle on the person. I suppose I could sneak a peek at a friend’s Kindle, but is that the same? And try that kind of snooping on a bus or in a coffee shop and you’ll probably get arrested. For a sense of the diminution of this sort of information gathering, click through this Tumblr of covers (scroll until you get to the e-reader included in the mix, to fully plumb the difference).

Looking at course (learning) management systems in higher ed (Gartner):
Campus Technology recently spoke with Gartner Research Director Marti Harris, who focuses on the higher education market, about an annual report from Gartner, "Gartner Higher Education E-Learning Survey 2007: Clear Movements in the Market," by Harris and two other Gartner higher education research analysts. Campus Technology: In the survey, Gartner found "clear movement in the market" toward more open-source platforms in 2007--26 percent of platforms on surveyed campuses were on open source e-learning system such as Moodle or Sakai, and Gartner projects that number will grow to 35 percent by the end of 2008. .....
CT: What is it about open source in general that appeals in higher education? Harris: There are several things. For one, there is sometimes the perception that open source is cheaper. But we really don't know that's the case yet, other than the fact that [institutions] are not paying a license fee. Certainly, unless it's something that's turnkey or ready out-of-the-box, [any system] will require additional resources to keep development going. You do have to determine how you're going to handle service and support in any case. Some of the open source products, like Moodle, have third-party providers that you can contract for service, support, and even for further development. We've yet to really know how much cheaper these open source apps are. We haven't been doing this long enough to really know the total price tag on migration, for one thing, and then the ongoing total cost of ownership.
From Australia but of relevance to all markets - Libraries and ebooks: tough issues that it’s time to debate (ABS&P):

So far, libraries’ digital activity has mostly been confined to research uses. The prevalence of the cumbersome PC as the main reading platform means the bread and butter of the book trade, fiction and general non-fiction, has barely been touched. But mobile reading devices and a surge in availability of popular ebooks are pushing libraries into the digital mainstream. The few libraries experimenting today with ebook downloads typically have very thin collections. This is partly due to tight budgets but also stems from concerns by publishers and authors about how—indeed whether—libraries should lend digital editions of their books. It’s the latter that has prompted the UK government to legislate so that patrons in libraries can download digital editions to their ebook readers without libraries infringing copyright. At the same time, it will issue an order under legislation “preventing libraries from charging for ebooks lending of any sort, including remotely.”

From OCLC a series of videos from ALA on The Future of Publishing: Libraries and the changing role of consumers and creators (OCLC)
From scholarly journals to eBooks to print-on-demand “vending” machines, publishing is more complicated than it once was. Thousands of individuals, companies, schools and businesses have taken the tools of literary and scholarly production into their own hands. How does the role of the library change when our users go from consumers of content to creators? What do these changes mean for academic activities such as peer-review, collection development and inventory management? How will new publishing platforms—from Amazon to the iPad—alter the public’s expectations for reading, writing and sharing?
Don DeLillo, in a rare interview, talks about living the American dream (Observer)

DeLillo has devoted his writing to the shadow side of American life, painting a dysfunctional freaks' gallery of the wrecked (David Bell in Americana), the sick (Bill Gray in Mao II), the mad (Lee Harvey Oswald in Libra) and the suicidal (Eric Packer in Cosmopolis). In White Noise, the protagonist, Jack, who teaches Hitler studies, riffs hilariously on death and mass murder. It is said that DeLillo used to keep two files on his writing table, labelled "Art" and "Terror". In Mao II, he writes: "I used to think it was possible for an artist to alter the inner life of the culture. Now bomb-makers and gunmen have taken that territory." On some readings, his characters occupy this no-man's-land. His vision has been described as "paranoid" in the sense that it connects everything about his society.

In the process of exploring America, DeLillo has become credited with extraordinary powers of literary clairvoyance. The war on terror is said to be foreshadowed in Mao II. The planes that flew into the Twin Towers are possibly alluded to on the cover of Underworld. Parts of White Noise are echoed in the anthrax scare of 2001, and so on.

Fellow writers talk with admiration of DeLillo's creative radar. The truth is that DeLillo is wired into contemporary America from the ground up, spookily attuned to the weird vibrations of popular culture and the buzz of everyday, ordinary conversations on bus and subway. According to Joyce Carol Oates, he is "a man of frightening perception", an all-American writer who sees and hears his country like no other.

The publishing house that Stieg Larsson built (Independent)
Quercus started life modestly in 2004 after Mark Smith and Wayne Davies defected from Orion Publishing Group. Suitably, for a company that would later publish a phenomenon in crime fiction, they rented a small office round the corner from the fictional premises of Sherlock Holmes on Baker Street. "I wanted to start my own business and foolishly thought it would be easy," Smith recalls. The company focused on non-fiction books that could be nicely illustrated. Its first success was Universe, followed by Speeches that Changed the World. But Smith had an appetite for risk and two years after launch moved into fiction, signing 10 titles from first-time authors. One of its early successes was The Tenderness of Wolves by Stef Penney, a mystery set in the snowy wastes of Canada in 1867. The novel won the Costa Book Award in 2007, driving it up the bestseller charts and allowing its publisher to expand. What had been a staff of 15 people has since grown to 40. The turning point for Smith came when he recruited Christopher MacLehose, who had a reputation as a master at finding foreign fiction by writers such as Henning Mankell and Haruki Murakami and turning them into English language hits.

On the twitter this week (@personanondata):
Some Colleges to Test Dual-Screen E-Reader Devices (Wired Campus, Chronicle)
New IEEE Standards Initiative Aims at “Digital Personal Property” (Copyright and Technology)
And comments:
'Hollywood: A Third Memoir' by Larry McMurtry - first class, private planes, cash; what's not to like? (LA Times)
Why The Next Big Pop-Culture Wave After Cupcakes Might Be Libraries (NPR)
Random House CEO on the E-Book Age: 'The Printed Book Will Still Dominate for a Long Time to Come' (Spiegel Online)
Frankfurt SPARKS, "conferences and events on the future of media and the creative industries" (Frankfurt Book Fair)
In arts this week: Photography. New York City’s Waterfronts, Covered (NYTimes). This is my contribution.

Friday, August 06, 2010

Inside Google Books: Books of the world, stand up and be counted! All 129,864,880 of you.

Google takes a stab at counting all the books in the world (Google):
Our definition is very close to what ISBNs (International Standard Book Numbers) are supposed to represent, so why can’t we just count those? First, ISBNs (and their SBN precursors) have been around only since the mid 1960s, and were not widely adopted until the early-to-mid seventies. They also remain a mostly western phenomenon. So most books printed earlier, and those not intended for commercial distribution or printed in other regions of the world, have never been assigned an ISBN.

The other reason we can’t rely on ISBNs alone is that ever since they became an accepted standard, they have been used in non-standard ways. They have sometimes been assigned to multiple books: we’ve seen anywhere from two to 1,500 books assigned the same ISBN. They are also often assigned to things other than books. Even though they are intended to represent “books and book-like products,” unique ISBNs have been assigned to anything from CDs to bookmarks to t-shirts.

What about other well-known identifiers, for example those assigned by Library of Congress (Library of Congress Control Numbers) or OCLC (WorldCat accession numbers)? Rather than identifying books, these identify records that describe bibliographic entities. For example the bibliographic record for Lecture Notes in Mathematics (a monographic series with thousands of volumes) is assigned a single OCLC number. This makes sense when organizing library catalogs, but does not help us to count individual volumes. This practice also causes duplication: a particular book can be assigned one number when cataloged as part of a series or a set and another when cataloged alone. The duplication is further exacerbated by the difficulty of aggregating multiple library catalogs that use different cataloging rules. For example, a single Italian edition of “Angels and Demons” has been assigned no fewer than 5 OCLC numbers.

So what does Google do? We collect metadata from many providers (more than 150 and counting) that include libraries, WorldCat, national union catalogs and commercial providers. At the moment we have close to a billion unique raw records. We then further analyze these records to reduce the level of duplication within each provider, bringing us down to close to 600 million records.
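The record-level deduplication Google describes is easy to picture in miniature. This toy sketch, which is an illustration of the general approach and emphatically not Google's actual algorithm, normalizes a few bibliographic fields into a key and collapses records from different providers that share it:

```python
# Toy bibliographic deduplication: collapse records from different
# providers that agree on a normalized (title, author, year) key.
# An illustration of the general approach only, not Google's method.

def norm(s):
    """Lowercase and collapse whitespace so trivial variants match."""
    return " ".join(str(s).lower().split())

def dedupe(records):
    seen = {}
    for rec in records:
        key = (norm(rec["title"]), norm(rec["author"]), rec.get("year"))
        seen.setdefault(key, rec)  # keep the first record per key
    return list(seen.values())

records = [
    {"title": "Angels and Demons", "author": "Dan Brown", "year": 2000},
    {"title": "Angels and  Demons", "author": "dan brown", "year": 2000},
    {"title": "Underworld", "author": "Don DeLillo", "year": 1997},
]
print(len(dedupe(records)))  # → 2
```

The hard part, as the post makes clear, is that real provider records disagree on far more than capitalization and spacing, which is why going from a billion raw records to 600 million deduplicated ones is an analysis problem rather than a lookup.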

Repost: Digital Platforms & Distribution

Originally posted April 12, 2007

Over the last 100 years (probably), US publishers have dithered over whether to use their facilities exclusively for the warehousing, fulfillment and distribution of their own books or to offer 'publisher services' to other publishers. In recent years we have seen as many large publishers give up publisher services as adopt them. Some publishers think the headaches outweigh the potential marginal income, while others believe these publisher-service functions to be core strengths and tasks they can leverage.

Recent announcements by Random House and HarperCollins indicate that there will be an application of the physical 'publisher services' model in the digital world. Clearly, the opportunity and the economics here will be significantly different than in the physical world. Other players are entering the market as well: both Ingram and Gardners (UK) have entered, or are entering, this segment. Gardners today announced a 'digital warehouse', which is "designed to provide a comprehensive range of e-commerce services for booksellers and publishers," and will expand on this business opportunity at the London Book Fair. Further,
Gardners Digital Warehouse will supply the capability for Publishers to link their existing digital files, eBooks, Audio Downloads, and extended bibliographic content such as ‘search inside’ to Gardners Books range of Internet and high street retailers. Publishers can also utilise a range of digitisation services designed to enable any size of Publisher to create digital content economically and to use it for publicity and eBook sales with all of Gardners customers.
The vast majority of publishers in the UK and US are small and do not have the depth of experience or the financial capacity to support their own back-office functions, which is why 'publisher service' programs from larger publishers and from companies like NBS and PGW exist. Similar issues will exist in the digital world, and perhaps the financial aspects and the knowledge gap will be even more stark as processes and applications become more technology-driven. Regrettably, even as digital distribution becomes a basic service, it will simply be out of the reach of the less sophisticated publisher. And this is where HarperCollins, Random House and others will step in to offer a range of digital services to support this market.

The issues these publishers will face will be different from those they faced as physical distributors, but intuitively I have to believe the margins will be greater and the services they can offer publishers and authors will be materially better. It is early days yet and the current offerings are fairly basic (not to be critical), but there are some tantalizing possibilities.

Other than the big fiction authors, who get loads of attention and marketing money and have brand equity, the vast majority of titles go unsupported almost immediately after launch. Successful titles in this environment are often driven by the desire and resourcefulness of the author. Imagine a digital world where the author can use the digital platform to create their own marketing program, interact with stores and buyers, build communities with consumers and in many ways manage the sales and marketing of their own titles. It will happen. Adding social networking and other interactive 'modules' to the platforms offered by HarperCollins, Random House, Ingram and others will achieve this, and I suspect some derivation of these ideas is in the works. The advantage for smaller publishers and authors is the scale these platforms will offer, both in terms of financial considerations and in that they will become destination sites for consumers of books.

Thursday, August 05, 2010

OCLC Responds to SkyRiver Suit

Sent to OCLC members this afternoon:
On July 29, SkyRiver Technology Solutions and Innovative Interfaces, Inc. filed suit against OCLC, alleging anticompetitive practices. We at OCLC believe the lawsuit is without merit, and we will vigorously defend the policies and practices of the cooperative.

OCLC’s General Counsel, working with trial counsel, will respond to this regrettable action by SkyRiver and Innovative Interfaces following procedures and timetables dictated by the court. This process will likely take months or even years, not days.

In the meantime, we want to assure the OCLC membership and all 72,000 libraries that use one or more OCLC services that these spurious allegations will not divert us from our current plans and activities. These include maintaining and enhancing existing services, pursuing an ambitious agenda in library research and advocacy, and introducing new Web-scale (cloud) services. Indeed, OCLC has been a global leader in providing cloud-based services for libraries since 1971, and the next generation of these services holds great promise for reducing member library costs.

It is worth noting that our current strategy represents a collective effort by librarians around the world, developed through ongoing dialogue and consultation with the Board of Trustees, Global Council, and Regional Councils in the Americas, Asia Pacific, and Europe, the Middle East and Africa. We will continue our active engagement with OCLC members and governance participants as, together, we move our cooperative forward.

Inclusion, reciprocity, trust and the highest standard of ethical conduct have guided the OCLC cooperative in the past and will guide us in the future. As always, OCLC’s public purposes of furthering access to the world’s information and reducing the rate of rise of library costs remain paramount.

—Larry Alford, Chair, OCLC Board of Trustees

—Jay Jordan, OCLC President and CEO

Beirut: Overhead 1972

Beirut Overhead, 1972
A weekly image from my archive. Click on the image to make it larger.

This photo was taken in 1972 from yet another Pan Am plane window and clearly shows the famous corniche that fronts the city. Beirut was referred to as the Paris of the Middle East in the 1950s and 60s because it was so cosmopolitan. On this journey we only stopped for fuel, but in 1968 the family spent two days here on the way to our first overseas home in Bangkok. If my navigation is correct, one of those hotels in the center of the image was the famous Phoenicia Intercontinental, which is where we stayed in 1968. I am fairly certain it and some of the buildings in the image were destroyed during the civil war. I would like to visit Beirut again some day. A few more on Flickr.

Wednesday, August 04, 2010

PND Technology: Parse.ly

Frequent readers will recall that I occasionally report on my attendance at the New York Tech meet-up, a once-a-month showcase of new and interesting technology and applications in (mostly) early development. I also recently attended a similar group meeting in Hoboken, and I am considering reporting on what I find interesting at these meetings on a more regular basis. So here goes.

I am interested in curated content, and Parse.ly is a product that helps content owners curate content for their users. At a recent meeting I attended, one of the founders of the company took the audience through the product, showing how users on traditional media sites are treated like strangers even though they may be frequent visitors to the website. The Parse.ly product "connects users with content they’ll love through personalized recommendations. Our technology gives publishers the power to quickly and easily recommend relevant content to users based on what they’ve read in the past and what other, similar users are reading now." It is a cool and elegant application.

By understanding what the user has looked at and interacted with over time, and what other users with similar habits have also viewed, the Parse.ly tool is able to serve up a more concentrated and particular set of content that the user will find interesting. A good example of how this process works, and how it could be implemented, is a current test the company is running with a major newspaper. Parse.ly has suggested that pre-packaged, topic-based email subscriptions are too generic and that its tool can craft topic collections based expressly on the needs and interests of a particular individual. So do away with the generic email subscriptions and implement a Parse.ly solution that is more relevant to the user.
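This style of personalization is typically some form of collaborative filtering. A minimal sketch of the idea, with invented reading histories and a naive overlap score (Parse.ly's actual algorithms are certainly more sophisticated and are not described here): recommend the articles read by the users whose histories most overlap with yours.

```python
# Minimal user-based collaborative filtering over reading histories.
# Data and scoring are illustrative; this is not Parse.ly's algorithm.

def recommend(target, histories, top_n=2):
    """Score articles the target hasn't read by how many shared
    articles the target has with each reader who did read them."""
    mine = histories[target]
    scores = {}
    for user, read in histories.items():
        if user == target:
            continue
        overlap = len(mine & read)  # similarity = number of shared articles
        for article in read - mine:
            scores[article] = scores.get(article, 0) + overlap
    ranked = sorted(scores, key=lambda a: (-scores[a], a))
    return ranked[:top_n]

histories = {
    "alice": {"budget-vote", "city-parks", "transit-plan"},
    "bob":   {"budget-vote", "transit-plan", "school-board"},
    "carol": {"city-parks", "food-review"},
}
print(recommend("alice", histories))  # → ['school-board', 'food-review']
```

Bob shares two articles with Alice, so his unread pick outranks Carol's; swap the toy overlap score for clickstream-weighted similarity and run it continuously, and you have the shape of a personalized topic digest.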

Parse.ly is available in various forms, with the most powerful being full integration with a client's content. The company is working with some major media clients on enterprise-level contracts but is also available to general users, so check it out.

Tuesday, August 03, 2010

Confusing a Silo with a Business

The strategy of organizing content around a common topic, such as legal or medical information, is mature in information publishing. As other publishers mimic the strategy of organizing their content into silos, they would be wise not to confuse their efforts with community building or market making. Users are interested in accessing validated, useful and important topical information, but this could just as easily be web-based content as published content. Often it is just that.

Whereas information companies formally organized their businesses around topics (medical, tax, legal, etc.) more than 15 years ago, they quickly understood that their customers needed more. Initially, it was often the integration across what had been independent databases that produced the most utility for their users, and their early work led to the development of taxonomies, search techniques and applications that enabled workflow integration. But nothing stands still, and as the information business continues to evolve, what is happening currently in information should be of interest to all publishers. In short, their experience suggests it may be simplistic to believe that establishing a silo of content will produce a community of willing publishing consumers.

Having built platforms supporting information products, information companies now recognize that their customers are looking for integration across subject areas. Importantly, customers are looking for ways to validate a much wider pool (ocean) of potentially useful and important information. To Thomson Reuters (and others) the silo increasingly looks like a pyramid, and they have begun to conceptualize the management of information and data using this framework. In part, this has to do with the excessive growth of information: increasingly, information providers are as useful to their customers as filters of a vast catalog of information as they are as providers of tools, techniques and proprietary data. Consequently, information providers are beginning to see themselves providing access to as much content and information as possible - available on their platforms - and then progressively adding value for the consumer as they move up the pyramid in terms of need and application.

At the top of the pyramid are those publisher-specific technologies and content that provide the most value to customers. Companies like Thomson Reuters recognize that customers have broad needs, and thus there is business logic in providing different services at each level of this pyramid as well as integration points with companies outside the Thomson Reuters family. Inherent in this approach is the recognition by Thomson Reuters and others that it may no longer be possible to operate in a closed environment. The information space is simply too large to organize in the manner in which information companies aggregated content in the 1990s. The more addressable issue is to provide consumers with the information critical to their needs and to filter that information or content so that it is unambiguous.

The lesson for less advanced publishers is that building a concentration around siloed content is not enough; in fact, aggregating consumer interest and appeal around publishing content will fail unless that concentration includes content from the web, television, radio, newspapers, magazines, etc., organized, validated and served up in the most effective manner for the consumer. Information publishers have been able to evolve their model to support the needs of their professional customers, but the consumer market is more anarchic, and it remains to be seen whether trade publishers can pull it off. Silos may not be worth the effort.