Wednesday, August 18, 2010

PND Technology: KnowMore

This is week three in my recap of some of the interesting technology I've heard about at the tech meet-ups I've been going to (NYTech).

KnowMore is still in beta, but it looks promising and interesting to anyone who juggles many social networking relationships. And increasingly that is many of us.

KnowMore has created one view of all your social network relationships and presents that content in various streams that you, as a user, establish. (Here is the video; sadly the audio is poor, but good enough that you can still understand the presenter.) KnowMore doesn't care which network supplies the content; rather, it is focused on presenting all the content you and your social network are interacting with in a more logical and consistent way. For example, you are able to set up streams that collect all the videos and photos your network is looking at or commenting on, regardless of where they were located, so you can see a concentrated and focused itemization of this content. Additionally, KnowMore has incorporated a 'social search' function so that you can look at and search everything your network has shared. As they say in their presentation at NYTech, "who better to trust than the people you know and love to tell you what you should be interested in."
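KnowMore's actual data model isn't public, but the core idea of merging several networks into one typed stream can be sketched in a few lines. Everything below (the item shape, the network names, the sample data) is invented for illustration:

```python
from datetime import datetime

# Hypothetical item shape: (network, kind, title, timestamp).
def build_stream(items, kind):
    """Collect items of one kind (e.g. 'video') from every network,
    newest first, regardless of which network supplied them."""
    matched = [i for i in items if i[1] == kind]
    return sorted(matched, key=lambda i: i[3], reverse=True)

items = [
    ("facebook", "photo", "beach", datetime(2010, 8, 17, 9, 0)),
    ("twitter", "video", "demo clip", datetime(2010, 8, 17, 12, 0)),
    ("flickr", "photo", "skyline", datetime(2010, 8, 18, 8, 30)),
]

videos = build_stream(items, "video")   # one unified video stream
photos = build_stream(items, "photo")   # one unified photo stream
```

The point of the sketch is that the stream, not the source network, is the organizing unit: a 'photos' stream interleaves Facebook and Flickr items in one chronological list.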

It is difficult to determine whether 'aggregation' of our social networks will become a long-term play; however, KnowMore is an interesting starting point, and once it comes out of private beta it may be fun to play around with. Longer term, this functionality could be incorporated into your browser, but that ignores the ingenuity of companies like KnowMore to add layers and value to their aggregation solution over time.

Tuesday, August 17, 2010

Series: Content Curation

Over the past two months, I've looked at content curation as a theme, and I thought I would summarize my posts. As I've mentioned, the practice of curation is not a new one, as any librarian, television program director, or editor could attest; however, as media outlets become ubiquitous and content becomes overwhelming, the need for better curation is vastly underestimated as a business model and a consumer need.

My posts on this issue are as follows:

The Curator and the Docent:
Recently, as I wandered around a museum with an overwhelming breadth and depth of content, I was lucky to be guided in my travels by a professional. When she introduced herself to me, she used the term 'docent' to describe her function. A docent is a 'knowledgeable guide,' and the function seems to me to perfectly complement the process of curation. In an online world, where more and more content appears to "carry the same weight," we will look to and pay for the combination of curator and docent – sometimes the same person or entity – who can organize and manage a range of content and also engage with the user so they gain insight and meaning from the material. At Mywire.com, we intentionally approached branded media companies because they were recognized as experts in their segments. These are the companies that should be able to build revenue models around the curation of content, offering subscribers a materially different experience than a generic Google search query delivering up news and semi-relevant content.
Confusing a Silo with a Business
The lesson for less advanced publishers is that building a concentration around siloed content is not enough; in fact, aggregating consumer interest and appeal around publishing content will fail unless that concentration includes content from the web, television, radio, newspapers, magazines, etc., which is also organized, validated, and served up in the most effective manner for the consumer. Information publishers have been able to evolve their model to support the needs of their professional customers, but the consumer market is more anarchic, and it remains to be seen whether trade publishers can pull it off. Silos may not be worth the effort.
United Artists Redux

Amidst radical change forced on them by major advances in technology (largely out of their control), a small group of leading media producers have joined together to establish their own (insert word): broadcaster, publisher, studio, agency. Unlikely? Not now, because the functions that support these traditional media companies are increasingly becoming commoditised, enabling the creative producers (writers, authors, producers, etc.) to potentially collect more of the revenues generated from their creative output. While individual authors have gained some attention by 'going direct,' either by working through Amazon (J.A. Konrath) or direct to consumers via the iPad (Ryu Murakami), it may be that traditional publishers have more to fear from groups of authors, editors, and agents conspiring to establish their own media companies. These new companies would leverage the available low-cost 'back office' functions and the readily available supply-chain provision to disintermediate the traditional publishing monolith.
Silos of Curation - Repost
Something similar to the platform approach may take shape in a different way with intermediaries playing the role of curator. This is an approach that companies such as Publisher’s Weekly or The New York Review of Books might have adopted if they had been more prescient. The capability to guide consumers to the best books, stories, and professional content within a specific segment (without regard to publisher or commerce) may come to define publishing in the years to 2020. (See Monday’s post). Expert curation can simplify the selection process for consumers, aggregate interest around topics, and build homogeneous markets for commerce. As an added benefit to these intermediaries’ customers, publishers will choose to focus intensely on each segment and offer specialized value-adds particular to that segment. As content provision expands – witness the delivery of all the books in the Google Book project – readers will become increasingly confused and will look for help. It seems inevitable that intermediaries between publisher and e-commerce will meet that need.
Curating Research Data at Elsevier

Elsevier announced a partnership with Pangaea, a 'data library' that links primary research data with journal articles in earth and environmental science. As I mentioned last week, information and academic publishers like Elsevier have long organized themselves around content areas but are now 'widening' their content 'silos' to accommodate tools, techniques, and proprietary data provided by third parties. This is a good example of how the Elsevier 'platform' can be, and is being, leveraged beyond what may have originally been envisioned as a closed system.
Oh My Curation! It's about the Librarian

I chanced on a very interesting article in Harvard Magazine this week as I was doing research for a presentation I am making next week. Titled Gutenberg 2.0: Harvard's Libraries Deal With Disruptive Change, the article is written by the magazine's managing editor Jonathan Shaw and is one of the more thought provoking articles I have read about the impact(s) of our transition from print to online.

Several themes come out of this article. Firstly, traditional publishing is ill-equipped to manage the huge onslaught of data and information; the specific examples cited come from medicine. Secondly, training for the consumers and students who have access to databases and information is inadequate, and this negatively impacts job effectiveness; examples here include the medical and legal professions. Thirdly, librarians may retain specific skills that bridge the gap between generic content, where 'everything carries the same weight,' and a 'consciously curated and controlled artifact' managed to the benefit of a librarian's constituency.

Sunday, August 15, 2010

MediaWeek (Vol 3, No 33): Lynd Ward, Report on eBooks in Libraries, OCLC WMS, Arcade Fire

From the Seattle PI Blog:
In what just might be one of the publishing surprises and hi-spots of 2010, The Library of America will release a two-volume boxed set featuring the six woodcut novels of Lynd Ward. God's Man, Ward's first book, published on the eve of the stock market collapse of 1929, was the first wordless book-length novel to be published in the United States. By the end of 1937 Ward would publish five more novels in woodcuts: Madman's Drum (1930), Wild Pilgrimage (1932), Prelude to a Million Years (1933), Song Without Words (1936), and Vertigo (1937). If one is looking for the origins of the graphic novel in the United States one must begin with Ward. His work has influenced a generation of artists, poets and illustrators and continues to inspire those seeking justice and equality for all.
Library Journal reports on a study commissioned by COSLA that takes a look at ebooks in libraries. It is a very interesting report, with both situational analysis and recommendations. LJ's summary is good, but the entire report is worth a read (LJ):
A provocative new report released today by the Chief Officers of State Library Agencies (COSLA) on the upheaval caused by ebooks asks, "Is it different this time?" The answer, in "eBook Feasibility Study for Public Libraries," is a resounding yes, including a call for a national buying pool to buy ebooks--a tactic likely to face pushback from publishers and distributors. Still, the report serves as a rallying cry. "We want to create our destiny," COSLA says of the venture. "We want to be ready. We are tired of allowing others to decide these things for public libraries." The 53-page report consists of findings collected from interviews with ten library managers, covering a variety of topics and concerns--which were then discussed with other industry experts. Given the potential for e-reading to shift the emphasis of libraries away from repositories of print, the report also suggests public libraries emphasize their role as community centers for learning and events. ... The paltry nature of ebook collections available to libraries in comparison to consumer offerings prompts the report's most action-oriented suggestion: A single, national purchasing point for eBooks combined with expert selection, tough negotiation, and data mining that gives members a compelling story for local funders is a different beast from consortia that mostly fill operations or content gaps for have-not libraries. It forces a reckoning and concentrates eBook access to create real leverage. But it's a steep climb from where we are. Inspiration and leadership will be key. Indeed, major concerns about redirecting local funds to such an umbrella effort have been raised. The slightly weaker--though far more prevalent--formulation offered is to increase pressure on vendors and publishers, thus pushing for lower prices, standardized formats, and fewer digital rights management (DRM) restrictions.
But libraries face firm opposition, according to the report: "Publishers want library models that collect payment for every use"--as is the model in the UK--"lease access instead of sell objects, or have digital rights that enforce methods that worked for print, such as one copy one user."

OCLC's web-scale management system is in beta test with several libraries (AmLib):

The much-hyped OCLC Web-scale Management Services (WMS) moved from pilot phase to production last month with the release of acquisitions and circulation components to around 30 early adopters. The University of Tennessee at Chattanooga has posted an ambitious timeline that would make it the first institution to go live with the product on August 30; Pepperdine University Libraries in Malibu, California, is slated to come in second with a projected go-live date of October 11. Calling WMS “the future of the ILS,” UTC’s Jason Griffey, project lead for the WMS migration, told American Libraries that “using a centralized database of bibliographic records like WorldCat means that you simplify pretty much every other aspect of back-office procedures.” Web-scale Management Services moves acquisitions, circulation, and patron management into the cloud, putting those functions alongside WorldCat Local; the aim is to make workflows more efficient by automating critical back-office operations and reducing software support costs.

The New Yorker looks at how Arcade Fire represents both change and stasis in the recording industry (New Yorker):

Well-known acts like Radiohead and Nine Inch Nails have taken widely publicized steps to conduct business outside the major-label system, sometimes in experimental ways, such as leaving tracks from upcoming albums on U.S.B. drives in bathrooms to be discovered by fans. But both bands had spent more than a decade on major labels, building their audiences with the marketing power of large corporations behind them. In the U.S., Arcade Fire has only ever worked with Merge Records, an independent label from North Carolina, which was started by the musicians Mac McCaughan and Laura Ballance, in 1989. The band often records its albums in its own studios, to exacting and personal specifications, and retains ownership of the music, which it licenses to Merge. Its previous two albums have gone gold, or close to it, and “The Suburbs” is expected to do the same, or better. The new album is driven by the perfervid, jerry-rigged noise that has become Arcade Fire’s trademark, but it stretches over a deceptively calm sixty-four minutes. The lead singer, Win Butler, takes a surprising tack: the characters on this album aren’t all drowning, or caught in serial crises—they are getting on with things, and hoping to have children. Even as “The Suburbs” follows characters across lawns and through strip malls, it avoids obvious finger-wagging. Arcade Fire has previously worked in an epic mode, favoring anthems over smaller, more specific songs, but here its widely reported and entirely genuine energy is channeled, with nothing wasted—not a bonfire but a series of pilot lights. Watching an independent band sell out the Garden and top the charts while compromising very little—Arcade Fire released eight different album covers for “The Suburbs”—is inspiring, but it isn’t a complete revolution. The band still has a manager and a label who work on its behalf, commercially and artistically.
Scott Rodger, Arcade Fire’s manager, described the label’s role as “manufacturing and distribution—floating the expense, executing the marketing and retail plans that we have approved, and insuring that the music is available on all credible D.S.P.s,” or digital service platforms.

From the twitter last week (@personanondata):

It's official: Trenton's four library branches are closed - Trentonian

Buenos Aires Herald What's going on in BA book retailing you ask? Their hot 20 titles.

Publishing Economics: A $625 Cookbook NPR

Borders Group lays off more employees at Ann Arbor headquarters - AnnArbor.com

"Heeere's Johnny!" Carson Entertainment Group Unveils 30-Year Carson Library Carson And Steven Wright

Friday, August 13, 2010

Repost: Book Insurance

Originally posted July 14, 2009.

Few in the book world can see an end to DRM on book content, even as glimmers of a new dawn in the music world seem to indicate there may be a different future on the horizon from the one that book publishers are trying so desperately to avoid. Rampant file sharing and ineffectual (even legal) efforts to halt copyright infringement represent the atomic winter that consumer book publishers fear, and thus they believe the only way to preclude that future is to impose severe restrictions on a consumer's ability to use the electronic content they have purchased.

It's not news to anyone paying attention that the 'rights' a buyer has when they buy a physical book are proactively eliminated in the eBook world. For example, in the eBook world it becomes difficult to lend my book to someone else to read (friend or family member) or to sell the book on the second-hand market. I really don't own it in the traditional sense. Things can become even trickier if I buy from multiple eBook providers, change 'platforms,' or even (strangely) lose my credit card, since some vendors attribute your purchases to a specific credit card.

In an environment where DRM places limits on interoperability, moving from one platform to another while keeping your library of books becomes difficult. If you liked the Sony reader but something better comes along, you may have to keep the Sony ready to go for years, even though you and the technology have moved on. It would be a far better experience for users if the book were the constant, not the technology. As long as booksellers and publishers maintain this cabal over DRM, there are no easy solutions for buyers who find themselves tethered to the technology and not the content.

Perhaps an unlikely solution would be to provide a type of digital insurance. Some (new) third party would offer this service to content buyers as a type of insurance or escrow policy. On purchasing content, I would register the purchase as part of my profile. Obviously, I would have to provide some proof that I had made the purchase, but the transaction would sit in this profile as long as I paid my monthly premiums. The amount paid by the consumer wouldn't have to be much, because only a small number of the 'members' would ever make use of the insurance. (It becomes an actuarial exercise.)

Circumstances under which a user would make use of the insurance could be anything from 'passing your library on to a family member' to simply moving over to a new platform. Depending on how the insurance company was set up (possibly as a pseudo-retailer), it wouldn't host this content, but it would allow the consumer to 're-purchase' the content and then submit a 'claim' for the purchase. For each 're-purchase' the consumer would get a refund, just as with a traditional insurance company. (And maybe the following year your premiums go up, too.) There may be other benefits to this solution, including the return of the right of first sale: as registered owners, maybe we build a secondary market for e-books.
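The register/premium/claim mechanics described above can be sketched as a toy model. Everything here (class name, premium amount, member names) is invented for illustration; a real scheme would need proof-of-purchase verification and actuarial pricing:

```python
# Toy sketch of the 'book insurance' idea: members register purchases,
# pay premiums, and claim a refund when they re-purchase a registered
# title (e.g. after switching reading platforms).
class BookInsurance:
    def __init__(self, monthly_premium=2.00):
        self.monthly_premium = monthly_premium
        self.registry = {}     # member -> set of registered titles
        self.collected = 0.0   # premium pool

    def register_purchase(self, member, title):
        # Proof of the original purchase is assumed to be checked out of band.
        self.registry.setdefault(member, set()).add(title)

    def pay_premium(self, member):
        self.collected += self.monthly_premium

    def claim(self, member, title, repurchase_price):
        """Refund a re-purchase of an already-registered title;
        unregistered titles get nothing."""
        if title in self.registry.get(member, set()):
            self.collected -= repurchase_price
            return repurchase_price
        return 0.0
```

The scheme is viable only if, as noted above, claims stay rare relative to the premium pool; that is the actuarial exercise.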

Yes, even I think this is a pretty wacky idea, but with e-book content still less than 10% of total revenues, with publishers exhibiting apparently limited interest in pushing growth faster, and with formats and technology likely to remain fluid for a long time, consumers will increasingly become dissatisfied and disgruntled over the limitations (and mistrust) that publishers and retailers are imposing on their purchases. There could be a better solution.

Insurance anyone?

Thursday, August 12, 2010

Ironwork Escape: New York 1993

Ironwork Escape, New York 1993
A weekly image from my archive. Click on the image to make it larger.

The ironwork hangs on many of these Lower East Side buildings like decorative ornaments. Oddly, there aren't any air conditioners in any of the windows.

Wednesday, August 11, 2010

PND Technology: twilio

This is week two of my recap of some of the interesting technology I've heard about at the tech meet-ups I've been going to (NYTech).

Over the years, I've had the dubious distinction of being responsible for several office moves, and, aside from the bickering over who gets the bigger office and what type of furniture to buy, some of the more problematic issues have involved dealing with the old telephone PBX. Twilio can't help with the baser issues, but they have eliminated the hardware problems inherent in the old phone systems and pushed a powerful, easy-to-use set of applications to the cloud that can manage the most sophisticated phone applications.

Here's how they explain how their system works:
We're always building web applications, and sometimes we want those apps to be able to interact with phone callers. Maybe we want a customer to be able to call in and get information, or maybe we need to coordinate our employees more efficiently. Before Twilio, you would have had to learn some foreign telecom programming languages, or set up an entire stack of PBX software to do this. At which point, you'd say "aw, forget it!" Twilio lets you use your existing web development skills, existing code, existing servers, existing databases and existing karma to solve these problems quickly and reliably. We provide the infrastructure, you provide the business logic via HTTP, and together we rule the world.
The demos at the NY Tech meet-up are only five minutes long; however, in a demonstration of how easy their tool is to use, they wrote a simple script that created a dial-in conference call, selected (purchased) a specific phone number, and then invited the entire audience to dial in. Programmed in simple XML, this took two minutes of fast typing. The system naturally collects all the dial-in numbers, and as a follow-up demo they used the application to call back everyone in the room who had dialed in to the conference number.
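The conference demo is the kind of thing Twilio's XML instruction set (TwiML) makes short: when a caller dials the purchased number, Twilio fetches a document like the one below from your web server and follows it. The greeting text and conference room name here are invented for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <!-- Greet the caller, then drop them into a named conference room;
       every caller who reaches this document joins the same room. -->
  <Say>Welcome. Joining the meet-up conference now.</Say>
  <Dial>
    <Conference>nytech-demo</Conference>
  </Dial>
</Response>
```

Because the document is served over plain HTTP, the "script" the presenters wrote is really just a web page, which is the point of their pitch: existing web skills, no PBX software.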

There are all kinds of business applications that can be created almost on the fly, and certainly directed specifically at a business issue or situation. Some examples include polling, status updates such as weather problems or power outages, reminders such as appointments, as well as the typical voicemail transcription and SMS functionality.

There are many more practical examples noted on their blog including:
MedTaker takes advantage of the ubiquity of the phone to help people remember to take their medication while at the same time periodically checking in on their wellbeing.
Life is full of so many little details that need attending to all the time. Would you rather be coding up your next prize-winning Twilio app, or assembling IKEA furniture? What about getting a ride to the airport, or grocery shopping? Fortunately, with TaskRabbit you can delegate these tasks to "runners" who you pay by the hour to help you get things done.

Diner Connection is a complete online solution for restaurants. You can contact your customers via text messaging, collect patron visit information and connect with your patrons more effectively using Diner Connection.

DropConf is an on-demand conference calling application -- you pay per conference call. The idea is that small business or freelancers for example might only need one or two conference calls a month. Some months they might not need any conference calls. All the other paid options out there have monthly fees -- so people are paying for a service they don't need.
And many, many more.

Tuesday, August 10, 2010

Curating Research Data at Elsevier

Elsevier announced a partnership with Pangaea, a 'data library' that links primary research data with journal articles in earth and environmental science. As I mentioned last week, information and academic publishers like Elsevier have long organized themselves around content areas but are now 'widening' their content 'silos' to accommodate tools, techniques, and proprietary data provided by third parties. This is a good example of how the Elsevier 'platform' can be, and is being, leveraged beyond what may have originally been envisioned as a closed system.

From their press release, they note that this initiative extends one announced in February:

This next step follows the introduction, last February, of 'reciprocal linking' - automatically linking research data sets deposited at PANGAEA to corresponding articles in Elsevier journals on its electronic platform ScienceDirect and vice versa. The new feature adds a map to every ScienceDirect article that has associated research data at PANGAEA; it displays all geographical locations for which such data is available. A single click then brings the user from the ScienceDirect article to the research data set at PANGAEA.

"With an increasing interest in the preservation of research data, it is very important to make those data clearly visible in the context of the formal research publications," commented Jan Aalbersberg, Vice President Content Innovation at Elsevier. "Elsevier is committed to advance science by investing in such collaborations with data set repositories. This new feature will allow readers to easily go beyond the content of an article, and drill down to the research data sets."

As the press release goes on to say, we are starting to see how the web, the use of APIs, and other methods are eliminating the inefficiencies in sharing research data and analysis that academics have had to navigate around for many years. Ironically, while these large information companies may 'open' their platforms to produce much more utility for their subscribers, they may also be strengthening their positions as the clear leaders in providing information, analysis tools, and other key functionality for their users. Their strategy continues to reflect the curation model I've discussed before, although the evidence of this now extends far beyond the concentration of topic-based content.
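The 'reciprocal linking' the press release describes amounts to a two-way index between articles and data sets, plus coordinates for the map feature. The sketch below only illustrates that structure; the identifiers and coordinates are invented, and the real ScienceDirect/PANGAEA plumbing is certainly different:

```python
# Hypothetical reciprocal-link index: an article points at its data
# sets (with geographic coordinates for the map), and each data set
# points back at its article.
article_to_datasets = {
    "article-123": [
        {"dataset": "pangaea-777", "lat": 54.1, "lon": 7.9},
        {"dataset": "pangaea-778", "lat": -64.8, "lon": -62.6},
    ],
}

# The reverse direction is derivable from the forward one.
dataset_to_article = {
    d["dataset"]: article
    for article, datasets in article_to_datasets.items()
    for d in datasets
}

def map_markers(article_id):
    """Points to plot on the article's map, one per linked data set;
    clicking a marker resolves to the data set record."""
    return [(d["lat"], d["lon"])
            for d in article_to_datasets.get(article_id, [])]
```

Maintaining both directions is what makes the feature 'reciprocal': a reader can start from either the journal article or the data repository and reach the other in one hop.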

Prior post on Massive Data Sets
Posts on Content Curation