Tuesday, September 12, 2006
Never Saw it Coming
Who would have thunk it: the Senate Intelligence Committee has confirmed that the justification for war with Iraq was based on a number of erroneous conclusions (ed. lies, surely?). Well... whatever. Next, the committee moves on to confirming the existence of Santa Claus, Snow White and Gandalf. Stay tuned.
Monday, September 11, 2006
Disaster Books
This year has been a year of disaster books for me, in a small way. Earlier this year I read A Crack in the Edge of the World by Simon Winchester, about the San Francisco earthquake, and I have just finished Curse of the Narrows by Laura MacDonald, the true story of a massive explosion in Halifax, Nova Scotia in 1917. Both books were excellent, and they make for interesting reading given the current situation in New Orleans.
Most people know about the San Francisco earthquake and the resulting fire that destroyed most of the city, but Winchester, a geologist by formal education, spends much of his book describing the geological background to the earthquake. Using his drive across the United States as a narrative device, he describes the geology and geography of the country and provides background on other, lesser-known earthquakes and geologic points of interest. He actually finishes his journey in Alaska, 'passing through' San Francisco to describe the earthquake itself. What is particularly interesting about San Francisco's reaction to the disaster is the manner in which the city leadership dealt with the immediate aftermath and reconstruction. Almost as a circumstance of their location and the time in which they lived, the city leaders knew instinctively that they couldn't rely on federal government help and needed to take rapid responsibility for their own wellbeing. Help soon arrived, and an organized mechanism for disbursing aid and rebuilding the city got going quickly. Additionally, it was always assumed that San Francisco was vital to the economy of the West, so there was never any doubt about the economic viability of the city or the need to rebuild it.
In December 1917, a munitions ship collided with another vessel in Halifax harbor, setting off what remains the largest conventional explosion ever. The ship exploded in 'downtown' Halifax, and the blast was so powerful that Robert Oppenheimer studied its effects while researching the atomic bomb in 1944. Thousands died and the town was leveled. To make matters worse, a blizzard, then rain and flooding, then another blizzard followed over the next five days, further hampering rescue efforts. Help was sent from the US, particularly Massachusetts. Local doctors, themselves in shock, were forced to work in terrible conditions for many days as residents were dug out or treated for burns from the ensuing fires. Eye wounds were particularly prevalent because the explosion was preceded by a fire that ignited the munitions: many people were caught watching as the shock wave blasted every window in town into the faces of the onlookers. As relief flowed in, a citizens' emergency group was formed to manage the rebuilding and recovery of the town, and a concerted effort was made to take responsibility away from politicians. This was one lesson learned from the San Francisco recovery effort, which some believed had been slowed by politics.
All in all, these were interesting, well-written books, relevant today given the real recovery issues faced in New Orleans. It is fascinating to note that, with far fewer resources, the responses in these two cases were fast, early and effective in dealing with the problem at hand. In both cases the cities were happy for the assistance, but they weren't waiting for someone else to set the priorities and do the job for them. They got stuck in immediately.
Lastly, Laura MacDonald quotes from Disasters, a book by J. Byron Deacon published in 1918, which struck me as relevant to our current approach to disasters:
“It is the province of emergency relief to provide for immediate, common needs. The promptness and completeness with which they are met are the sole tests of efficiency. The province of rehabilitation is to help each family meet the needs peculiar to it and return to its normal manner of life. Its efficiency is tested by the degree to which it succeeds in accomplishing these results. Emergency relief plans and acts to meet present needs, rehabilitation plans and acts for ultimate welfare. All disaster relief should be a process of evolving from dealings with its victims en masse to treatment of them as individual families…need, not loss, is the basis of relief; there must be the fullest possible utilization of community and family resources for self-help; accurate determination of need, family by family, is the only basis for a just and effective distribution of relief; in addition to the needs which can be met by monthly gifts, there are others which can be met only by wise counsel and devoted intelligent personal service.”
Thursday, September 07, 2006
More on Google Archive
Eoin Purcell had a forthright and not disagreeable commentary on the Google Archive announcement. I take a similar view: newspapers are realizing they need the traffic to support their web presence, and allowing Google to index their content is great for us users (with a caveat) but also a monumental shift in how these newspaper publishers view themselves. That is especially true for the New York Times, which has visions of being the nation's (some think the world's) newspaper and a destination in itself. I also think this announcement is a harbinger of things to come: all database providers may find themselves having to open up to Google (and the others) and be indexed. That is just the way things will be.
The library and information database business is currently characterized by monolithic “packages,” and all the largest publishers have invested huge amounts to create “platforms” and “solutions” that serve as delivery mechanisms for their proprietary content. Google Indexing will become a large federated search engine for all this content, progressively (not immediately, and maybe not universally) undermining the 'platform' approach the publishers have pushed. Having said that, Google Indexing (for want of a better term) is not the whole answer and in fact, in the case of Google Archive, is missing a key element: a navigation tool that lets searchers identify content they already have rights to access via their public or academic library (or another contract with the data owner). This is the caveat I mentioned above.
The technology, called a 'link resolver', has been around for many years; implemented between the search query and the location of the material, it would let the searcher 'skip' the part where they would otherwise have to pay. Authenticating that the user has access is as easy as entering the user's library card number. Ideally and logically this only needs to be done once, so that the searcher can conduct another search in three weeks and skip even that step.
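To make that concrete, here is a minimal sketch of how a search result could be handed off to a library's link resolver using the widely supported OpenURL convention. The resolver base URL, library and citation are all hypothetical; a real deployment would use the patron's own library's resolver address.

```python
from urllib.parse import urlencode

# Hypothetical base URL of a library's link resolver (each library runs its own).
RESOLVER_BASE = "https://resolver.examplelibrary.org/openurl"

def resolver_link(citation: dict) -> str:
    """Build an OpenURL-style link asking the library's resolver whether
    the patron already has licensed access to this article."""
    params = {
        "sid": "newsarchive:search",          # identifies the referring service
        "genre": "article",
        "atitle": citation["article_title"],  # article-level title
        "title": citation["newspaper"],       # journal/newspaper title
        "date": citation["date"],
    }
    return RESOLVER_BASE + "?" + urlencode(params)

# Example: a pay-per-view search result the patron may already have rights to.
print(resolver_link({
    "article_title": "City Rebuilds After Quake",
    "newspaper": "The New York Times",
    "date": "1906-04-20",
}))
```

The resolver, not the search engine, knows which databases the library licenses, which is exactly why this hand-off step is the missing piece in the current implementation.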
Now, it is early days for this initiative and I expect improvements will be made rapidly. I did wonder, however, what libraries were saying about the announcement. Universally, the listserv comments on Web4Lib expressed disappointment with the implementation. Comments include “..the predominate number of articles were not free but pay-per-view..”, “…people will end up paying for things they have access to” and “..the search doesn't return anywhere like amount of content available via the library.” (If you want to read them, here is the link.) As I said above, it is early days, and I think the general public will enjoy playing with this Archive. For libraries, it represents another opportunity to ride the Google coattails and, via link resolvers, bring searchers into the library and turn them into patrons.
Wednesday, September 06, 2006
More Amazon Movie News, Google Newspapers, Bertelsmann
I wrote a post about Amazon's new 'movie platform' (my words), and the LA Times has a story on the anticipated Amazon movie service. Here it is. (The correlation to the EPIC 2015 video I linked to last week is interesting.)
Again, the Google factor is at play, generating huge coverage this morning. When I heard this story about providing search users with access to the digital archives of the New York Times, Wall Street Journal and others, I wondered what happens to ProQuest, which relies so heavily on revenues derived from its newspaper databases.
As many news outlets are reporting this morning, Vivendi has purchased Bertelsmann's music publishing division for over $2.0 billion.
Tuesday, September 05, 2006
Ads In Textbooks
A number of articles about advertising-supported textbooks got some media attention recently. I recall ads in travel guides, and they never did well; perhaps it doesn't help that many travel guide purchasers are armchair travelers. It was also difficult to keep the advertising current. I am doubtful that ad-supported textbooks will have much success either, but I did wonder whether the idea could be taken a little further.
As long ago as 1995, TV Guide was producing as many as 52 separate weekly editions of its guide. The aim was to create local versions that could carry local advertising (on top of the national advertising in all editions). Print production should likewise allow multiple, economically viable versions of a textbook. The question is whether publishers as a group would be interested in including advertising in their textbooks. If there was interest, and the cost of incorporating the ads was significantly less than the revenue (both big ifs), then a market for the advertising inventory would need to be established. Since the publishing schedule for textbooks is highly seasonal and inventory expires at a certain point, it could be relatively straightforward to set up an auction site for textbook ad inventory. (In a perverse way, could advertising in textbooks drive students' need to have the current year's edition... hmm?)
Key to this market would be how far the activities could be automated, to keep expenses as low as possible. Guidelines on page layout, ad size, image resolution, content, payment and so on would be easy to establish, and using an established marketplace such as eBay would reduce expense further by leveraging existing processes; indeed, the advertising industry recently began experimenting with eBay as a marketplace for broadcast advertising. A simple sketch of how such an auction might clear follows below.
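To make the mechanics concrete, here is a minimal sketch (not a real exchange) of how a textbook edition's limited ad pages might be allocated to the highest bidders, with every winner paying the highest losing bid, a standard uniform-price rule. The advertisers, slot counts and bid amounts are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    amount: float  # offered price per ad page

def run_auction(slots: int, bids: list[Bid]) -> list[tuple[str, float]]:
    """Allocate `slots` ad pages to the highest bidders; each winner pays
    the highest losing bid (or their own bid if every bidder wins).
    Purely illustrative, not a real ad exchange."""
    if slots <= 0 or not bids:
        return []
    ranked = sorted(bids, key=lambda b: b.amount, reverse=True)
    winners = ranked[:slots]
    clearing = ranked[slots].amount if len(ranked) > slots else winners[-1].amount
    return [(w.advertiser, clearing) for w in winners]

# Hypothetical example: three ad pages in one textbook edition.
bids = [Bid("LabSupplyCo", 1200.0), Bid("CalcToolsInc", 950.0),
        Bid("StudyAppX", 800.0), Bid("TutorNet", 600.0)]
for advertiser, price in run_auction(3, bids):
    print(advertiser, "pays", price)  # all three winners pay 600.0
```

A uniform clearing price keeps the rules simple for seasonal, expiring inventory, which is what would let such a site run with minimal staff.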
It seems more likely that an advertising model enabling an advertiser to reach across multiple markets via multiple publishers and titles would have a greater chance of success than trying to build a publishing program that relies solely on advertising to justify a title's viability. Who knows? Ad-based textbooks sound interesting while everyone is debating a publisher's right to make a reasonable profit, but in reality the idea is a sideshow.
Saturday, September 02, 2006
Google Lets it All Out
Any time the word Google is attached to anything, everyone reacts as if it is the second coming. Google has opened access to the public domain titles it has scanned, as if, as Mr Charkin points out, there aren't enough ways to get these already. Here, thanks to a link at Library TechBytes, is a vlog from Mobuzz TV that takes a surprising viewpoint in support of the library catalog.
Also, I still wonder about those 'out of copyright' titles with introductions penned in the fifties, sixties and seventies. What's with that?
And since we are on the topic (Google), you may have seen that both really cool and kind of frightening view of the future, c. 2014. Well, now they have updated it by a year. Here is the link to EPIC 2015. Off to the Google Grid...
Friday, September 01, 2006
US Open, Andre Agassi and Video Line Calls
The US Open has been great so far, with one of the best and most exciting matches in recent memory between Baghdatis and Agassi. Agassi's match with Blake last year was pretty good too, but this one was a true classic. Watching it live until 1:30 in the morning and jumping around the living room was exhausting.
Agassi has said this is his last tournament, and I wonder whether he will publish his autobiography in the coming years. He is certainly a personality who could move some units. While he has traditionally been very guarded about his upbringing and sporting life, he recently revealed more of himself in an article in Sports Illustrated.
This year marks the introduction of video line calls: a player gets to challenge a set number of calls per set via instant replay. When I heard about this, it seemed to me matches would become like The Price is Right, with fans screaming advice at the players. In fact, the implementation has been far better than that, but I am still not a fan of introducing this type of technology into sport. I don't approve of goal-line video or the camera used in cricket for run-outs. I don't want to seem old fashioned, but the ref is as much a part of the game as the players. The ref gets it right and wrong just like the players do, and that human element adds to the enjoyment and frustration of the game. If we wanted it perfect we should put a bunch of robots out there who never put a pass wrong, always score and are never bowled. How much fun would that be to watch? Sure, England would have beaten Portugal in the European championship, but it is the element of chance and unpredictability that makes sports so fun and interesting.
I can almost guarantee that someone is going to say the technology used to predict where the ball landed isn't good enough and will demand improvements. Next thing you know, there won't be any refs at the games at all; they will be in a dark room watching remotely as a computer makes the decisions.