All content owners are being encouraged to implement version 1 of the protocol, and Times Online has announced that it has implemented ACAP on its site. From the Associated Press:
The proposal, unveiled by a consortium of publishers at the global headquarters of The Associated Press, seeks to have those extra commands — and more — apply across the board. Sites, for instance, could try to limit how long search engines may retain copies in their indexes, or tell the crawler not to follow any of the links that appear within a Web page. The current system doesn't give sites "enough flexibility to express our terms and conditions on access and use of content," said Angela Mills Wade, executive director of the European Publishers Council, one of the organizations behind the proposal. "That is not surprising. It was invented in the 1990s and things move on."
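To make the contrast concrete, here is a minimal sketch. The first block is an ordinary robots.txt, which can only allow or disallow crawling by path; the second shows the kind of extended, ACAP-style usage terms the publishers are describing. The extended directives and their parameters (the "ACAP-" lines, max-retention) are illustrative only and are not quoted from the ACAP 1.0 specification.

    # Today's robots.txt: crawl access only, by path
    User-agent: *
    Disallow: /archive/

    # Hypothetical ACAP-style extensions (illustrative syntax)
    ACAP-crawler: *
    ACAP-disallow-follow: /news/                 # do not follow links found on these pages
    ACAP-allow-index: /news/ max-retention=30d   # drop cached copies after 30 days

The key design difference is that robots.txt expresses a binary crawl/no-crawl decision, while ACAP aims to attach conditions, such as retention periods and link-following rules, to content a crawler is otherwise allowed to fetch.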
Personally, I was initially skeptical about this initiative, but the organizers have delivered on their timetable, retained broad support, and even won active backing from some in the search community.
ACAP organizers tested their system with French search engine Exalead Inc. but had only informal discussions with others. Google, Yahoo and Microsoft Corp. sent representatives to the announcement, and O'Reilly said their "lack of public endorsement has not meant any lack of involvement by them." Danny Sullivan, editor in chief of the industry Web site Search Engine Land, said robots.txt "certainly is long overdue for some improvements."
Associated Press
1 comment:
Arguably you should stick with your scepticism a bit longer. That the Times has 'implemented' the protocol doesn't really mean that ACAP is implemented. As we know, it takes two to tango. When one of the search engines implements ACAP, then we could have something that matters. Lauren Weinstein remains sceptical, but not dismissive:
http://lauren.vortex.com/archive/000333.html