
Monday, May 15, 2017

Moody's to Buy Amsterdam-based Publisher Bureau van Dijk (BVD) for $3.3 Billion

From Reuters:

Credit ratings agency Moody's Corp (MCO.N) said on Monday it would buy Dutch financial information provider Bureau van Dijk for about $3.3 billion, to extend its risk data and analytical businesses. 
Moody's will fund the deal through a combination of offshore cash and new debt financing.
Amsterdam-based Bureau van Dijk, owned by the fund EQT VI, distributes financial information and private company datasets of 220 million companies.
The deal is expected to benefit Moody's revenue and earnings in 2019, while adjusted earnings in 2018 are expected to see an uptick.
From the press release:
Moody’s Corporation (NYSE:MCO) announced today that it has entered into a definitive agreement to acquire Bureau van Dijk, a global provider of business intelligence and company information, for €3.0 billion (approximately $3.27 billion). The acquisition extends Moody’s position as a leader in risk data and analytical insight.

“Bureau van Dijk is a high growth information aggregator and distributor that positions Moody’s at the center of a unique network of global risk data,” said Raymond McDaniel, President and Chief Executive Officer of Moody’s. “This acquisition provides significant opportunities for Moody’s Analytics to offer complementary products, create new risk solutions and extend its reach to new and evolving market segments.”
“Moody’s is a highly regarded, authoritative source of credit ratings and analytical tools, with a strong brand and global reach,” said Mark Schwerzel, Deputy CEO of Bureau van Dijk. “The addition of Bureau van Dijk’s powerful information platform to Moody’s Analytics’ suite of risk management solutions presents a wide range of opportunities for us to better serve our combined customer base.”
Bureau van Dijk, operating from its Amsterdam headquarters, aggregates, standardizes and distributes one of the world’s most extensive private company datasets, with coverage exceeding 220 million companies. Over 30 years, the company has built partnerships with more than 160 independent information providers, creating a platform that connects customers with data that addresses a wide range of business challenges.
Bureau van Dijk’s solutions support the credit analysis, investment research, tax risk, transfer pricing, compliance and third-party due diligence needs of financial institutions, corporations, professional services firms and governmental authorities worldwide.
In 2016, Bureau van Dijk generated revenue of $281 million and EBITDA of $144 million. Bureau van Dijk will be reported as part of Moody’s Analytics’ Research, Data & Analytics (RD&A) unit. Moody’s expects approximately $45 million of annual revenue and expense synergies by 2019, and $80 million by 2021. On a GAAP basis, the acquisition is expected to be accretive to Moody’s EPS in 2019. Excluding purchase price amortization and one-time integration costs, it is expected to be accretive to EPS in 2018.
Moody’s will fund the transaction through a combination of offshore cash and new debt financing. The acquisition is subject to regulatory approval in the European Union and is expected to close late in the third quarter of 2017.
Bureau van Dijk is owned by the fund EQT VI, part of EQT, a leading alternative investment firm with approximately €35 billion in raised capital across 22 funds. EQT funds have portfolio companies in Europe, Asia and the U.S. EQT works with portfolio companies to achieve sustainable growth, operational excellence and market leadership.
"We are very pleased with Bureau van Dijk's development under EQT ownership and want to thank management and employees for their hard work and dedication. We see an excellent fit between Bureau van Dijk and Moody’s Analytics, and congratulate Moody’s on acquiring this uniquely positioned company," said Kristiaan Nieuwenburg, Partner at EQT.
 
Some older stories on BVD from the blog:

Private equity owners put the company up for sale in 2007 and had trouble selling it at the time. Reports suggested it sold then for about $1.0 billion, so this deal represents quite an impressive return over 10 years for some group of owners.
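
For a rough sense of just how impressive, here is a back-of-the-envelope sketch (a sketch only, assuming the reported ~$1.0 billion figure from 2007 and Moody's ~$3.27 billion purchase price, and ignoring interim dividends and ownership changes):

```python
# Implied annual growth in Bureau van Dijk's value, using the two
# approximate price points cited above (both are reported figures, not exact).
price_2007 = 1.0e9    # reported ~$1.0 billion value around the 2007 sale attempt
price_2017 = 3.27e9   # Moody's purchase price, approximately $3.27 billion
years = 10

cagr = (price_2017 / price_2007) ** (1 / years) - 1
print(f"Implied compound annual growth in value: {cagr:.1%}")  # roughly 12.6%
```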

Wednesday, July 31, 2013

White House Sponsored DataJam Promotes Open Data Initiatives

There is a lot of machine-readable data coming from the federal government as a result of the Obama Administration’s open-data initiatives. Open data was a platform plank of the President’s first campaign and, in his second term, he has reinvigorated it with a new spate of policy initiatives, executive orders and community outreach.

The result of the outreach program was on display at a recent Datajam event I attended at the White House Conference Center (near, not at, the White House). Sponsored by the White House Office of Science and Technology Policy and CENDI, the event invited technologists, researchers, publishers and data owners to weigh up ways of using the data and content rapidly being made accessible by almost every federal agency. As a group, we were challenged by White House Chief Technology Officer Todd Park to engage our inner entrepreneurial spirits and think up new and innovative uses for government data. In a lively opening speech, Park promised the datajam audience member who came up with a viable idea or product within 60 days that he would “make [him or her] famous.” He pointed to several examples of health-data-related products which had come out of similar meetings recently, and was enthusiastic about this group’s ability to produce some interesting ideas.

Government information and data is a by-product of the taxes we pay to maintain the federal agencies, and, Park confirmed, “the administration is looking to maximize taxpayer return on government-produced data by opening it up so many people can access and use it.” Citing as a guiding principle ‘Joy’s law’ – that the smartest people in the world will always be working for someone else and that collaboration is an imperative if real progress is to be made – Park suggested that bringing people together in groups like ours is important both for promoting open data and for actually devising worthwhile uses for it. Furthermore, he made the point that “open data by itself is useless and only useful if it gets applied to something and produces value.” He encouraged us to “use ‘our data’ to produce awesomeness - where stuff can actually happen.”

Earlier this year, President Obama signed an executive order requiring all federal agencies to open access to all government-sponsored published content by the end of 2013. This has produced a frenzy of activity at some agencies to determine what they have and how to make the materials accessible. Some agencies are more mature in this respect than others but, CTO Park confirmed, the President is passionate about open data and has made specific commitments to fulfilling this policy.

In a speech in Austin, TX, in May, the President cited several examples of start-up companies working with open data: “StormPulse uses government-produced weather data to help businesses anticipate disruptions in service. Another company based in Virginia called OPower uses government trend data to save consumers $200m on their energy bills.” Obama also mentioned an app called iTriage, founded by a pair of doctors, that uses data from the Department of Health and Human Services to help define symptoms and find appropriate health care for the patient. In the same speech, the President announced that his administration is making even more government data available and he expected that this and his other open-data initiatives would help launch even more start-ups similar to the ones he mentioned. 



The data available would help “more entrepreneurs come up with products and services that we haven’t even imagined yet”.

More recently, in a speech just two weeks ago, President Obama suggested that we are part of a process to build a better, more open-data America. Hyperbole aside, there will be no going back once these policies are in place, and this push by the executive and agencies of the federal government is likely to profoundly change how we, as citizens, interact with the government. Frequent examples of open data initiatives cite the use of satellite imagery and data from the National Oceanic and Atmospheric Administration (NOAA), which have produced the now ubiquitous mapping and weather apps; however, given the “fire hose” nature of the data and information on offer, these early successes are likely to represent only a fraction of the opportunities created by the government’s open data initiatives.

Commercial publishers of government-funded and/or -produced content and data are spooked by some of these moves by the government. During our meeting it was mentioned that one only needs to search for “aspirin” on Google to see how accessing government-produced content via API can produce content that looks very much like a drug handbook entry from Elsevier, Wolters Kluwer or some other commercial medical publisher. The government often makes reference-like content a requirement of various approval processes, and thus, we may be about to see professional reference content undergo some profound changes. And that is just one small example of what could happen to commercial publishing.
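
As an illustration of what “via API” looks like in practice, here is a minimal sketch of pulling structured drug-label content from a public government endpoint. The specific endpoint (openFDA’s drug-label API) and field names are assumptions chosen for illustration; the point is simply that machine-readable government content can be reassembled into something resembling a commercial reference entry.

```python
import json
import urllib.parse
import urllib.request

# Illustrative only: a public government drug-label endpoint (assumed here to
# be openFDA's label API; any open federal API with structured reference
# content would make the same point).
BASE_URL = "https://api.fda.gov/drug/label.json"

def fetch_drug_label(brand_name: str) -> dict:
    """Fetch the first structured label record matching a brand name."""
    query = urllib.parse.urlencode({
        "search": f'openfda.brand_name:"{brand_name}"',
        "limit": 1,
    })
    with urllib.request.urlopen(f"{BASE_URL}?{query}") as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    record = fetch_drug_label("aspirin")["results"][0]
    # Sections such as indications_and_usage read much like a commercial
    # drug-handbook entry, which is the comparison made above.
    print(record.get("indications_and_usage", ["(section not found)"])[0][:500])
```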

As a direct result of the open-data initiatives both in the US and Europe, the Association of American Publishers and the Society for Scholarly Publishing (in collaboration with partner CrossRef) have established an initiative named CHORUS. (The EU is said to be about to press for greater open-data requirements than we have in the US.) Through CHORUS, publishers aim to avoid a PubMed situation and manage the open data and open access content requirements themselves; publishers whose content is also available on PubMed see significant decreases in traffic once PubMed opens access to that content. Publishers believe that, by setting up their own open-access service, they will be able to fulfill the government’s open-access requirement and mitigate the impact on their own business models.

Regardless of the risks to current publishers and their business models, it appears that the government produces a lot more content and data than is currently being commercialized by publishers. The sheer amount of data is overwhelming and, as long as the President continues to promote open data, we’ll see hundreds of new products and services develop in the short term as entrepreneurs take CTO Park up on his promise to make them famous.

Wednesday, July 24, 2013

A Funny Thing Happened on the way to History

It is hard not to find the humor in this announcement from the American Historical Association, which in a recent policy statement suggests that universities allow embargoes of up to six years on publishing digital dissertations.  Windowing hasn't proven effective in trade publishing and it is just as unlikely to work here.  The nub of their argument is that digital content can be so widely distributed, in contrast to those pesky paper dissertations or UMI versions, that academics risk harming their opportunities for tenure if they can't get their work published.  There is some logic to the policy, in that the society wants to protect scholarship by providing a financial incentive for both the author and the publisher, but for all practical purposes this is wrongheaded.  Having a dissertation widely available in advance of a proposed book, it is argued, would torpedo a potential book deal.

The policy runs counter to the growing calls for more open access, which often come from researchers and academics but not publishers.   It would seem that, rather than propose a forward-looking solution that takes full advantage of current media culture and technology, the AHA has chosen the historical approach.  In the process, they have attempted, at a very late date, to turn back the clock on scholarly publishing.

That leads me to conclude that any young researcher or PhD student would view the association as one they would not want to actively participate in.  Ever.  And that's history for you.

In their announcement they cut straight to the chase:
The American Historical Association strongly encourages graduate programs and university libraries to adopt a policy that allows the embargoing of completed history PhD dissertations in digital form for as many as six years. Because many universities no longer keep hard copies of dissertations deposited in their libraries, more and more institutions are requiring that all successfully defended dissertations be posted online, so that they are free and accessible to anyone who wants to read them. At the same time, however, an increasing number of university presses are reluctant to offer a publishing contract to newly minted PhDs whose dissertations have been freely available via online sources. Presumably, online readers will become familiar with an author’s particular argument, methodology, and archival sources, and will feel no need to buy the book once it is available. As a result, students who must post their dissertations online immediately after they receive their degree can find themselves at a serious disadvantage in their effort to get their first book published; it is not unusual for an early-career historian to spend five or six years revising a dissertation and preparing the manuscript for submission to a press for consideration. During that period, the scholar typically builds on the raw material presented in the dissertation, refines the argument, and improves the presentation itself. Thus, although there is so close a relationship between the dissertation and the book that presses often consider them competitors, the book is the measure of scholarly competence used by tenure committees.

Friday, March 01, 2013

Presentation at NFAIS Conference on 2013 Predictions


Nfais 2013 from Michael Cairns

Presentation to NFAIS Annual Conference, February 25th, 2013

Thank you for inviting me.

Slide 1: Many years ago, I moderated a strategy workshop with a group of executives.  To kick things off, I wrote 12 potential business scenarios that could impact the future of the business and placed each separately on posters around the room.  I then asked each of the participating executives to agree or disagree with the premise of each scenario which they were to do without speaking to each other.  Once done we convened and discussed the results.  This exercise can be lots of fun and drive intense discussion about strategy and is always useful in breaking the ice if you have a group of executives who don’t know each other that well.  My client was a trade publisher and one of the scenarios was titled “Oprah is elected President” which was intended to drive discussion about what would happen when Oprah’s book club ended.  Such was Oprah’s power at the time that all the participants agreed she would be elected. 

Slide 2: I bring this up now because when we predict the future, which is what I am about to do, we sometimes leave ourselves open to ridicule later on.  I hope I don’t leave you all laughing by the end of this presentation.  I’ve been blogging for seven years and each year I spend some time thinking about the industry and post my predictions in the first week of January.  The post is not intended to be comprehensive; it covers just what interests me about what I see happening.  I have around 10,000 subscribers – who they are I have no idea – but I am fairly confident that 1% of that number are actual readers.  I say that facetiously, but if you happen to be one of the 1%, firstly thanks and secondly, I apologize that some of what I’ll be talking about today is duplicative.  Thanks to Jill O’Neil, who is clearly one of the 1% and who asked me to speak today.

Slide 3: I am very interested in the concept of content delivery ‘platforms’ which aggregate, serve and engage users around specific types of activity.  Several years ago I spoke at the Frankfurt Book Fair, where I used the example of LexisNexis to show how they had used the platform construct to radically redefine their competitive marketplace.  Delivery platforms aren’t a new idea and professional publishers have done a lot of work with the concept over the past ten years.  It is still true, however, that many content owners and publishers have trouble with the idea that their traditional product – books, journals, etc.  – must be extensible to include applications, source data, user data, third-party content and “functionality”. 

Slide 4: Even using the word “functionality” together with ‘book’ or ‘journal’ bemuses them.
As I thought about this year’s predictions, I was especially interested in how the platform construct would apply to educational publishing.  This is actually my little secret: blogging, in particular the longer pieces I’ve written – my predictions being an example – flows from my need to make sense of what I see going on in the markets where I work. 

Slide 5: Blogging represents an important aspect of my knowledge and understanding of the business.  My specific interest in higher education has been fueled by my recent work with several education companies.  I’ve seen firsthand how the influences I will speak about are starting to become mainstream.   Before we get into that, let’s catch up on what has happened in publishing over the past year or so.

Slide 6: Most of the innovation and change is happening on the edges of the publishing industry, and we’ll get to that in a minute.  To most traditionalists – or those still clinging to traditional publishing – it might seem that we’ve entered a period of stasis as publishing transforms itself.  At the end of 2011, it seemed to me that in their routine operations many publishers had realized the transition to electronic content delivery and absorbed the implications.  (That’s not the same thing as saying they have solved their problems.) So, perhaps, the past twelve months have been about catching our collective breath given the huge changes the Kindle and the iPad forced on publishers. 

Slide 7: That said, anyone who thinks the big changes are behind us is probably fooling himself, and may be lulling himself into catastrophic inaction.  Harbingers of dislocation and change are easy to see: You don’t have to go far. 

Slide 8: In the second half of 2012, we saw a slowdown in the growth rate of eBook unit sales; indications of a possibly significant substitution of tablets for eBook readers; a reconfirmation in several examples of a lack of enthusiasm by students for eBook based learning; a major strategic publishing merger destined to create a trade publishing goliath; and the sale of one of the big three education companies. 

Slide 9: Each of these would be significant in its own right, but taken as a group they suggest to me that more – rather than fewer – changes are on their way.  The expectation that the big trade houses would consolidate has persisted for at least five years: in fact, it is more surprising that the Random House/Penguin deal didn’t happen sooner, and now that it has, it’s a foregone conclusion that there will be another trade merger announced in the next few months, involving some combination of HarperCollins, Simon & Schuster and Hachette.  Perhaps all three will combine, which would equal the deal announced last year in scale and significance.  But that’s unlikely.  One publisher will almost certainly end up the “odd one out” and it will be interesting to see which it is and what they do next.

Slide 10: To segue slightly into the merger activity: the justification for a merger is often presented as an opportunity to save cost and expense, apply economies of scale and/or gain access to a new market.  At this point, expense and efficiency gains are more likely to be the primary drivers in both the McGraw-Hill and Random House/Penguin cases. 

Slide 11: Each publisher will reduce headcount, facilities, distribution and other areas in order to deliver the same total quantity of titles.  They will be able to spread their investment over a larger number of products, which is particularly important as they take full advantage of the move to digital.  In all publishing segments the value chain is compacting, making it far easier for content producers/authors to reach consumers directly.  This, in turn, is changing the financial model on which publishing is based.  The functional areas where publishers added margin in order to make a profit – overhead, distribution, marketing and sales – are becoming less important (though not unimportant).  The implications of these changes for publishing houses in the context of the transition to digital have been clear for many years, but addressing how their businesses must change to cope with them is nowhere near complete in the larger houses in both trade and educational publishing.  Smaller, more nimble trade publishing companies like Hay House and Sourcebooks have travelled much further down this path, and I should make a clarifying point here: professional publishing is far, far down Transition Highway.  I’ve frequently used examples from professional publishing, such as LexisNexis, to show what change may look like to some of the laggards in the other segments.

Slide 12: On the education front, there has been widespread speculation that some merger of Cengage and McGraw-Hill Education will take place this year, since the two companies may end up with a common owner.  If they do, there may not be a full combination in the short term, but some trading of assets may take place immediately to rationalize the respective businesses, with deeper integration to come, perhaps, in 2014-2015.  Ultimately, 2013 may bring more significant change in the trade and educational landscape than we’ve seen in many recent years.  There will be a lot of focus on the big trade merger, and the industry’s other players will have to fight aggressively not to lose any advantage.  “Bigger will be better” when it comes to applying economies of scale in a business whose underlying business model is changing radically.  In education, we may be paying attention to McGraw-Hill and Cengage, but Pearson, as the market leader, is likely to embark on even more aggressive strategies this year.  Under its new CEO, and with the divestiture of Penguin and possible sale of the FT Group, the company has forcefully declared education to be its focus.  In summary, a fairly active last 12 months, with indications that there is more to come.  Now, I’d like to return to discussing the changes in education and the potential for change in educational publishing. 

Slide 13: As noted, I expect the platform construct to impact educational publishing.  In fact, Pearson began adopting the approach as long ago as five years.  You’d have to be living in a hole in the ground – or certainly somewhere without Internet – not to know there are vast changes underway in the higher education market.  These changes will alter everything we currently know and assume about how higher education functions: the “what, where, how, and when”.  Indeed, ‘who’ a student is may be one of the most fundamental changes we’ll see, relative to how we define a student today.

Slide 14: In education more broadly, all education-content companies (other than Pearson) are only at the beginning of their transition from content providers to embedded content and services providers.  Professional information publishers such as Bloomberg, Thomson and Elsevier have long been able to provide aggregated content and services at the point of need, and education publishers will be doing the same thing in the not-too-distant future.  At the Consumer Electronics Show in January, McGraw-Hill made some interesting announcements about product development investments they have been making which presage how this “services approach” may take shape.  But it is still early days.  We will see an aggregation model emerge in education, where content ‘platforms’ deliver content and services based on a different financial model than the current retail or ‘student buys the book’ model.  Publishers are being pushed by some important customers: initiatives underway in California, Minnesota and Indiana, for example, show that experimentation is starting to happen with more frequency and publishers are being challenged to think differently about their market.  As I prepared this presentation, I used my predictions post as the framework, but I also did some additional research – not least because if I had relied entirely on my blog post we would be done by now.  One of the research nuggets I came across concerned the effectiveness of education. 

Slide 15: A study found that 45 percent of students surveyed said they had had no significant gain in knowledge after their first two years of college.  That is the students saying they haven’t learned anything!

Slide 16: Higher education is straining to prove its relevance and effectiveness in the 21st century while simultaneously saddling the average student with more than $100,000 in tuition debt, which the student will then strain to pay because she hasn’t acquired the right skills for employment and has to take a low-paying job.  We are starting to see how the Internet and technology are helping to drive change in education to break this cycle.  Of course, change can be worrying, especially for the incumbents with the most to lose, and no one takes on change willingly if it hurts.  In education, we have an industry that is especially structured and entrenched, where ‘tradition’ is almost its defining characteristic.  For a lot of players, this is a cushy existence, but it will not last much longer. 

Slide 17: The experience of other industries shows that transformation can be liberating and has the capacity to unleash new economic value.  As more and more experimentation in education takes place, steadfast resistance to change will wilt as new models, wider access and better outcomes help create new economic value.  I couldn’t find hard data on how much new economic ‘value’ Craigslist unleashed as it redefined the newspaper classified advertising business.  Maybe the data doesn’t exist, but I think the value is considerable.  Craigslist is easy, cheap and measurably effective, and newspapers failed by comparison.  Anyone know AirBnB? Using AirBnB you can turn your spare room – or your pool house – into a hotel room.  AirBnB has been around for about four years and is booking more room nights than Hilton.  Think about that.  My friend’s pool house in Beverly Hills was unmonetized but now it helps pay the mortgage. 

Slide 18: ZipCar is another example and there are many others.  In the media world, Facebook, Amazon and iTunes are the obvious examples but WalMart could also be considered a “platform”. 

Slide 19: What platforms do is ‘normalize’ a set of behaviors that occur when people/customers communicate and transact information, goods and services.  As the platform attracts more users – presumably because it creates value for those users – the cost of providing the platform becomes cheaper.  The delivery of education is also a network of transactions between suppliers, faculty (university) and students.  Most of these transactions are ‘physical’ but they are rapidly going electronic, and in the process they take advantage of ‘network’ effects that make communication, transactions and services easier, more affordable and widely available.
The examples of recent radical change in disparate businesses such as music, newspapers, airlines and advertising confirm the inevitability that education will become yet another industry to evolve in the same fashion.  Investment money is flowing to new companies seeking to take advantage of a business in transition, which is why private-equity investment in education is increasing rapidly year on year, from $100 million in 2007 to nearly $400 million last year.  What is happening in education is very exciting.  The manner in which teaching is delivered, how content is created and how success is measured are all under stress.  A primary enabler of this change is technology, and specifically the Web – which will be obvious to most of us here. 

Slide 20: Over the past 18 months, the higher education establishment has been rocked by the development of Massive Open Online Courses, or MOOCs.  This direct-to-student model isn’t encumbered by the physical limitations of a traditional campus – nor, it should be said, by things like accreditation, student outcomes or a business model.  At least not yet. 
So compelling are the opportunities to launch MOOC-based ‘institutions’ that high-profile faculty have quit their boring professorships and started new companies delivering MOOCs.  Even big-name traditional schools have banded together (like a Big East or PAC10 for MOOCs).  You will have heard of these companies, with names like Coursera, EdX and Udacity.  Has anyone signed up for a course? I think we all should.  On the content side, the textbook still reigns; however, faculty are seeking more choice and power over the course materials they assign their students and, increasingly, are looking for custom solutions from their primary textbook publisher.  Permissions revenues – for individual chapters and journal articles – are growing faster than overall textbook revenues, signaling that faculty are making more specific content choices for assignments.  Custom textbook publishing is also growing faster than the overall education market as the largest publishers have upped their game by being able to provide tailored products to their customers.  Additionally, new technology-based companies are emerging, such as Ginkgotree, Symtext and Courseload, which offer faculty-driven solutions for the creation and delivery of customized learning materials that support text, video and audio formats delivered to the student in print or digital versions.

Slide 21: Assessment and adaptive learning tools also garner significant attention, but mostly in K-12.  That’s not an area where I spend a lot of my time.  But while K-12 hogs the limelight at the moment, it is my belief that assessment in higher ed will eventually be bigger than anything we will see in K-12, because only through assessment and adaptive learning will we be able to bridge the gap between higher education and industry.  Assessment in higher ed will be used to evaluate and test a student’s mastery of what they learned in college as a basic criterion for the career they want to start.  As students navigate through college, they and their faculty will be able to monitor performance and remediate where needed.  The basis of their ‘assessments’ will be more closely tied to their career objectives.  Adaptive learning tools will also enable students to see how their approach and behavior impact their ability to learn.  In the old world, students have to wait to be graded, but it is conceivable that these new tools will lead students to take more responsibility for their own education, empowering them to ensure success.  There should be little surprise that assessment will be used for career advancement in more fundamental ways and to support education programs for people already advanced in their careers.  This is what I referred to when I speculated about the change in ‘who’ we will think of as a “student”.  To this end, we are beginning to see deeper collaboration between education and business to correct a very particular problem - that students are not being taught the right stuff.  There are already many examples of community colleges collaborating with local businesses to produce workers for them, and new companies, like UniversityNow, are developing cost-effective degree programs correlated to industry and business requirements.  There will be many more.

Slide 22: The rapid rise of the MOOC suggests big opportunities when education can be ‘freed up’ from the constraints of the traditional model.  In simple terms, what MOOCs address is the disparity between supply and demand.  Stanford can only accept so many students; but on the Web, all bets are off.  To give you perspective, some of the early classes registered over 150,000 students.  In one Stanford MOOC, none of the top ten students who completed the class were full-time “Stanford” students.  Not only could Stanford not have reached this audience before, but when it did, some of those students performed better than the ‘real’ Stanford students.  The reason many elite schools jumped so quickly on the MOOC bandwagon and formed the companies I mentioned earlier is that they know they must be positioned to leverage their ‘brands’ on a global scale.  They don’t want to be locked out of markets serving China, the Middle East and India, which represent vast new student populations they can suddenly reach effectively.  It is very early days yet for the MOOC movement and there are some particular issues that need to be addressed, including the revenue model, accreditation, certification/degree granting, cheating and security.  But since this movement, as we know it now, is less than 24 months old, some latitude is due in addressing what don’t appear to be insurmountable problems.

Slide 23: To summarize, here’s a quote from Nathan Harden in The American Interest magazine from last month:
“In fifty years, if not much sooner, half of the roughly 4,500 colleges and universities now operating in the United States will have ceased to exist.  The technology driving this change is already at work, and nothing can stop it.  The future looks like this: Access to college-level education will be free for everyone; the residential college campus will become largely obsolete; tens of thousands of professors will lose their jobs; the bachelor’s degree will become increasingly irrelevant; and ten years from now Harvard will enroll ten million students.
That’s because recent history shows us that the Internet is a great destroyer of any traditional business that relies on the sale of information.  The Internet destroyed the livelihoods of traditional stock brokers and bonds salesmen by giving everyone access to the proprietary information they used to sell.  The same technology enabled bankers and financiers to develop new products and methods, but, as it turned out, the experience necessary to manage it all did not keep up.”
While his prediction is truly dire for educators (except Harvard), I actually believe the change will take place faster than Harden’s 50 years.  There is just too much pressure from businesses that can’t get qualified candidates, the student debt issue, unaccountable administrators, public funding problems and the increasing amount of high-quality learning material that can be accessed for free.  There are also strong challenges to the idea that education has to be personal.  Real-life experience from Stanford shows that technology-enabled learning can be as effective as in-class delivery.  Additional research conducted at Carnegie Mellon – also noted in Harden’s article – found that when machine-guided learning is combined with traditional classroom instruction, students can learn material in half the time.  It was the technology, not the in-class component, that drove that efficiency. 

Slide 24: As the younger generation – having grown up with social networking – becomes students, the effectiveness of technology-driven tools and machine-based social interaction will only improve, pushing education even harder.  Let me give you an example: in 2011, I heard about what Indiana University was doing with educational content.  The school realized that it could both influence the cost of content assigned to students and exert some control over what content is assigned on campus. 

Slide 25: To do so, Indiana decided to work with publishers directly and licensed a ‘platform’ from a startup company named Courseload.  The Courseload platform is built so that content can be added to it and then accessed by students as needed for classes.  Indiana negotiates directly with the education publishers to make all their content available on the platform and pays the publishers based on headcount.  In this model, every student gets access to all the materials assigned by their professor, which also means no returns, no used books and 100% sell-through.  The publisher ‘pays’, if you will, with a bigger discount.  This is considered an experiment at Indiana and, as they continue to tweak the model, other schools are taking notice.  You may wonder who the ‘customer’ is on campus for these initiatives, and it varies widely from campus to campus.  At Indiana, this initiative came out of the Chief Technologist’s Office. 
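
To make that trade-off concrete, here is a hypothetical sketch of the two revenue models – every number below is invented purely for illustration and is not drawn from Indiana, Courseload or any publisher:

```python
# Hypothetical comparison of publisher revenue for a 1,000-student course.
# All figures are invented illustrations, not Indiana or Courseload data.
students = 1000
list_price = 100.00        # hypothetical new-textbook list price

# Traditional retail model: only some students buy new copies at a
# channel discount; the rest buy used, rent, borrow or go without.
retail_discount = 0.25
new_purchase_rate = 0.40
retail_revenue = students * new_purchase_rate * list_price * (1 - retail_discount)

# Per-head institutional model: every enrolled student is covered,
# in exchange for a much deeper discount to the institution.
per_head_discount = 0.60
per_head_revenue = students * list_price * (1 - per_head_discount)

print(f"Retail model revenue:   ${retail_revenue:,.0f}")    # $30,000
print(f"Per-head model revenue: ${per_head_revenue:,.0f}")  # $40,000
```

Under these made-up assumptions the deeper discount still leaves the publisher ahead, which is the sense in which 100% sell-through can offset the bigger discount.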

Slide 26: As this model evolves, academic librarians, college bookstores and universities will be offered an extensive database of educational material from which faculty can choose the material – possibly pre-selected, topic-driven packages – that is best suited to their classes.  Faculty will need some help here, and who will deliver this help is an open question, but it could be the TA, librarian, bookstore or publisher.  I believe these existing ‘experts’ on campus will position themselves to add new services and capability in support of their faculty using the platform solutions provided by their vendors.  Platform providers such as Amazon, Blackboard, Pearson and EBSCO may soon be the only efficient way for publishers to reach students.  The winners will be in a unique position to provide the audience for publishers unable to compete in the platform stakes.  Platform providers will negotiate distribution agreements with other content providers and will compete against each other to offer the best combination of content.  But a more likely and important point of differentiation will be the unique services and level of integration they can provide faculty, administrators and students.  Perhaps, instead of Pearson and EBSCO, we should think of Reuters and Bloomberg as directionally indicative of what will happen. 

Slide 27: In the context of Indiana, it becomes easy to see how the Courseload platform can become a product catalog, library, archive and publishing platform.  It may support tools for collaboration, assessment and remediation and, via API, a front door for ‘value-added’ partners supplying other products and services of value to users.  It’s not there yet, but it’s entirely possible.  As the Courseload experiment suggests, we will see changes in revenue models.  For example, instead of profit models based on revenue per book, think “per head” or “per desk”.  In addition, an all-in revenue model may also put paid to the argument for DRM protection in education.  A very positive byproduct of this change in content provision will be a complete integration of library resources, institutional resources and the adoption of the consortia buying/negotiation model that, together, will create more effective options for students and administrators.  It seems odd (to me) that content sources, as they are currently supplied to students and faculty on campuses, often stand independently of each other and can only be ‘integrated’ through a manual, rudimentary process (by which I mean a copier and a stapler).  And it’s even odder when you consider that libraries have long been licensing tools and services from EBSCO and Serials Solutions which provide deep integration of, and access to, the databases and content the academic library licenses.  It will only be a matter of time before pan-university content assets – library-licensed content, faculty- and university-produced materials, and archived and professionally published content, etc. – are brought together.  I expect the platform model will facilitate this change.

Slide 28: Opportunities for innovators will continue to emerge, as one would expect in a rapidly changing market.  I’ve mentioned only a few of the new companies that come up every day.  I do believe, however, that many of the niche or narrow solutions currently on offer – whether they be assessment, content-delivery or search tools – will ‘run out of market-space’ as these solutions become embedded in, subsumed by and/or offered as an attribute of the platform solution.  I see opportunity in the delivery of solutions that help specific users – say, university faculty – take full advantage of the integration of content and services that will occur on campus, since many user groups will need to change the way they conduct their usual activities.  The outcome of these work changes will be to generate more productivity and better solutions, but getting there will require ‘intelligent agents’ to facilitate – to help assemble content, training programs, workflow and productivity tools and similar applications to rewire their work environments.  These intelligent agents may be human (as noted earlier) but they could also be virtual.  In education, platforms like Blackboard and Desire2Learn may have an advantage given their current installed base on campus; even though they don’t have deep content integration, they have ‘automated’ many workflows on campus.  Let me conclude with the following: at the beginning, I mentioned how companies like Craigslist had unlocked value.  So, an obvious question may be where do I see this occurring in education? There are probably many opportunities for this in an environment where our education ‘business’ is so broken.
I mentioned earlier that MOOCs don’t currently have a business model, but in the fast-moving world of innovation this isn’t necessarily true.  Some MOOCs are working with businesses and offering a paid service to match students with job openings.  The student ‘opts in’ to the program to make their class performance available to recruiters, who then pay the MOOC for this access.  The win here is to improve the efficiency of finding qualified staff members for the business and to reduce the on-the-job training burden businesses currently face with under-educated recruits.  As employees recruited in this manner succeed, the legitimacy of the MOOC as a filter for finding qualified workers increases.  There is a huge opportunity in bridging this ‘gap’ between education and business, and we’ll see new companies enter this market.  It’s not a sustainable model if you assume education will eventually get its act together to provide better education, but it could be a market opportunity for many years to come.

Slide 29: In conclusion, when I titled my predictions for this year I suggested that it was the “end of the middleman”; this is perhaps a little simplistic but, with some latitude, we are seeing a compacting of the value chain, and many more options now exist for content owners to reach end users without the ‘benefits’ or ‘encumbrances’ of intermediaries.  Additionally, producers are able to add more around the content to bring value or understanding to the base product.  Examples here would be photo collections, data sets and managed communities, all of which would have been impossible in the “old world” and, even in the digital world, are difficult to manage if there are middlemen to accommodate.  My self-flagellation over my simplification has to do with a tendency in all of us to underestimate the ability of things to adapt and change fast.  Examples in professional publishing indicate that what we begin to see in platforms is not so much a repository of ‘ready-made’ solutions like books, journal articles, collections and the like, but more a biosphere akin to an operating system supporting the end-user in everything from content creation and hosting to user and community engagement and, in the case of education, life-long learning.  There is an exciting future to come in educational publishing and we are only just on the cusp of it.

Slide 30: Thank you and I would be happy to answer any questions.
 

Thursday, February 28, 2013

Pearson Reports Financial Results

From their press release:
Pearson accelerates global education strategy:
Restructuring and investment in digital, services and emerging markets for faster growth, larger market opportunity and greater impact on learning outcomes 
Financial highlights*
  • Sales up 5% at CER to £6.1bn (with digital and services businesses contributing 50% of sales)
  • Adjusted operating profit 1% higher at £936m
  • Adjusted EPS of 84.2p (86.5p in 2011)
  • Operating cash flow of £788m (£983m in 2011)
  • Return on invested capital of 9.1% (9.1% in 2011)
  • Dividend raised 7% to 45.0p.
Market conditions and industry change  
Market conditions generally weak in developed world and for print publishing businesses; generally strong in emerging economies and for digital and services businesses.  Continuing structural change in education funding, retail channels, consumer behaviour and content business models.  Considerable growth opportunity in education driven by rapidly-growing global middle class, adoption of learning technologies, the connection between education and career prospects and increasing consumer spend, especially in emerging economies.  
Strong competitive performance
  • North American Education revenues up 2% in a year when US School and Higher Education publishing revenues declined by 10% for the industry as a whole.
  • International Education revenues up 13% with emerging market revenues up 25%.
  • FT Group revenues up 4% with the Financial Times’ total paid print and online circulation up to 602,000; digital subscriptions exceed print circulation for the first time.
  • Penguin revenues up 1%, with strong publishing performance and eBooks now 17% of sales.
  • Accelerated shift to digital & services and to fast-growing economies
  • Pearson announces gross restructuring costs of approximately £150m in 2013 (£100m net of cost savings achieved in the year), focused on:
1. significantly accelerating the shift of Pearson’s education businesses towards fast-growing economies and digital and services businesses;
2. separating Penguin activities from Pearson central services and operations in preparation for the merger of Penguin and Random House.

Restructuring expected to generate annual cost savings of approximately £100m in 2014.
In 2014, £100m of cost savings to be reinvested in organic development of fast-growing education markets and categories and further restructuring, including the Penguin Random House integration. 
From 2015, restructuring programme expected to produce faster growth, improving margins and stronger cash generation. 
Outlook
Pearson expects tough trading conditions and structural industry change to continue in 2013.
Excluding restructuring costs and including Penguin for the full year, Pearson expects to achieve 2013 operating profit and adjusted EPS broadly level with 2012.

Investor presentation slides (pdf)

Also,

Pearson's Penguin Must Participate in E-Book Fixing Trial (Businessweek)
Pearson Launches EdTech Incubator for Startups (Mashable)
Pearson CEO says Financial Times is not for sale (FT)
Pearson Plans Shake-Up (WSJ)
EU to decide on Bertelsmann, Pearson publisher deal by April 5 (4Traders)

Friday, February 15, 2013

FASTR and Slower?: Proposed Open Access Bill

Yesterday the Fair Access to Science and Technology Research Act (FASTR) bill was introduced in Congress by U.S. Representatives Zoe Lofgren (D-CA), Mike Doyle (D-PA), and Kevin Yoder (R-KS); the sponsors say the bill is designed to increase the openness, transparency, and accessibility of publicly funded research results. The bill would require public access to the results of all federally funded research to be provided if the federal agency has a research budget of more than $100 million. From Rep. Lofgren's press release:

Specifically, the Fair Access to Science and Technology Research Act (Text:pdf) would:
  • Require federal departments and agencies with an annual extramural research budget of $100 million or more, whether funded totally or partially by a government department or agency, to submit an electronic copy of the final manuscript that has been accepted for publication in a peer-reviewed journal.
  • Ensure that the manuscript is preserved in a stable digital repository maintained by that agency or in another suitable repository that permits free public access, interoperability, and long-term preservation.
  • Require that each taxpayer-funded manuscript be made available to the public online and without cost, no later than six months after the article has been published in a peer-reviewed journal.
  • Require agencies to examine whether introducing open licensing options for research papers they make publicly available as a result of the public access policy would promote productive reuse and computational analysis of those research papers.
An identical Senate counterpart of this legislation is also being introduced today by Senators John Cornyn (R-TX) and Ron Wyden (D-OR).

The federal government spends over $37 billion on federally funded research, with most of this money spent by the Department of Defense, the Department of Energy, the Department of Health and Human Services, the National Aeronautics and Space Administration, the National Science Foundation and the U.S. Department of Agriculture. From the announcement:
"FASTR represents a giant step forward in making sure that the crucial information contained in these articles can be freely accessed and fully used by all members of the public," said Heather Joseph, Executive Director of the Scholarly Publishing and Academic Resources Coalition (SPARC). "It has the potential to truly revolutionize the scientific research process."

This legislation would unlock unclassified research funded by agencies like the Department of Agriculture, the Department of Commerce, the Department of Defense, the Department of Education, the Department of Energy, the Department of Health and Human Services, the Department of Homeland Security, the Department of Transportation, the Environmental Protection Agency, the National Aeronautics and Space Administration, the National Endowment for the Humanities, and the National Science Foundation.

The bill builds on the success of the first U.S. mandate for public access to the published results of publicly funded research, at the National Institutes of Health (NIH). The NIH implemented its public access policy in 2008, and it is estimated that approximately 80,000 papers are published each year from NIH funds.
This is the fourth go-around for an open access bill, but this one may have a better chance of getting to an eventual vote given the changing views on open access and, therefore, greater acceptance by members of Congress that this is something worth pursuing.

Tuesday, December 04, 2012

Off The Cuff: What are the important issues facing publishing?

At a soiree the other day someone (not in the industry) asked me the above question, so with martini in hand I threw off the following:


Firstly, the transformation from print to digital (obvious), but within that transformation the impact on every aspect of how a business is run: from author relationships to product delivery.  It is this latter piece that most executives and managers don't immediately understand.  Looking back on some of the transformations I have gone through, I am often amazed that we (as a management team) didn’t see some of the problems we faced, but thankfully we became attuned, very quickly, to the different signals that present themselves in a digital publishing environment versus the print world.  As people suggest, it is like running two companies at once, but I’ve found it is more than that, because in the new world you have no frame of reference and you must form one very quickly.

Second, the 'unit' of sale is beginning to change.  For example, we see this in increased permissions revenues, where users are proactively looking for (just) an article or chapter or business case.  This trend will manifest itself most immediately in the education market, where content is becoming disaggregated and faculty (and administrators) exert more control over content choice.  At the opposite end of the value chain, in content creation, the 'unit' may not be a book (as in the old world) but could be a set of services providing deeper engagement with the content, or a set of public appearances and direct connections with the author.  In truth, it’s likely to be both types and many other similar variations and changes to the ‘unit’.   Closely related to this paradigm change is the issue of discoverability, which often manifests itself in the depth and relevance of metadata.  Increasingly, metadata will define success for content owners (even more than it does now) because the best, most complete and comprehensive metadata will drive revenues.  As content becomes more flexible (XML workflow) in composition and delivery, the metadata that describes this content will determine success or failure, since content that can’t be discovered by the user when they need it won’t generate revenue.

Third, customers are becoming more amorphous; publishers will still work with a buyer who buys a category for an entire chain, but they are increasingly working directly with 'the wo/man on the street' who not only wants a direct relationship with the author and/or the content but also wants the content on multiple devices, in different contexts and possibly with different applications built in, depending on what their objectives are.

Fourth, there is also the challenge of content pricing and, in particular, journal pricing.  This is a real issue, but oddly less so for Big Dutch Publisher (BDP), because a very large publisher will have the resources to provide value-add to replace/offset the revenue that may be lost as more content is provided via free resources.   What may worry BDP is whether a community or marketplace could evolve around some of these free access points (PubMed, for example) that, via collective effort, is somehow able to support/provide a similar level of value-added service to what BDP does, but also make those additions as free as the content.  That might be hard to imagine, but not impossible.

Not bad for off the cuff and, all in all, a very exciting time to be in publishing.

Wednesday, November 16, 2011

Economist Profiles Springer's Digitization Efforts

The Economist takes a look at how Springer has approached the digitization of its entire backlist/archive of books. Springer already provides electronic access to 50,000 titles published since 2005, but now it is looking at the remaining archive of 65,000 titles. Springer has been at the forefront of book digitization efforts, and some may remember that in the early days of the Google Scholar effort they were frequently among the most active participants in panel discussions on the subject. (Economist):
Scanning Springer's backlist proved no mean feat. First, the company had to figure out for which works Springer holds copyright, surveying records at all the firms swept up in recent years, says Thijs Willems, who heads the book-archiving project. To create a definitive list his group scoured old catalogs and national libraries. They eventually assembled an archive of 100,000 print books in English, Dutch and German, many of which were different editions of the same work. The firm arranged access from libraries to those that Springer had lost due to the vagaries of time, war, etc. It decided to scan only the last available edition of a given work; earlier editions might be added to the trove in the future.
and they end with this,
Springer has painstakingly produced the highest possible quality of scans, principally to avoid having to start from scratch when today's viewing technology is superseded by something dramatically better. Mr Willems and his team also embedded rich metadata—details like author, date of publication, number of pages, and so on—in standard formats which are likely to persist for a while. They took especial care in reproducing illustrations. These digital books are, after all, meant to last for ever.