Sunday, June 27, 2010

Global Warming of Linked Data in Libraries

Libraries are unusual social institutions in many respects; perhaps the most bizarre is their reverence for metadata and its evangelism. What other institution considers the production, protection and promulgation of metadata to be part of its public purpose?

The W3C's Linked Data activity shares this unusual mission. For the past decade, W3C has been developing a technology stack and methodology designed to support the publication and reuse of metadata; adoption of these technologies has been slow and steady, but the impact of this work has fallen short of its stated ambitions.

I've been at the American Library Association's Annual Meeting this weekend. Given the common purpose of libraries and Linked Data, you would think that Linked Data would be a hot topic of discussion. The weather here has been much hotter than Linked Data, which I would describe as "globally warming". I've attended two sessions covering Linked Data, each attended by between 50 and 100 delegates. These followed a day-long, sold-out preconference. John Phipps, one of the leaders in the effort to make library metadata compatible with the semantic web, remarked to me that these meetings would not have been possible even a year ago. Still, this attendance reflects only a tiny fraction of the metadata workers at the conference; Linked Data still has quite a way to go. It was only a few months ago that the W3C formed a Library Linked Data Incubator Group.

On Friday morning, there was an "un-conference" organized by Corey Harper from NYU and Karen Coyle, a well-known consultant. I participated in a subgroup looking at use cases for library Linked Data. It took a while for us to get around to use cases, though, as participants reported that usage was occurring but weren't sure what it was for. Reports from OCLC (VIAF) and the Library of Congress (id.loc.gov) both indicated significant usage but little feedback. The VIVO project was described as one with a solid use case (giving faculty members a public web presence), but no one from VIVO was in attendance.

On Sunday morning, at a meeting of the Association for Library Collections and Technical Services (ALCTS), Rebecca Guenther of the Library of Congress discussed id.loc.gov, a service that enables both humans and machines to programmatically access authority data at the Library of Congress. Perhaps the most significant thing about id.loc.gov is not what it does but who is doing it. The Library of Congress provides leadership for the world of library cataloguing; what LC does is often slavishly imitated in libraries throughout the US and the rest of the world. id.loc.gov started out as a research project but is now officially supported.

Sara Russell-Gonzalez of the University of Florida then presented the VIVO project, which has won a big chunk of funding from the National Center for Research Resources, a branch of NIH. The goal of VIVO is to build an "interdisciplinary national network enabling collaboration and discovery between scientists across all disciplines." VIVO started at Cornell and has garnered strong institutional support there, as evidenced by an impressive web site. If VIVO is able to gain similar support nationally and internationally, it could become an important component of an international research infrastructure. This is a big "if". I asked if VIVO had figured out how to handle cases where researchers change institutional affiliations; the answer was "No". My question was intentionally difficult; Ian Davis has written cogently about the difficulties RDF has in treating time-dependent relationships. It turns out that there are political issues as well. Cornell has had to deal with a case where an academic department wanted to expunge affiliation data for a researcher who left under cloudy circumstances.

At the un-conference, I urged my breakout group to consider linked data as a way to expose library resources outside of the library world as well as a model for use inside libraries. It's striking to me that libraries seem so focused on efforts such as RDA, which aim to move library data models into Semantic Web compatible formats. What they aren't doing is making library data easily available in models understandable outside the library.

The two most significant applications of Linked Data technologies so far are Google's Rich Snippets and Facebook's Open Graph Protocol (whose user interface, the "Like" button, is perhaps the semantic web's most elegant and intuitive). Why aren't libraries paying more attention to making their OPAC results compatible with these applications by embedding RDFa annotations in their web-facing systems? It seems to me that the entire point of metadata in libraries is to make collections accessible. How better to do this than to weave this metadata into people's lives via Facebook and Google? Doing this will require the dumbing-down of library metadata and some hard swallowing, but it's access, not metadata quality, that's core to the reason that libraries exist.
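To make this concrete, here is roughly what Open Graph Protocol markup on an OPAC record page could look like. This is a hypothetical sketch: the catalog URL, record number, and cover image are invented for illustration.

```html
<!-- Hypothetical OPAC record page exposing Open Graph metadata.
     The example.edu URLs and record id are made up for illustration. -->
<html xmlns:og="http://ogp.me/ns#">
  <head>
    <title>Moby-Dick | Example Library Catalog</title>
    <meta property="og:title" content="Moby-Dick" />
    <meta property="og:type" content="book" />
    <meta property="og:url" content="http://catalog.example.edu/record/12345" />
    <meta property="og:image" content="http://catalog.example.edu/covers/12345.jpg" />
  </head>
  <body>
    <!-- the usual OPAC record display goes here -->
  </body>
</html>
```

With a few meta tags like these, a "Like" click carries a typed book object, not just a bare URL, into a patron's social graph.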


Friday, June 25, 2010

Introducing the Totebag for eBooks

As we hurtle towards a future where books come on Kindles and iPads and Nooks, we tend to overlook the loss of many products and services attached to the print book ecosystem. Tens of thousands of people whose livelihoods depend on books will suffer tragic dislocations in their lives. While many bemoan the plight of bookstore workers, librarians, editors, and authors, there are other small industry segments no one ever thinks of.

Totebag manufacturing is just one of these overlooked industries. Half of the world's novelty totebags for books are manufactured in a single town in China called Shu Bao (书包). Shu Bao is located in an inland area of China that has concentrated on book-related products; neighboring towns specialize in bookmarks, dust covers and those little alphabet labels used in dictionary manufacture. At this weekend's American Library Association (ALA) meeting in Washington DC, I had a chance to speak with Shu Bao's mayor, Yi Rui-Da, who doubles as a sort of totebag ambassador and salesman to the world. Yi was in town to start getting the word out about digital book totebags.

Yi told me that the central committee of his town has been closely watching the shift to eReading for at least 10 years. They've seen one of the neighboring towns become quite wealthy by shifting their manufacturing to iPad covers, and hope to make a similar transition themselves. The lesson of what happened to buggy-whip manufacturers after the introduction of the Model T is known to the committee. Some committee members thought the town was in the luggage business, and preferred to stay in the luggage business. Other committee members, aware of the specialized fibers that must be added to their totebag fabrics, argued that the town was really in the information portability business; these voices prevailed.

To make the transition to transporting eBooks, the town had to nurture its programming talent, of which it has an abundance. Totebags are made in factories that employ hundreds of teenage girls. But it's not like the old days, when the girls were virtual slaves, sewing everything by hand. In a modern totebag factory, the girls program automated sewing robots using specialized smartphone apps. Over the past 5 years, the top sewing machine programmers have gone on to advanced operating system hacking; before, they would get bored with programming and get married.

The culmination of this program of training and development is the digital book totebag. I got a demo of this widget in a private suite at one of the conference hotels, but was not permitted to photograph it. The prototype looks nothing like a canvas totebag, of course; it's more a mess of wires and connectors. The functionality is quite impressive, however. I was easily able to download an eBook from a Kindle to the "totebag" using a red suction-cup connector that came with some sort of special grease. I then attached an iPad using a USB connector and viewed the book in iBooks. I was also able to connect the totebag to my Google Books account and use the Kindle book there. Yi had a number of other devices to try; each of them had its own quirks, but more or less worked.

I asked Yi how this seeming magic had been accomplished; the most I could get out of him was that any book is "just another sewing pattern". I also asked him if standards for content and DRM would make ebook portability possible without his digital totebag widget. We had a good long laugh at that one.

Thursday, June 24, 2010

Inter-Library Loan Reinvented for eBooks and Just-In-Time

My graduate school training was in engineering and in physics. In engineering, you put things together and try to get them to work. In physics, you smash things (the polite term is "perturbation") to help you understand how they had been working. I still use these approaches to help me understand the things I write about. You can learn a lot about a system by noting the bits that squawk when faced with a perturbation of the system.

I got a lot of interesting feedback on my article on patron-driven ebook acquisition. It seems that this perturbation in library processes could have wide-ranging effects far outside of libraries and book publishing. Coincidentally, the patron-driven model, along with other changes in the library/publisher ecosystem, was discussed last weekend at a meeting of the Association of American University Presses (AAUP). Publishers Weekly has a nice report. (See also a report in the Chronicle of Higher Education.)

The biggest perturbation being imposed on this system is of course the reduction of library budgets, which has come down quite painfully on university presses and their monograph businesses. Still, speaker Joe Esposito was surprised that the strongest reaction to his talk was to his prediction that libraries would make up a shrinking fraction of the university presses' sales.

It seems that there is worry that a contraction or restructuring of monograph publishing could have repercussions for how scholars obtain tenure in the humanities:
The fact that monograph publishing exists to support tenure and the structure of academic employment is an inconvenient truth that can no longer be glossed by either the Academy or its associated University Presses. At some point the Academy is either going to have to stop expecting University Presses to fulfill this need, or find a more honest and transparent way of funding it.
If that's the worst thing that happens, well, what's the big deal?

It won't be a shift to patron-driven acquisition that kills off monograph publishing, however. My reasoning is that from the point of view of economics, patron driven acquisition is roughly isomorphic with the current system of just-in-case purchasing coupled with inter-library loan (ILL).

Here's how things work for print monographs. Suppose a university press published an obscure but brilliant scholarly monograph five years ago. It might have sold 100 copies for $100 apiece, most of them to libraries. At $10,000 gross revenue, it was hard for the press to make much profit, but occasionally they get lucky and make enough to cover the losses on the rest of their catalog. Now here's the problem: over the five years, there were only about 100 scholars in the entire world who really wanted to read the monograph. Unfortunately, only 50 of them worked at institutions that purchased the book. The libraries of the other 50 didn't purchase the book because their selectors weren't omniscient mind-readers. Or maybe those libraries used an approval plan that hadn't been crafted with the obscure field of this monograph in mind.

But those 50 others still got to read the monograph, because of inter-library loan. For some libraries, ILL is even a revenue center, because their costs to lend are less than the fees they charge. Although publishers made money from the 50 libraries that bought the book and didn't use it, they don't capture any of the revenue from ILL activity. The libraries that spent money to buy the book right away are partially compensated for that expenditure by ILL revenue or reciprocal loans.

Now let's think about what happens in a future where just-in-time ebook acquisition dominates. The 100 users still get to use the monograph, but none of them need to wait for an ILL transaction to go through. The costs are assigned to the institutions that actually use the work. If we assume that the price of the monograph is unchanged, the publisher's revenue is also unchanged, except that it's pushed out to the time of usage, which can be many years later, especially in the humanities. The time value of this revenue stream is reduced: it takes longer to make back the money spent on producing the book.

The compensation for the publisher is that the revenue continues for as long as the work is still used. The book doesn't go out of print. In addition, since users can discover the monograph more widely, and obtain it immediately, there is the possibility of making additional sales to users who would never have requested the title via ILL.
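The time-value point above can be made concrete with a back-of-the-envelope sketch. The figures here (the post's $10,000 of monograph revenue, a 5% discount rate, and an even five-year spread of usage) are illustrative assumptions, not publishing data.

```javascript
// Compare $10,000 of revenue received at publication (just-in-case
// purchasing) with the same $10,000 arriving as usage fees spread
// evenly over five years (just-in-time). The 5% discount rate and
// even spread are illustrative assumptions.
function presentValue(cashFlows, rate) {
  // cashFlows[t] is revenue received t years from now
  return cashFlows.reduce((pv, cf, t) => pv + cf / Math.pow(1 + rate, t), 0);
}

const rate = 0.05;
const justInCase = presentValue([10000], rate);                        // all sales at publication
const justInTime = presentValue([2000, 2000, 2000, 2000, 2000], rate); // sales follow usage
// justInCase is 10000; justInTime works out to about 9092
```

Under these assumptions the deferred revenue stream is worth roughly 9% less to the publisher, which is the "time value" penalty the paragraph above describes.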

In a sense, the patron-driven acquisition model is souped-up ILL, with usage fees accruing to the publisher. The comments of Macmillan's John Sargent earlier this year that publishers would like to see fees for library ebook lending don't seem so controversial when examined under this lens.

It's worth thinking through a publisher's pricing strategy. If libraries persist in their preference to remove price as a factor in the patron's decision to use an ebook, then publishers have no incentive to cut costs and keep prices moderate. If libraries allow automatic purchase of any ebook under $100, then publishers will price all of their products at $99. A similar dynamic in the US health care industry has not worked well for consumers, to say the least. Indeed, one university press publisher writing about patron-driven acquisition and the AAUP meeting has opined that patron selection will lead to higher monograph prices:
What this Patron Driven Access model means to university presses is that our future is likely to include two things—higher prices and fewer titles.

It's clear that there would be winners and losers under a just-in-time acquisition system. Librarians don't always select what their patrons really want to read. Controversial works might do quite well, as should engaging but hard-to-categorize works and works that don't break new ground but are readable and useful. Dry, unreadable, redundant works that sell well today because of the author's fame or because they fit into a "hot" field of research will be losers. A work that is unread today because it's too innovative and ahead of its time will eventually find its moment under just-in-time acquisition.

The huge change for monograph publishers will be in the way they market their products. The emphasis will shift from pre-publication marketing to libraries towards search engine optimization and post-publication marketing directly to users. Famous professors may find themselves awash in free ebooks as monograph publishers jockey for key citations and mentions; social networks and subject specific communities will be prime targets of monograph promotion. Publishers will abandon library convention exhibits like ALA in droves; parties and receptions for librarians will disappear.

I suppose we should have fun with the current system while it lasts, even as there are new and more efficient things to build.

Monday, June 21, 2010

Patron Driven eBook Acquisition: Crab Legs vs. Spinach

The showpiece of many Chinese Buffet restaurants is the crab legs. What a bargain! One fixed price for all the crab legs you can eat. How do they do it?

The secret to an effective yet profitable Chinese buffet is the presentation of foods with a moderate CBFOM (Chinese Buffet Figure Of Merit), which is computed using the following formula:

CBFOM = (P × SF) / W

where P = Price, W = Weight, and SF = Shovel Factor, a measure of how fast a food can be consumed, popularized in the early 80's by noted comic epicurean Garfield the Cat.
Crab legs, though expensive, are difficult to eat quickly; soda (always offered before you attack the buffet) is easy to guzzle and very cheap.

Libraries are slowly getting used to the idea of providing all-you-can-eat services. It used to be that consumption of books in a library was sharply limited by supply. There was never any danger that patrons would bankrupt the library by reading too many literary crab legs, because once the tasty dish had been checked out by the first patron, other patrons had to settle for chicken nuggets and broccoli with garlic sauce. The introduction of ebooks is changing everything. What used to be an obvious limit resulting from a book's physicality has become a completely arbitrary limit imposed by impenetrable licensing agreements. It's as if your Chinese Buffet restaurant installed one of those food replicators from Star Trek and the instructions came in Klingon.

Libraries dealing with ebooks have to reconcile their mission of providing access with their limited and declining budgets. One model for doing this is known as "Patron-Driven" (PDA) or "Demand-Driven" (DDA) Acquisition. In this model, the library offers access to a huge menu of content, but only pays for material actually used by patrons. Since 50% of print material acquired by academic libraries never gets used, this results in a 50% cost savings (or 100% increase in bang for the buck, assuming you have bucks).

At the University of Texas Libraries, Dennis Dillon, Associate Director for Research Services, is expecting continuing budget cuts through 2014. "I only want to spend money on books that have a fighting chance of being used" he told me recently. At UT Austin alone Dillon has to address the needs of 20,000 staff and 50,000 students with a book budget of a bit over one and a half million dollars, about $30 per student. $700,000 is spent by selectors, with per-department allocations designed to win faculty "hearts and minds". The rest is split between traditional print approval plans and ebooks.

UT's ebook budget is applied to a demand-driven plan offered by Ebook Library (EBL). About 100,000 ebooks are offered at UT through EBL. Patrons can search and view any of the ebooks for 5 minutes without incurring any charge to the library. After five minutes, a window pops up, asking the patron if they wish to continue using the ebook. If the patron continues, the library is charged for a use of the ebook, but the patron never knows about the charge. The patron can continue to use the book for 10 days without the library incurring an additional use fee. On the fourth use of an ebook title in the library, an automatic "purchase" is made, and the ebook is added to the library's permanent collection.

Once purchased, an EBL ebook can be "used" up to 365 times per year. This has never happened at UT. You might think that a book assigned in a popular class might reach this threshold, but UT's experience is that even if a book is assigned as required reading in a 300-student class, the library will be lucky to get even 50 uses of a book. Their most popular book has gotten about a thousand uses.
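The trigger logic described above (a free browsing period, a per-use loan fee, and an automatic purchase on the fourth use) can be sketched in a few lines. The dollar amounts and threshold below are placeholder assumptions, not EBL's actual terms.

```javascript
// Illustrative sketch of demand-driven acquisition trigger logic:
// a short-term loan fee for each use, converting to an automatic
// purchase on the fourth use. Amounts are placeholder assumptions.
const LOAN_FEE = 10;           // charged per use before purchase
const PURCHASE_PRICE = 100;    // charged once, on the triggering use
const PURCHASE_THRESHOLD = 4;  // fourth use triggers automatic purchase

function makeTitle() {
  return { uses: 0, purchased: false, charges: [] };
}

function recordUse(title) {
  if (title.purchased) return;  // owned: further use costs the library nothing
  title.uses += 1;
  if (title.uses >= PURCHASE_THRESHOLD) {
    title.purchased = true;           // added to the permanent collection
    title.charges.push(PURCHASE_PRICE);
  } else {
    title.charges.push(LOAN_FEE);     // invisible to the patron
  }
}
```

Under these placeholder terms, three uses cost the library $30 in loan fees; the fourth converts the title to a $100 purchase, and every use after that is free.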

A major benefit of this model is that all the ebooks on offer serve as a sort of digital book stacks. They can be searched and browsed just as if they were owned by the library. This sort of full text search combined with full text availability is something that even Google Books cannot deliver. (UT is an institutional participant in the Google Books Library Project.) UT manages the list of ebooks to minimize its budget risk. For example, all titles over a certain price are removed from the list. No caviar on the Longhorn ebook buffet!

According to Kari Paulson, President of EBL, the patron-driven model developed out of input from libraries that were advising them on ebook access models. In particular, Alison Sutherland from Curtin University in Western Australia and Jens Vigen from the library at CERN both wanted to be able to put records in their catalogs but buy the books only when a patron requested them. CERN wanted everything available with a pay-per-view option, with a purchase after a set number of accesses. The library at CERN serves some of the top physicists in the world, so CERN had no issues trusting that its patrons knew best what they needed to read.

Patron-driven service models are gaining traction throughout the library market. EBL has used demand-driven acquisition and pay-per-view as its key differentiators and is currently using the DDA model with about 150 libraries around the world. NetLibrary has been doing a form of demand-driven acquisition even longer with its NetLibrary on Demand service. Ingram's MyiLibrary service also has patron-driven acquisition options.

Over the last year, ebrary has been testing patron-driven acquisition in a pilot program. According to Leslie Lees, ebrary VP of Content Development, the ebrary program resulted from a surge of interest from its library customers. While similar to EBL's service, the ebrary version of PDA goes to greater lengths to hide the purchase process from users; there's not even a dialog for the user to confirm that they want to continue their session past the free period. Participants in the pilot program have been quite happy with the results, particularly when they have done a good job of making the content discoverable. The ebrary service is expected to be ready for general use in time for the 2010 fall semester, and will be integrated with management tools from YBP.

I asked Paulson how EBL managed to convince publishers to go along with the untested model. "Publishers were largely nervous about it in the beginning. We took the approach of getting some of the bigger publishers in first so that the others would follow. All of the publishers we have in our catalogue are participating in Demand-Driven Acquisition. Some of the big ones, especially the textbook publishers, have abstained so far."

Lees told me that in addition to responding to customer needs, ebrary is interested in the "short term loan" component of PDA because it could open the door to new service models that appeal to different publishing segments. Textbook publishers need different models from reference publishers, who need different models from monograph publishers, etc.

Interestingly, one company that hasn't dived into the patron-driven acquisition model is public-library market-leader Overdrive; very few public libraries have ventured into all-you-can-eat ebook buffets. Perhaps this reflects their funding situation, or perhaps a reluctance of librarians and publishers to trust public library patrons with ebook acquisition decisions. I see no reason that limited patron-driven programs, such as ones targeting young-adult literature, couldn't succeed and deliver value in public library settings; they should be explored.

In researching this article, I was surprised to realize how profoundly the role of the library is changed by the switch to just-in-time acquisition. eBooks don't go out of print the way print books did, so libraries don't need to plan for future needs in their current acquisitions. Academic libraries weren't in the business of all-you-can-eat buffets; they were in the wine-cellar business, laying down supplies for use 5 and 10 years in the future. No more. Now they're in the business of getting us to eat spinach.

More on patron-driven acquisition:
  • E-books and interlibrary loan: an academic centric model for lending, Jens Vigen and Kari Paulson, 30 Oct 2003. Presented at 8th IFLA Interlending and Document Supply International Conference : Breaking Barriers : reaching users in a digital world, Canberra, Australia, 28 - 31 Oct 2003. Report No. CERN-OPEN-2003-050.
  • "Patron-driven, librarian-approved: a pay-per-view model for e-books", Susan Macicak and Lindsey Schell, Serials: The Journal for the Serials Community 22(3) Supp. 1, S31-S38. doi:10.1629/22S31
  • "Beguiled by Bananas" – A Statistical Analysis of Patron-selected vs. Upfront Acquisition (pdf) Jason Price and John McDonald. Presented at 2009 Charleston Conference
  • MyiLibrary and Patron Driven Acquisition at SIU-Carbondale, Andrea Imre and Jonathan Nabe. Presented at CARLI eBook Symposium in Champaign, IL on March 4, 2009.
  • "Technology Left Behind – Letting the Patron Drive", Cris Ferguson, Against the Grain 22(2) April 2010, p. 83.
  • "Off the Shelf: Patron-Driven Acquisition" Sue Polanka, Booklist 105 (9/10) 2009. Polanka's Off The Shelf blog has been covering business models for ebooks in libraries.
At the ALA Annual Meeting in Washington DC, there will be a panel entitled "Patron-Driven Access for E-Books: Have We Finally Found the Solution?" on Monday, June 28th, 8:00 – 10:00 am, WCC-143B/C. I'll be there. Say hello.

If you have links for other resources, please add them in the comments; I also welcome comments from ebook providers that I haven't had the chance to talk to.

Friday, June 18, 2010

New Buzzwords: Geo-Aware eBooks and Sub-text Annotations

In the magical world of Ursula K. Le Guin's Earthsea Trilogy, true names are very powerful things. Her wizard-protagonist Sparrowhawk is able to gain power over a dragon by virtue of knowing its true name. The "true name" is a recurrent theme in mythology and fantasy literature, not just in Le Guin's work.

To some extent, the enchantment of names and words is a writer's conceit; most authors never get their books published, and fewer still get their work read, so the idea that a writer's craft has magical power is quite seductive.

A similar thing is going on in the W3C semantic web effort; it has promoted a conceit that bestowing a proper (http) identifier (URI) on a thing imbues it with a first-class objectness.

There are times, however, when the act of giving something a name is more than a conceit; it animates something previously ill-defined by allowing the attachment of attributes. It allows people to talk, and allows talk to become buzz. In the past two weeks, I've twice come across novel literary constructs with names that promise new substance: geo-aware ebooks and subtext annotations. Only time will tell if these will end up as flimsy toy widgets or will give rise to new forms of creative expression.

The sub-text annotation is something being explored by blog pioneer Dave Winer. The emergence of the "blog" as a literary form is an interesting example of how a name can emerge alongside the substance of the thing it names. The name "blog" derives from a joke generally credited to Peter Merholz, that the correct pronunciation of weblog (apparently a 1997 coinage of Jorn Barger) was "we blog".

Winer has been blogging since before the blog had its name and was an important early advocate of the form. By adding syndication to the blog's set of standard features in late 1997, Winer helped shape the substance of the weblog out of the website morass.

I met Winer after an event on citizen journalism that he organized last week at NYU; he is very engaging and a true New Yorker, although he has only recently come home from a lengthy California sojourn.

A subtext annotation is some text hidden below a paragraph that can be expanded and collapsed by the reader. This feature comes out of a line of thought about hypertext and reading that includes Rich Ziade's work on the "Readability" bookmarklet. Are hyperlinks good for the reader, or are they distractions? How can annotations be included in electronic text so that they optimally engage the reader while leaving the text with as much impact as possible? Readability strips out inline hyperlinks, replacing them with numbered endnotes. Winer points out that collapsing the annotation into the text makes the rest of the text easier to read.

If there's one thread that has pervaded Winer's work over his entire career, it's the application of hideable, hierarchical text. From the ThinkTank outlining application for the Apple II and PC to the Frontier content management system with its collapsible Usertalk scripting language to weblog syndication itself, it's clear that Winer loves collapsing text. Go over to Scripting News and see a cleaner implementation of sub-text; I struggled a while with getting it to work here, and it looks a bit funny.

Yes, there's a subtext annotation above this paragraph (and below it). Click on a little plus to expand it. I think you'll agree that it's not something so new that it's inexplicable. You've seen collapsing outlines in all sorts of user interfaces before. Blogs often use collapsed text to implement "below the fold". But what happens if the subtext annotation becomes standard on blogs, and gets included in ebook authoring toolkits? Will authors and readers find the subtext annotation more natural, more versatile, and more usable than the endnote annotation or the hyperlink? Will the subtext lead to new forms of narrative or better ways to present information? Doesn't collapsible text seem more natural than a table of contents for ebook navigation?
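For the curious, the mechanics of a subtext annotation need little more than a show/hide toggle wired to the "+" control. This is a bare-bones illustrative sketch, not Winer's actual implementation.

```javascript
// Bare-bones subtext toggle: the annotation element starts hidden
// (display: none) and the "+" control calls toggleSubtext on it.
// Illustrative sketch only, not Winer's actual code.
function toggleSubtext(el) {
  const nowExpanded = el.style.display === "none";
  el.style.display = nowExpanded ? "" : "none";
  return nowExpanded; // true if the annotation is now visible
}
```

In a page, the "+" would simply be a clickable element whose handler calls toggleSubtext on the hidden annotation beneath the paragraph.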

In a fascinating 2005 interview with Robert X. Cringely, Winer describes his non-magical view of names: he thinks they don't matter!
And, you know, and later on, in 1998, actually, earlier, sorry, I had a - I did work with Microsoft on something called XML R. Then it became XML RPC. Later became SOAP. And we actually had a rule during the collaboration that we're gonna always go for the worst possible name for every element. And that ended all the arguments where people would say, "Well, I've got a better name." I said, "Well, but that's not the point. You could tell me you have a worse name and I would listen to you. But if you've got a better name, I don't wanna know about it."

Speaking of ebooks, Liza Daly is a practitioner of ePub magic. ePub is the emerging standard delivery format for ebooks; it's based on HTML, so developing an ePub ebook is a lot like developing a website. In her talk at IDPF two weeks ago, she described the geo-aware ebook for an audience of ebook technologists. The geo-aware ebook is not Liza's invention, but she certainly summoned it into being for us from out of the HTML5 protoplasm. HTML5 includes new scripting APIs that expose system hooks to an HTML page's JavaScript. Of particular interest here is the interface to geolocation information.

Try it.

Yes, this annotation knows that you are in your underwear, wondering what this is. Congratulations, you have just experienced the world's very first geo-aware subtext annotation. Or maybe you haven't, because your browser (IE?) doesn't support it or because you've declined to give this page permission to access your geographic position.

A nice chapter on HTML5 Geolocation is in Dive Into HTML 5
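The basic pattern looks something like the sketch below. The element id ("geo-subtext") and the message wording are my own inventions, and the success callback fires only if the reader's browser supports geolocation and the reader grants permission.

```javascript
// Minimal sketch of the HTML5 geolocation pattern described above.
// The "geo-subtext" element id and message wording are invented for
// this example.
function describeLocation(position) {
  const { latitude, longitude } = position.coords;
  return "You are reading this near " + latitude.toFixed(2) + ", " + longitude.toFixed(2) + ".";
}

if (typeof navigator !== "undefined" && navigator.geolocation) {
  // The browser asks the reader for permission before invoking the callback.
  navigator.geolocation.getCurrentPosition(
    function (position) {
      document.getElementById("geo-subtext").textContent = describeLocation(position);
    },
    function () { /* permission denied or unavailable: leave the default text */ }
  );
}
```

If permission is denied, or the browser lacks the API, the annotation simply keeps its default "underwear" text, which is exactly the failure mode described above.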

My implementation isn't as sophisticated as Liza's. Feel free to leave a comment if you're using current versions of Chrome, Firefox or Safari and the sentence above still shows "your underwear".

It's an open question whether geo-awareness will be widely used for art as opposed to commerce, but there are sure to be some amazing applications.
The applications to guidebooks and in-ebook advertising are obvious. I imagine a news site akin to Hacker News that ranks stories according to the coordinates of the reader and the contributor.

It's becoming clear to me that the ebook will assume many new forms. Participants in the book industry (publishers, merchants, libraries, editors and of course authors) will need to learn how to manage an increasingly diverse and dynamic product. Even if they don't have "true" names or identifiers for them.

The sorcerers of the "Vook" have many dragons to worry about.


Friday, June 11, 2010

How Electronic Resources Really Get Priced

The recent letter (pdf) from the University of California's California Digital Library (CDL) about price increases proposed by Nature Publishing Group (NPG) and the response from NPG have raised a storm of controversy and rebuttal (pdf). To me, the mess is symptomatic of a communications failure between publishers and librarians.

In the interests of promoting better library-publisher understanding, I've decided to reveal some secrets from both sides.

Libraries: Here's how publishers set pricing for electronic resources.

Once upon a time, pricing for library materials had a relation to the cost of their production. Even before the internet came along, this started to change. Printing costs fell, and more and more of the production costs of a good-quality journal were "first-copy" expenses. With electronic materials, the marginal cost of servicing an additional subscription became almost zero. Pricing then became a game whose object was to sustain existing price levels, along with "reasonable" annual increases of a few percent. (Nature Publishing translates "a few" to "7".)
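It's worth pausing on what "7" compounds to. A quick back-of-the-envelope calculation (the figures here are illustrative, not NPG's actual prices):

```javascript
// Compound a subscription price through years of "reasonable" annual increases.
function priceAfterYears(basePrice, annualIncrease, years) {
  return basePrice * Math.pow(1 + annualIncrease, years);
}

// A $10,000 subscription growing at 7% a year:
// after a decade it has nearly doubled, to about $19,672.
console.log(priceAfterYears(10000, 0.07, 10).toFixed(0));
```

No library's materials budget grows at anything like that rate, which is why "a few percent" is where the fight starts.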

The game was most difficult for very large or complex institutions. The value of a top medical journal to a US medical school is huge; the value of the same journal to a vo-tech school would be much smaller, but still significant. A medical school in a developing country will also need the journal, but it's not fair to ask them to pay the same as a US school. Differential pricing helps a publisher capture value while still extending access to customers who might not otherwise be able to afford the journal. But how, then, to set pricing?

I learned the secret of e-resource pricing through long hours of research (spent mostly in bars). Here's how it works:
  1. Find out how much money the customer has.
  2. Set price somewhat higher than that.
  3. After hard bargaining by customer, offer discount to closely match customer's available funds.
  4. Swear customer to secrecy; you can't give that price to everybody!
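The steps above can be sketched as a tongue-in-cheek "algorithm" (every number and name here is invented for illustration - the bars where I did my research didn't come with source code):

```javascript
// A satirical sketch of e-resource pricing, per the four steps above.
function quoteEresourcePrice(customerBudget, hardBargaining) {
  // Step 1 is a given: customerBudget is what the sales rep found out.
  const askingPrice = customerBudget * 1.25;  // Step 2: somewhat higher than that.
  const finalPrice = hardBargaining
    ? customerBudget * 0.98                   // Step 3: "discount" to match available funds.
    : askingPrice;
  return { askingPrice: askingPrice, finalPrice: finalPrice, secret: true };  // Step 4.
}
```

Note that the customer's budget appears in every line, and the cost of production appears in none of them.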
In times of budgetary cutbacks, this pricing mechanism works to a library's advantage. A library that needs to cut its electronic resource expenditure in half simply needs to disclose to salespeople that its funding has been cut in half, cancel the subscription... and wait for panic to set in.

The library's leverage will never be greater. It's much more painful for a digital publisher to lose a digital customer than it is to lose a print customer. That's because the publisher has to spend money on digital publishing infrastructure when its customer base grows, but doesn't get anything back when the customers go away. If anything, the publisher will have to spend more on sales to try to replace the customers.

Oh, and by the way, libraries, this all works better if things are kept quiet - when you agree to give a publisher a higher price than you wanted to, swear them to secrecy. You can't afford to give the same deal to every one of your publishers!

Publishers: Here's how to get libraries to cave on pricing

Many librarians suffer from feelings of powerlessness. They are prisoners of their patrons' needs and desires. They are captive to changing technologies and archaic standards. They are trapped in arbitrary budget gaps. And they are stuck in endless committee meetings.

Libraries are thus willing to spend a great deal on things which offer escape from powerlessness. They dislike monolithic packages that bundle content together, even if they save money. The current reality in libraries is that budgets have been cut. Publishers need to give their library customers options that help them deal with budget cuts. The smartest publishers can figure out ways to help libraries cut costs and free up funds currently spent in other areas.

When I was developing an electronic resource management service for libraries, I had this recurring nightmare that e-journal publishers would someday make it as easy for libraries to activate and maintain an e-journal subscription as Apple's iTunes makes it to buy and maintain a song. My software would instantly become worthless. Just kidding- I slept soundly knowing it would never happen in a million years.

Perhaps NPG confused powerlessness with weakness and saw an opportunity to force CDL to act like the other prisoners. Perhaps NPG never saw the movie "Cool Hand Luke".

Monday, June 7, 2010

Are Public Libraries in a Death Spiral?

The year and a half that I worked for Intel was marked by a sharp recession in the semiconductor industry. Most of the industry was plagued with a boom and bust cycle; during the busts, most companies laid off staff. Intel had at that point managed to avoid layoffs, and Andy Grove, Intel's hard-charging President, wanted to protect that record. He came up with a program called the "125% solution". As an alternative to layoffs, Grove asked Intel employees to work 50 hours a week instead of 40, without a pay increase.  We were told that we would work our way through the recession by developing new products.

The management of the Technology Development division, where I worked, had a hard time with this plan. At the meeting where they explained the program to us, they anticipated the obvious question. "No, this doesn't mean you should cut back your hours to 50 from the 50-80 hours some of you have been working." Even so, most of us were relieved that there would be no layoffs, and the line workers felt the same way.

I apologize in advance if this article comes off a bit Grove-ish, but I've become increasingly worried about the future of public libraries. It's not just the large library budget cuts that seem to be pandemic across the country (see the list below if you need any confirmation) but it's the way that budget cuts are preventing libraries from addressing the fundamental changes that are occurring in the ways that people interact with books.

A favorite budget-cutting tactic of public library directors seems to be curtailment of opening hours (with branch-closing a close second). To me, this seems like the worst possible thing for a public library to do. It's as if Andy Grove had told us that we needed to get rid of 25% of our customers. Public library funding comes from the public, and the best way to convince the public that their library deserves more funding is to get the public inside the library doors.

Public Broadcasting is a good example for public libraries (and a competitor for donor support). Does public radio turn off its transmitter when it needs money? No, it puts on especially good programming and has pledge drives. My local library puts donor names on bricks; I'd like to see libraries put donor names on opening hours.

Tough economic times are exactly when public libraries are needed the most. The assistance that libraries offer to people looking for work, training for new occupations, learning to read, or finding social networks makes public libraries valuable parts of their communities, but that doesn't happen when the doors are locked. I can imagine a future where public library services are online and always open, but that's not today's reality.

One argument for cutting hours might be that the visibility of these cuts makes it easier to restore funding when the economic cycle turns. This may have been true in the past, but I think that the funding that goes away this year will never come back. The way most people access information has changed profoundly since the last economic downturn. A library that reduces its hours is just training its public to meet information needs elsewhere, and that public isn't going to rush back. The library's funding will get cut again and again, and eventually it will have to close its doors.

The public library of the future has to stop being about collections and start being about helping people and communities. Google can't be a physical place where people offer access, assistance and education for the internet and the information it accesses; local public libraries CAN be. It's these "new" directions that will attract renewed public support and funding into the future. But library directors struggling to deal with budget cuts will find it hard to also reposition their services to meet the needs of an increasingly internet-based society.

Although I don't like cutting hours, it's not like there are lots of alternatives. Libraries with slashed budgets have painful cuts to make. Difficult choices will be made on acquisitions, staff, buildings, user fees, organizational mergers and consolidations. If you're involved with a library that's going through this, don't give up.

Intel survived that recession, and gave us beer mugs to celebrate the end of the "125% Solution". But it didn't escape the endless spiral of economic expansion and contraction. The next contraction cycle combined with competition from Japan to force Intel to lay off many workers and re-examine its businesses. Intel left the DRAM business to focus on microprocessors. You can read the rest of the story at the library. (Or at Amazon)

Funding cuts in libraries (thanks to Gary Price at ResourceShelf, the Library Journal Funding Page, and LISNews for many of the links):