Monday, December 17, 2012

Paying for e-resources per use

On the OEDB blog, Ellyssa Kroski pointed out this Forbes article on the war between libraries & publishers over ebooks.  Although the author, David Vinjamuri, doesn't have a background in librarianship (he mentions that libraries are "already transforming themselves" by providing the same kinds of services libraries have been providing for decades), he does seem to have a grasp on the problems libraries are facing with the large publishers' reluctance to extend ebooks to them.  Phrasing the situation as being "at war", however, is a bit of an exaggeration - he extended the "tug of war" metaphor from the headline of a New York Times piece from last year. But this may be picayune...

I wanted to focus this posting on his suggested solutions, notably the requirement that libraries pay for ebooks (indeed, all electronic resources) per use.  He bases this idea on the premise that electronic resources are licensed and not sold, which is itself the key difference from print books.  Essentially, copyright law allows publishers to treat libraries as resellers of content rather than owners, an assumption he recommends libraries challenge.  Taking his suggestion of a rate between 50 cents and a dollar per use, I calculated the cost of ebooks to public libraries using the mean ebook circulation reported in the latest ALA report on ebook usage in public libraries.  The 44,596 mean "circulations" (itself a difficult concept to apply to ebooks) would have cost $33,447 at the 75-cent midpoint of that range.  This is over three times the amount libraries planned to spend this year on ebooks ($10,400).  With ebook usage expected to increase, this doesn't seem to me to be a sustainable model.
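
Here is a minimal sketch of that back-of-the-envelope arithmetic (in Python).  The circulation figure and the 50-cent-to-one-dollar range come from the sources above; treating 75 cents as the "average" rate is simply my own way of taking the midpoint:

    # Back-of-the-envelope pay-per-use cost for ebooks
    def pay_per_use_cost(circulations, rate_per_use):
        return circulations * rate_per_use

    mean_ebook_circulations = 44596  # mean "circulations" from the ALA ebook usage report

    for rate in (0.50, 0.75, 1.00):
        cost = pay_per_use_cost(mean_ebook_circulations, rate)
        print("${:.2f} per use: ${:,.2f}".format(rate, cost))
    # $0.50 per use: $22,298.00
    # $0.75 per use: $33,447.00
    # $1.00 per use: $44,596.00
    # versus the ~$10,400 libraries planned to spend on ebooks this year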

Admittedly, increases are never infinite, and usage will eventually plateau, much like print circulation has.  So, if I assume future ebook circulation will be similar to print circulation (an assumption fraught with problems, such as different circulation periods), pay-per-use would cost public libraries an average of $200,704 for the 267,606 mean circulations (calculated from the 2009 Public Libraries Survey).  The average amount spent on collections by public libraries in 2009 was about $142,400 (Table 21A).
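
The same sketch, with the same assumed 75-cent midpoint rate, applied to the print-level circulation figure:

    mean_print_level_circulations = 267606  # mean circulations, 2009 Public Libraries Survey
    print("${:,.2f}".format(mean_print_level_circulations * 0.75))  # $200,704.50
    # versus roughly $142,400 in average 2009 collections spending (Table 21A)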

You can see that a pay-per-use model would not likely be sustainable.  It is, in fact, a model from which libraries have been struggling to get away since the very early days of online databases.  The problem with pay-per-use is that there is no way for the library to become efficient.  As collection assessment librarian, I treat cost-per-use as a key measure of our collection's efficiency.  If that measure were fixed by the license, our expenses would be much harder to contain.  Another factor to consider is the moral hazard of having our funds effectively spent by those who do not feel the risk directly (individual members).  By the end of the fiscal year, we would run out of money, and access to resources would have to be restricted.
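
To illustrate that efficiency point with purely hypothetical numbers: when a library owns a title, cost-per-use falls the more the title is used; under a per-use license, cost-per-use stays pinned at the licensed rate no matter how heavy the use.

    # Hypothetical numbers for illustration only
    owned_title_cost = 100.00                  # assumed one-time purchase price
    for uses in (10, 50, 200):
        print(uses, owned_title_cost / uses)   # cost-per-use falls: 10.0, 2.0, 0.5

    licensed_rate = 0.75                       # assumed per-use rate
    for uses in (10, 50, 200):
        print(uses, licensed_rate)             # cost-per-use stays 0.75 regardless of use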

Finally, David ignores libraries' fundamental role of preserving our culture (particularly our written culture), which would not be possible under a pay-per-use model.  Access to electronic books would be at the discretion of the publisher, not the library.

I do fully agree that libraries need to challenge the basic assumption that they are resellers of electronic content rather than owners.  In the meantime, I support the efforts being made for libraries to retain copies of electronic books on locally or consortially managed servers (a la Adobe Content Server and open-source DRM).

Sunday, December 9, 2012

Where good ideas come from...

OK, so I'm a little behind the times here, but my weekly review of TED Talks sometimes goes back in time.  This is because I watch them through my TV, which has a YouTube app.  So rather than seeing the TED Talk videos listed with the most recent first, I see them in order of popularity, and of course, popularity takes time to build.  Which is how I ended up watching the 2010 video, "Where good ideas come from".  I've requested the book at my library as well...but for this posting, I was inspired to write about Steven Johnson's ideas as they relate to libraries.  After all, good ideas do come from libraries, don't they?

Johnson points to shared patterns of innovation, even within biological systems.

Our rich vocabulary of creative moments shares the basic assumption that innovation is a "single thing...a single moment".  But Johnson argues that innovation is a network.  Innovation is simply looking at a problem differently and coming up with something new.  "The spaces that lead to innovative thinking look like this...": Hogarth's painting "Humours of an Election," depicting what Johnson calls "The Liquid Network".

Johnson recalls research conducted by Kevin Dunbar, who used "the Big Brother approach" of recording researchers' conversations, trying to find that "Eureka" moment.  What Dunbar found was that breakthroughs happened at the conference table, when researchers discussed progress & problems and bounced ideas off each other.

Steven Johnson determined that "important ideas have long incubation periods," what he calls "The Slow Hunch."  He uses the example of Charles Darwin who, judging from his copious notes, had all the elements of the theory of evolution months and months before his self-described epiphany, but was unable to fully articulate it.

Johnson concludes that creating ideas requires time to think, as well as opportunities to share hunches and ideas.  We should connect ideas rather than protect them with intellectual property restrictions - the power of open innovation.  He ends with this thought: "Chance favors the connected mind."

I see that Johnson's ideas about, um, ideas justify the changes librarians have been making within their libraries and to their job descriptions.  By opening the library to small groups, by encouraging group projects, and by providing the mild stimulants delivered in coffee and tea, we are providing the environments necessary to connect ideas.  But we are not merely conference rooms - the "liquid" in our networks is not the coffee, but rather the knowledge and information provided via books, articles, databases, reference works, manuscripts, music, images, and maps, as well as locally-generated resources like posters and presentations from students and faculty.  We could further enhance this connectedness by making our resources more accessible, find-able, manipulate-able, and useful to our members.

Monday, December 3, 2012

Perusing interesting ideas on a Sunday Morning

I've developed a new routine these last few weeks of watching TEDTalks on my TV (the convergence of TV & Internet in action) and reading miscellaneous blog entries on Sunday mornings while my better half is enjoying riding his motorcycle on the nearly empty streets.  Of course, my dog would rather I be taking him to the park, but his fun will come later.  This is my time to think and learn.  And here is what I've discovered this week:

If we want to help people, Shut Up and Listen!  Ernesto Sirolli discusses the seemingly obvious lesson learned from decades of failed Western-based aid projects in African nations.  For instance, after he attempted to teach some villagers how to raise Italian tomatoes in a rich valley, all the fruits of their labor were lost to the hippos.  Sirolli sums up why the villagers did not tell him about the hippos - because he didn't ask.

Sirolli takes this lesson and applies it to his own NGO, which nurtures entrepreneurship in developing countries.  His method is to approach a potential client with no preconceived problems to address and no solutions in mind, but rather to listen to what the client wants to accomplish and what solutions he or she has in mind.  While addressed to those in NGOs, this lesson should be (and sometimes is) applied to librarianship.  Consider collection development.  Rather than prescribing the kinds of materials for a particular subject or collection, we should be listening and paying attention to what our readers (faculty and students) use and need.  They may not know the specific items, but they know what they want to accomplish.  What we can provide is knowledge of publishing and literature, as well as storage and retrieval, enabling the selection of and access to the most useful materials that will help them be successful.

From The Scholarly Kitchen, I've read an essay that warns against relying too extensively on "metrics" or even "altmetrics" to measure the quality of scholarly communication.  His alternatives to metrics, or "alt2metrics," address the problem that our current quantitative metrics (e.g. impact factor, Eigenfactor, etc.) are dismissed by the very sources of scholarly communications - the academic researchers: "More often than not, academics and researchers are dismissive of metrics, as they’ve seen how, once you poke at them, they end up being relatively blunt measures with little nuance or depth."  The author, Kent Anderson, then describes what he believes are the factors that researchers pay attention to when evaluating research, namely:

  • Brand (of journal)
  • Authorship (within the journal)
  • Results 
  • Sponsorship

My thought, however, is that these are exactly the kinds of "signals" that the metrics are trying to quantify.  Impact factors are closely associated with brand of journal, as well as sponsorship, while the h-index and citation networks measure author productivity and impact, as well as networking.  Kent is advocating that we (those interested in measuring the quality of science) not ignore the "primary, original signals of value" that "scientists rely on every day to guide them and their searches for information."  This does make sense, but I'm beginning to believe that the Uncertainty Principle applies to many, many more things than physics.  I'm sure I'm not the only one who's come to this conclusion...