
Recession-buoyed Libraries Find New Bubble to Invest In

Turns out it’s poor people!

According to Alison Flood, libraries have elected to stop making all those “it’s the foot-fall”, “we’re community hubs” and even “free internet!” noises as soon as loan figures look like they’re on the up.

Why are they going up? The same reason anything happens these days: because everyone is fiscally boned. Do we expect people to stay fiscally boned? Not really, no. If we did, we’d have bigger problems to deal with. As it is, rising food prices make throwing a couple of books on top look unappealing, and the pretend value of three-for-twos is probably going to convince fewer people for the next few years, but in everyone else’s good times, libraries found that the cheapest best-sellers in the land were the most inconvenient, and keeping Richard & Judy’s Book Club display shelving well-stocked one month just meant more stock to be sold off the next, as well-marketed short-head items refused to grow that extra tail.

Now is a good time for companies to be getting out of mutually-destructive competitions and finding sexy new forms of growth, and it’s going to work out really badly for libraries if this is the moment they decide to get back in on it.

As long as everyone continues to not work out how to monetise anything other than an author’s linear character strings, we’re going to continue fighting over the right to distribute them in their audience’s chosen media, and libraries would do much better to follow a route like that of Aaron Schmidt, Dave Pattern and JISC’s TILE (see David Jennings on their recent “Sitting on a Goldmine” event), using their inherent librariness to add a side or two to basic book meat.

If this is what good times look like for libraries, libraries are really shit.

(Props, of course, to The Onion.)

Filed under: Uncategorized, User experiences

Reading, again (and a metalesson in its perils)

A very odd thing happened to me on my way to writing what I now realise to be a completely unrelated post about user-generated content…

Two exciting things came to my attention: Nate Hill linked me to Aaron Schmidt, specifically his social database mockup, my reaction to which got bogged down in first hearing about, and then waiting to hear more about, Talis’ Project Xiphos, now finally explained better over on their Panlibus blog under the explanatory title “Silos Silos Silos”. There’s a lot to both of them, but broadly the former is a social information site that gives users a set of tools for working with databases which, in turn, create a layer of metadata for other users, and the latter is a set of semweb technologies for opening up multiple datasets to, amongst other things, radically alter the currently insanely annoying experience of federated information access regimes.
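
To make that less abstract: this is roughly what querying across previously separate datasets looks like once they share a common data model, sketched here in Python with rdflib. The silos, the property names and the loan count are all invented for illustration; none of this is Xiphos’ actual vocabulary or data.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/")  # hypothetical vocabulary, purely for illustration

    # One silo: the catalogue.
    catalogue = Graph()
    catalogue.add((EX.book1, RDF.type, EX.Book))
    catalogue.add((EX.book1, EX.title, Literal("Gamer Theory")))

    # Another silo: circulation data.
    circulation = Graph()
    circulation.add((EX.book1, EX.loanCount, Literal(42)))

    # Because both silos speak RDF and share identifiers, merging them is trivial...
    merged = Graph()
    for silo in (catalogue, circulation):
        for triple in silo:
            merged.add(triple)

    # ...and a single query can now span what used to be two separate systems.
    results = merged.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?title ?loans WHERE {
            ?book ex:title ?title .
            ?book ex:loanCount ?loans .
        }
    """)
    for row in results:
        print(row.title, row.loans)  # Gamer Theory 42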

What’s surprising to me is that the initiative to make data interoperable is explicitly a courseware one, and the one to make it social is only implicitly (and slightly reluctantly) so. You’d expect it to be the other way around, which to me firms up the idea that libraries host a specific kind of information use, one they served and encouraged through topic-based physical browsing, and one they now need to support better with new, non-physical (well, non-linearly-physical, or something) technologies. (TANGENTIALLY RELATED ASIDE: last night Stan Cohen, a bright button, said that Google held 338,000 entries on “climate change denial” as if this meant it contained 338,000 items denying climate change. Of course, it does not. It contains 338,000 items discussing the denial of climate change. There’s something about search as the primary form of access, the implication that what you find is exactly what you asked for, that encourages a bias in favour of whatever conclusion you already wanted, I feel. He wouldn’t go to 324.243 and become outraged at the quantity of fascist literature encroaching on academia, would he?)

Final point (WARNING: .pdf): the JISC & SCONUL Library Management Study reports that “[t]he ability to aggregate user behaviour has significant potential for discovery services, based on click streams, context and personalisation. Nevertheless libraries are not yet exploiting intelligence about user habits to enhance their position in the information value chain.” Well, yes.
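
And it’s odd, because the basic version of that exploitation isn’t hard. Here’s a minimal sketch of turning a loan log into “borrowers of this also borrowed” suggestions, with the log, the titles and the field names all invented for the purposes of illustration:

    from collections import Counter, defaultdict
    from itertools import combinations

    # Hypothetical circulation log of (user, item) pairs: the kind of click/loan
    # stream the report says libraries aren't yet exploiting.
    loans = [
        ("u1", "Gamer Theory"), ("u1", "Expressive Processing"),
        ("u2", "Gamer Theory"), ("u2", "Convergence Culture"),
        ("u3", "Gamer Theory"), ("u3", "Expressive Processing"),
    ]

    # Group items by user, then count how often pairs of items share a borrower.
    by_user = defaultdict(set)
    for user, item in loans:
        by_user[user].add(item)

    co_counts = defaultdict(Counter)
    for items in by_user.values():
        for a, b in combinations(sorted(items), 2):
            co_counts[a][b] += 1
            co_counts[b][a] += 1

    def also_borrowed(item, n=3):
        """Return the n items most often borrowed alongside `item`."""
        return [title for title, _ in co_counts[item].most_common(n)]

    print(also_borrowed("Gamer Theory"))
    # -> ['Expressive Processing', 'Convergence Culture']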

Filed under: classification systems, Information use, scholarly communication, Semantic Web, User experiences

Text – user experiences

Ian Bogost’s post about Noah Wardrip-Fruin’s Expressive Processing, the my-peers-are-bloggers,-my-publisher’s-peers-are-academics,-let’s-get-a-bite-to-eat love-in, reminded me of a previous work running on the Institute for the Future of the Book‘s CommentPress: McKenzie Wark’s Gamer Theory.

As a bloggy form, GAM3R 7H30RY showcased the paragraph-by-paragraph readers’ commentary offered by CommentPress by pulling separate paragraphs out and making them note-card-like, but these were a labour to navigate linearly. Yet the book’s structure clearly intended them to be navigated that way: otherwise why group them under chapters that deal both with one theme and one game apiece? I’ve skimmed it, sure, but I couldn’t say I’ve skimmed it very well.

Its treey counterpart, Gamer Theory, however, doesn’t help much. It certainly wins against my laptop on one count: I fell asleep once with it in my bed, woke to find it on the floor, and didn’t burst into tears. But where I couldn’t get into the linear passage of argument in G7, I couldn’t get into the non-linear passage of discussion in GT, whose publisher, Harvard University Press, had selected choice comments and included them as endnotes, along with Wark’s own, more traditional, quasi-parenthetical bibliographical commentary on his own thought processes.

Endnotes kill arguments dead, not because balance and counterpoint don’t have a role to play in argumentative and discursive text, but because those texts’ rhetoric goes from top to bottom, while endnotes go sideways, forwards and backwards, through that text and others. Digital media isn’t going to kill the book; in many ways, like this one, it will create books that are more book than book. But as Bogost notes, current networked text continues to strain against its own limits in ways that don’t yet (convincingly) demonstrate its possibilities.

Filed under: e-text, User experiences