November 10, 2007
At the end of the day in the Public Libraries track, they had a “super-panel” of really smart, really forward-looking folks who took questions and answered them as best they could. Jenny Levine did a great write-up of this session. Take a look at it and see what other public libraries are doing to bring the patrons into their “digital branches” as creators and consumers of content.
November 4, 2007
Oh. My. God. Can I just say wow! Casey Bisson talked about the advantages of an OPAC with Web 2.0 capabilities (comments, tags, etc.) and then demonstrated how easy it can be to get one. First, he laid out the challenges with our current catalogs: the usability, findability and remixability of our content is pretty limited. He also said we’ve learned a few things from the Web 2.0 phenomenon. We have one chance to prove we aren’t stupid – if someone comes in looking for books on the sociology of education and we offer books on watersheds, they are going to think we are stupid (and he showed an example of exactly that happening on a traditional catalog search). Search boxes are for asking questions, and links are citations. As for usability and remixability of our data, we should be offering users ways to reuse our content the way they want to use it (easy linking to our catalog, etc.). He also made the point that sites that allow comments value their users – control isn’t so important when you think of commenting that way. Finally: your website isn’t a marketing tool, it’s a service point. That point was repeated a couple of times in the conference!
The last 11 and 1/2 minutes of the presentation were taken up with the installation and configuration of Casey’s “next-gen OPAC,” Scriblio. He actually did it in real time (with very little prep work done before the session) and showed how quick and easy this OPAC is to set up. As he was demoing it at the end, I thought it looked an awful lot like III’s Encore product – but without the hefty price tag, and with only a tiny bit more upkeep required (automated updates run twice a day would just about do it).
Ruth Kneale started off this talk with a run-down of her library’s search for a content management system that would work for their needs. While we already have one in place for our Intranet, the main website’s CMS isn’t set in stone yet, so I wanted to see what process she used and what conclusion she reached. The upshot: of the systems she looked at, Drupal (the CMS that runs our Intranet) was the best supported, the most feature-rich, and the one with the gentlest learning curve for staff. There were some Joomla/Mambo fans in the audience who challenged some of her assumptions, but for what she wanted to do, Drupal was clearly the best choice (not that I’m biased or anything…). It was a really interesting look at what they plan to do with their CMS and how they chose it. They are planning to go live with it on Jan 1 of ’08.
Darlene Fichter did a great program on creating pretty rich informational applications with almost no programming knowledge. She discussed the “mashup” – an application that uses content from more than one source to create a new service – and gave us some great examples of mashups found in the wild (just a brief overview of some of the 2,456 mashups out there – it was only a 45-minute session, so she couldn’t hit them all). One of the most interesting things she said, for me at least, was that in the new mashup ecosystem, content that can be repurposed and remixed gets used. If we want our content to be used, we have to provide at least a simple API (and that can be as simple as providing the content via RSS) that lets others take our work, add value to it and re-deploy it.
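To make Darlene’s point concrete: the simplest possible mashup is just two RSS feeds merged into one chronological stream. Here’s a minimal Python sketch (the two inline feeds are invented examples standing in for, say, a catalog feed and an events feed – not anything she showed):

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def parse_items(rss_text):
    """Extract title/link/date dicts from an RSS 2.0 document."""
    root = ET.fromstring(rss_text)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title", ""),
            "link": item.findtext("link", ""),
            "date": parsedate_to_datetime(item.findtext("pubDate")),
        })
    return items

def mashup(*feeds):
    """Merge any number of feeds into one stream, newest first."""
    merged = [item for feed in feeds for item in parse_items(feed)]
    return sorted(merged, key=lambda i: i["date"], reverse=True)

# Two tiny example feeds (hypothetical URLs and content).
CATALOG = """<rss version="2.0"><channel>
  <item><title>New arrivals: mysteries</title><link>http://example.org/new</link>
        <pubDate>Mon, 05 Nov 2007 10:00:00 GMT</pubDate></item>
</channel></rss>"""
EVENTS = """<rss version="2.0"><channel>
  <item><title>Author talk on Thursday</title><link>http://example.org/talk</link>
        <pubDate>Tue, 06 Nov 2007 09:00:00 GMT</pubDate></item>
</channel></rss>"""

for item in mashup(CATALOG, EVENTS):
    print(item["date"].date(), item["title"])
```

That’s the whole trick – once your content is out there as RSS, anyone can do this to it, which is exactly why a feed counts as a “simple API.”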
Susan Braun, from the Aerospace Corporation, did a presentation on capturing knowledge in a corporation. This, obviously, has little to do with my job at a public library – though I thought it might have had more from the description. Not a lot to report here, though – at least for me!
In this session the second presenter wasn’t available, so Karen Draper of Adobe talked about her corporate training initiatives. She uses in-person, online and video-based training. The coolest part of all of this, as far as I was concerned, was the video-based training – she can archive it and make it available for anyone to view (or review, as the case may be). She, of course, uses Adobe’s Captivate to create and share her video training sessions. She was able to show us some examples of what she has done, and it was pretty cool. She also discussed marketing the training sessions – both through the paper newsletter and through a training blog that gives her the opportunity to post links to video or online content, to answer questions from people who weren’t at the session, and to keep an archive of all of those resources and questions that she can refer back to as needed.
This session began with two of the people responsible for Hennepin County library system’s awesome bookspace.org site (Glenn Peterson, Web Administrator, & Marilyn Turner, Manager, Web Services & Training). Marilyn started it off with an overview of the goals for their readers’ advisory website (the bookspace.org site). She wanted to bring together, in one place, all the resources for readers that had been scattered throughout the site, and to allow both librarians and patrons to contribute even more content than they already had. They did this by creating a “bookspace coordinator” position supported by a bookspace workgroup (about 5 people) and bookspace contributors (about 30 people), all helping to create the content of the site. The one thing I found really interesting was that these 35 folks were not volunteers – they were required to post to the genre blog they managed at least once a month, as well as create and maintain book lists in their genre, and this work was part of their performance evaluations. Even so, the participants generally found that they only spent 1 to 2 hours a month working on their part of the site.
Glenn then got up and discussed the underpinnings of the site. It is very social: it allows patrons to contribute and share their knowledge on the site. Some of the coolest features it offers are blog comments and book lists (created by users, staff and auto-generated from the catalog). It also lets people know that others are contributing to the site – showing that the site is alive – by featuring the “current activity”, what’s going on right now, and by highlighting the most active contributors/commenters. The site is based on a database-driven model, with RSS everywhere, and it uses ColdFusion as its scripting language. The “takeaways” from this presentation (Glenn wanted to be sure to include these, because each speaker with takeaways got to run their hands through Michael Stephens’ hair…) were:
- Draw on library staff
- Empower your users
- Create opportunities for serendipity
- Let users interact
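The “database-driven, RSS everywhere” model Glenn described is easy to imagine in code: every list the database can produce also renders as a feed. Here’s a hypothetical Python sketch of that idea (the field names are invented, and the real site uses ColdFusion, not Python):

```python
import xml.etree.ElementTree as ET

def booklist_to_rss(list_title, books):
    """Render a database-backed book list as an RSS 2.0 feed string."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = list_title
    for book in books:
        item = ET.SubElement(channel, "item")
        # One feed item per book, linking back into the catalog.
        ET.SubElement(item, "title").text = f'{book["title"]} by {book["author"]}'
        ET.SubElement(item, "link").text = book["catalog_url"]
    return ET.tostring(rss, encoding="unicode")

# Rows as they might come back from a book-list table (made-up data).
books = [
    {"title": "The Name of the Rose", "author": "Umberto Eco",
     "catalog_url": "http://catalog.example.org/record/1"},
]
print(booklist_to_rss("Staff Picks: Mysteries", books))
```

The payoff is that the same database query powers the web page, the feed, and anything a patron wants to build on top of it.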
The slides will be up at http://www.hclib.org/extranet (they say they are there now, but the link is bad) with lots more information than what I was able to write down!
October 31, 2007
Bennett Ponsford and Christina Hoffman Gola from the Texas A&M libraries discussed what they did (surveys, focus groups, etc.) to get information from students on their redesign project. The process of getting respondents was pretty interesting – they used email, Facebook group bulletins and discussion forums to try to get users in for focus groups and surveys. Email was, by far, the most effective way to get folks to help them out. The rest of the presentation covered what they learned, and was mostly specific to the academic library – how undergrads, grads and faculty use the site differently. Not a lot for public libraries there.
The next presenter, though, had a great idea! Erica Reynolds, from the Johnson County Library, got frustrated with her redesign process and decided to take the web group out of the library and on a field trip. They headed to the Nelson-Atkins Museum of Art in Kansas City and used concepts from the 4,000 years of art collected there to recharge their batteries and generate ideas for the redesign. Some of the lessons they brought back:
- Have a backup plan – they kept the old site available via a link so that if something didn’t get moved to the new site, they could still get to the information
- Be bold. Be dynamic. Be human – use pictures of staff, patrons and guests who provide programs at the library.
- When you paint to sell, you paint people – again, use pictures of both staff and patrons to get users interested
- Enliven your collection through reorganization and presentation (uses Novelist to populate “need a story?” feature)
- Technology changes everything – “if a director’s not blogging, we’re like – ‘what are you doing?’”
- Experiment with small studies and prototypes
- A desire for beauty and serenity endures
- We like surprises. Anticipating surprises is even more delicious.
- A good guide enhances the experience exponentially
- Destruction and creation are forever linked
- Never stop innovating
- We can be both prestigious and playful.
I will definitely link to the last presenter’s slides – they were gorgeous and there was a lot more info on them… I just couldn’t write that fast to get it all!
Frank Cervone started this presentation with an overview of what “evidence-based practice” is – essentially, the rigorous use of data from various sources to drive web feature/application/design decisions. He listed the fundamental ideas of evidence-based practice: study a particular problem using focus groups, surveys and direct-observation usability studies; contrast your results with other studies; and then combine the results to better understand the problem and, hopefully, the solution. He went through a discussion of HCI (Human-Computer Interaction) principles and discussed levels of evidence, from peer-reviewed, rigorous studies in the literature to casual anecdotal evidence from public services staff. The “big issue” from his work with all of this is that we should be designing for a world where our users don’t have to come to the library site – because they won’t. We’ll have to be where our users are!
Amanda Hollister did the last bit of the presentation, on her academic library’s use of breadcrumbs. Breadcrumbs track the pages you have visited on a site (we have them at the top of our staffweb pages). They used a dynamic, page-based system from Yasure media that they then customized to keep path data – the routes visitors took to get to various pages on the site – so that they could analyze it and streamline the site in response. The methods – and results – were really pretty interesting, and seemed to provide a LOT more information than the standard web log analyzing software.
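The page-based breadcrumbs Amanda described can be approximated with a little session state: append each page the visitor hits, and if they return to a page already on the trail, truncate back to it instead of duplicating it. A hypothetical Python sketch of that logic (not their actual implementation):

```python
def update_trail(trail, page):
    """Add a page to the breadcrumb trail; revisiting a page
    truncates the trail back to that page rather than repeating it."""
    if page in trail:
        return trail[: trail.index(page) + 1]
    return trail + [page]

def render(trail):
    """Format the trail the way it would appear at the top of a page."""
    return " > ".join(trail)

# Simulated click path through a library site.
trail = []
for page in ["Home", "Research", "Databases", "Research", "E-books"]:
    trail = update_trail(trail, page)
print(render(trail))  # Home > Research > E-books
```

Logging each trail before it’s truncated is what would give you the path data for analysis – you can see not just which pages get hit, but the routes people take to reach them.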
David King, the web branch manager for the Shawnee County library in Topeka, KS, gave an awesome presentation on how to go about using Web 2.0 tools in libraries. He started off with some unfortunate implementations (private MySpace profiles, blogs that hadn’t been updated since 2005, etc.) and talked about how to manage these new tools to serve patrons. His main theme: think through the goals for your library’s use of MySpace, blogs or whatever it is you are using. He also said to think about the services you provide physically and start offering them digitally (virtual reference, blogs to recreate or replace the newsletter, etc.). He then went through the process of deciding on content. His advice – ask for participation, whether actively (hey – will you all leave your suggestions in the comments?) or passively (use action-oriented titles, like “were you here?”, on Flickr sets of events to encourage users to comment and say what they thought of the program). He discussed “best practices” for social tools – such as leaving comments open and editing them as needed, and answering all comments quickly to keep the conversation flowing – and enumerated the many decisions that have to be made when rolling out a Web 2.0 service: who creates (staff, customers, both)? Who manages the content – posting, editing comments, etc.? He ended with a rundown of the specific decisions that need to be made with each popular 2.0 service – who, what and why, mostly.