Jony Ive is a fantastic designer. As a rule, his vision for a device sets the trend for that entire class of devices. Apparently, Jony Ive hates skeuomorphic design elements. Skeuomorphs are those sometimes corny bits of realism some designers add to user interfaces. These design elements reference an application’s analog embodiment. Apple’s desktop and mobile interfaces are littered with them. Its notepad application looks like a notepad. Hell, the hard drive icon on my desktop is a very nice rendering of the hard drive that is actually in my desktop.
Thanks to Frank Nack and Marc Bron, last week I had the opportunity to give a talk in The Netherlands at a NWO CATCH event organized by BRIDGE. NWO is the Dutch national research organization; BRIDGE is a project that explores access to television archives; and CATCH stands for Continuous Access To Cultural Heritage, which is something like an umbrella organization. The meeting was held at the Netherlands Institute for Sound and Vision in Hilversum, a rather interesting building.
Although it was a long way to go for a one-day event, I am grateful to Frank and Marc for the invitation, for their efforts as hosts, and for all the great discussion during the talk, in the breaks between sessions, and, of course, over beers in the evening. It’s great to be able to make such connections; hopefully more collaboration will follow.
For those interested, here are the slides of my presentation, which expands a bit on my earlier blog post about using the history of interaction to improve exploratory search.
Exploratory search is an uncertain endeavor. Quite often, people don’t know exactly how to express their information need, and that need may evolve over time as information is discovered and understood. This is not news.
When people search for information, they often run multiple queries to get at different aspects of the information need, to gain a better understanding of the collection, or to incorporate newly-found information into their searches. This too is not news.
The multiple queries that people run may well retrieve some of the same documents. In some cases, there may be little or no overlap between query results; at other times, the overlap may be considerable. Yet most search engines treat each query as an independent event, and leave it to the searcher to make sense of the results. This, to me, is an opportunity.
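The opportunity described above can be made concrete with a small sketch. Assuming a session is just an ordered list of queries with their ranked result lists (the query strings and document IDs here are hypothetical placeholders), a search interface could flag which results the searcher has already retrieved earlier in the session, rather than treating each query as an independent event:

```python
def annotate_session_results(session_queries):
    """session_queries: list of (query, ranked_doc_ids) in session order.
    Returns a list of (query, [(doc_id, seen_before)]) pairs."""
    seen = set()
    annotated = []
    for query, doc_ids in session_queries:
        # Mark each result as new or previously retrieved in this session.
        marked = [(doc_id, doc_id in seen) for doc_id in doc_ids]
        seen.update(doc_ids)
        annotated.append((query, marked))
    return annotated

# Hypothetical two-query session with overlapping results.
session = [
    ("exploratory search", ["d1", "d2", "d3"]),
    ("exploratory search evaluation", ["d2", "d4", "d1"]),
]
for query, results in annotate_session_results(session):
    new_docs = [d for d, seen_before in results if not seen_before]
    print(f"{query}: {len(new_docs)} new of {len(results)} results")
```

This is only the bookkeeping half of the problem, of course; the interesting design questions are about how to surface the overlap to the searcher.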
We are happy to announce that the 2012 Human-Computer Information Retrieval Symposium (HCIR 2012) will be held in Cambridge, Massachusetts October 4 – 5, 2012. The HCIR series of workshops has provided a venue for discussion of ongoing research on a range of topics related to interactive information retrieval, including interaction techniques, evaluation, models and algorithms for information retrieval, visual design, user modeling, etc. The focus of these meetings has been to bring together people from industry and academia for short presentations and in-depth discussion. Attendance has grown steadily since the first meeting, and as a result this year we have decided to modify the structure of the meeting to accommodate the increasing demand for participation.
Update: This intern slot has been filled.
It’s intern season again! I am looking for a PhD student well-versed in persuasive/affective computing/captology literature to participate in a research project related to improving the quality of interaction in information seeking environments. The goal of the project is to explore how to increase people’s engagement with systems while performing exploratory search. We would like to improve our current system to make it more usable and to explore some novel interaction techniques.
Applicants should be familiar with basic tactics of designing affective and engaging interfaces in a web-based environment. The internship will last three months, and will be structured to produce and evaluate research systems. As a further incentive, we expect to publish the results of this work at CHI 2013, which will be held in Paris. For more information on the intern process, please see the FXPAL web site, or contact me directly. I would like to fill this internship slot as soon as possible.
Google recently unveiled Citations, its extension to Google Scholar that helps people to organize the papers and patents they wrote and to keep track of citations to them. You can edit metadata that wasn’t parsed correctly, merge or split references, connect to co-authors’ citation pages, etc. Cool stuff. When it comes to using this tool for information seeking, however, we’re back to that ol’ Google command line. Sigh.
We are about to deploy an experimental system for searching through CiteSeer data. The system, Querium, is designed to support collaborative, session-based search. This means that it will keep track of your searches, help you make sense of what you’ve already seen, and help you to collaborate with your colleagues. The short video shown below (recorded on a slightly older version of the system) will give you a hint about what it’s like to use Querium.
So Microsoft is suing Barnes & Noble for patent infringement. Well, that’s what patents are for: the right to sue. And that’s what licenses are for: the right to avoid getting sued. The only thing is, if you’re going to sue someone with half a brain, you should at least make sure your patent is reasonably solid.
With that said, one of the patents that Microsoft claims the Nook is violating deals with annotating documents, an area I know a bit about. The patent, filed in December 1999, claims a system and method for associating annotations with a non-modifiable document. The idea is that file positions in the document associated with user-selected objects are used to retrieve annotations from some other location, and to display these annotations for the user.
Sounds obvious, no? So obvious, in fact, that when we built such a system in 1997, we didn’t bother patenting this.
Tobii and Lenovo presented a laptop with a built-in eye tracker at CeBit last week. The eye tracker allows the user to control the laptop, for instance selecting files to open and selecting the active window from an Exposé-like view. Engadget has a video of a demonstration of the eye control on the laptop here. I wish I could get my hands on it for some testing. A laptop with a built-in eye tracker certainly has potential, from making eye tracking easier and more accessible for disabled users, to making usability testing with eye tracking more flexible by allowing usability specialists to move from their labs into the field.
Thanks to Mor Naaman, I came across an interesting blog post by Justin O’Beirne that analyzed the graphic design of several different maps — Google, Bing, and Yahoo — to show why Google maps tend to appear easier to read and to use. The gist of the analysis is that legibility is improved through a number of graphical techniques that in combination produce a significant visual effect.
And of course knowing Google, this stuff was tested and tested and tested to get the right margins around text, the right gray scale for the labels, the right label density, etc.
So why did Justin have to reverse-engineer this work to understand it?