
Human-Computer Interaction

Submitted by:
Dr. Andrew Dillon
Associate Professor of Information Science and
Core Cognitive Science Faculty member

When I was an undergrad in psychology in the dim and distant 1980s, I remember, as my final year neared completion, thinking about what I could actually do with that qualification. While colleagues and friends with business and engineering degrees had a ready-made career mapped out, the path ahead for a psychologist looked bleak. Certainly I had covered the more interesting material as a psych major, but it was the grads with professional training who were getting the job offers.

Coupled with this was a concern, raised in my mind throughout my undergraduate days, that psychology didn’t seem to equip me with anything special that I could contribute to the real world I was about to enter. Certainly I could discriminate the scientific from the popular, I knew experimental methods, and I could analyze statistics, but I worried that all the good theoretical work I had stuffed into my brain over the previous few years was not going to offer much by way of practical skills I could sell to an employer.

Rather than do the sensible thing and get a Master's in Computer Science or an MBA, I opted to do the apparently insane: I registered for more psychology as a grad student. However, this time, my advisor set me looking at the emerging world of human-computer interaction (HCI) and convinced me that my search for a way of having a meaningful impact through the application of cognitive psychology lay in this direction. In so doing he set in motion a chain of events that has landed me here.

Looking back on the last 15 years of software design, it is amazing to see how concerns with user cognition and behavior have shifted from a peripheral academic endeavor to a mainstream focus of all developers. Every software company now seeks to be 'user-centered', and the importance of understanding the usability and acceptability of new computers is emphasized throughout product development. In a comparatively short space of time, the study of cognition has become central to the development of future technologies.

From the vantage point of 1999, one might wonder how it could ever have been different. The technology is everywhere, the interfaces an equal mixture of the brilliant and the baffling, and routine design flaws cause uproar, flame-wars and frustration. Since the majority of users of computers are not technical experts, the interfaces that enable them to perform tasks need to be designed with such users' expectations and capabilities in mind. This much is generally agreed upon.

However, agreement is one thing; following it through so that computers are actually designed for human use is much more difficult than it sounds. Typical computer science graduates make great programmers, but they do not necessarily make good user-centered designers. Making an artifact easy to use turns out to be an extremely difficult and skill-laden task. First, even if one understands that intended users do not have the time or motivation to master their information tools, it is no simple matter to create a set of commands or icons that convey their meaning to users on first exposure. Second, even if one acknowledges that computers are tools that should help people perform their work, it is no easy matter to study human activities and derive clear design implications from such observations. Third, even if one realizes that testing new designs is important, it requires skill to devise a test method that will capture reliable and valid measures of user performance.

The field of HCI has grown up over the last 20 years trying to solve these problems. In so doing, HCI has drawn on, extended, and re-packaged a century of research on cognition, offered a testing arena for cognitive theory to show its relevance, and afforded cognitive scientists careers in the software industry that are rewarding, research-oriented, and have an impact on real designs. Right now, the demand for people with these skills outstrips the supply, and that is something even I could not have foreseen as a student in the early 1980s. At IU we teach HCI-relevant courses across several departments. At SLIS we run a graduate program in Information Science that is heavily HCI-oriented. In Cognitive Science we offer a Certificate in HCI to doctoral students. IU also now offers a standalone PhD minor in HCI. As the Informatics school takes shape, HCI will no doubt be offered at the undergraduate level as well. Even for those of you not interested in a career in this field, HCI offers a window on the practical impact cognitive science can have in real design projects.

So, a degree in cognitive science is actually a path to a range of careers, with HCI being just one of the more obvious. What you learn in a disciplined program of study might not appear to transfer immediately to the job market until you stand back and think about what it would take to make software better. There is no simple answer to that question, but the range of answers that might follow will no doubt invoke representation, perception, learning, memory, language, and action, the central issues of cognitive science. In comparison with knowledge of cognition, programming skills are easier to obtain and have a far shorter shelf-life!