All eyes on the SLIS lab

Denise Wright

Usability lab business manager Aaron Rosenberg sets up an eye-scanning test to show the hot spots on a Web site. Hot spots are places on a site where the eye tends to focus the most. KATIE ROUPE | DAILY KENT STATER


A select group of Kent State students is getting paid to search the Web at the School of Library and Information Sciences Usability Lab.

“The lab is a cutting edge facility for studying human-computer interaction,” said Aaron Rosenberg, usability lab business manager.

The SLIS Usability Lab, located on the third floor of the library, is home to an assortment of high-tech software and hardware that researchers use to monitor video, audio, keystrokes and on-screen activity during research studies. The lab is also well-known for its Tobii eye-tracking system, which captures participants’ eye movements.

Lab researchers use this equipment to run a small business known as Scanpath. Rosenberg said Scanpath gets most of its business from companies that approach the lab researchers about conducting studies to test their Web site designs.

The system tracks a participant’s gaze to record exactly what he or she is looking at on the screen. The results help designers organize Web sites better and display information to viewers more effectively.

“There aren’t too many of these types of labs in the area,” Rosenberg said. “We have technology that not too many people have and that gives us a competitive edge.”

Rosenberg said the majority of the lab’s business comes from EBSCOhost, a popular research database. He said the lab attracts a great deal of local business as well.

“We recently took our eye-trackers to the Progressive headquarters (in Cleveland) and did a demonstration on how the eye-tracking equipment could be used in their lab,” he said. “They were so excited that they didn’t hire us; they bought their own eye-tracking equipment.”

Rosenberg said the lab also does a great deal of work for Ohio public libraries, and researchers are currently setting up a contract to help redesign a local public library’s Web site.

After researchers set up a contract with a company, they spend two to three weeks writing up the study and recruiting participants for it.

“It takes a couple of weeks to set up, but we usually only spend two days bringing participants in for a study,” Rosenberg said.

To encourage participation, Rosenberg said the lab requires every company it works with to provide an incentive, usually a gift card, for participants.

“We make sure every participant is compensated for their time,” he said. “They’re usually given $20-$40 an hour, but we have had clients give $100 an hour for more specific studies.”

Junior psychology major Megan McElroy participated in a Web site usability study and said she would be willing to participate again. The study compared two academic search engines by giving participants a search scenario and seeing which site was more helpful.

“It was fairly quick, easy and we got paid,” she said.

Emily Johnson, a freshman middle childhood education major, participated in the same study. She agreed with McElroy, saying it was a “fun and easy way to earn money.”

While Johnson said she was completely comfortable working with the eye-tracking device, McElroy said it made her slightly uncomfortable at first, but she was able to relax after some time.

“The researcher was very kind and made me relax by offering an alternative way of doing something or suggesting I try something else,” she said.

Rosenberg said it’s sometimes difficult to tell if participants are feeling any anxiety about working with the eye-trackers.

“Sometimes people get nervous when they come in because they think they’re being tested,” he said. “It’s hard not to be when you’re being closely observed.”

Although Rosenberg said he understands why participants may be anxious, he also said they have reason to be at ease.

“It’s not a test of how well you use the interface, but how well the interface works for you,” he said. “The only thing that can fail is the interface. If you’re having trouble with the interface it’s because there’s a problem with the interface design.”

After completing the study, participants are shown on-screen results of their session. These results show where they looked on each page, how long they looked at a particular spot and in what order they viewed each item.

Rosenberg said most studies require about 10 participants and are open to everyone, not just students.

Contact features reporter Denise Wright at [email protected].