
Digital Libraries/Interaction design, usability assessment




Module name


Interaction Design & Usability Assessment

Scope


This module covers the basic concepts and processes related to interaction design for digital libraries. Methods of user analysis, task analysis, prototype development, and usability testing will be discussed.


Learning objectives:


By the end of this module, the student will be able to:

a. Explain the basic concepts and processes related to designing interfaces for digital libraries.
b. Develop user personas and benchmark tasks for usability testing of DL interfaces.
c. Conduct a heuristic evaluation of the usability of a digital library.


5S characteristics of the module:

a. Society: DL interfaces are designed based on the needs of the user community of digital libraries.
b. Scenarios: Scenarios related to user activities in digital libraries can be reformatted as user tasks and used for testing the usability of the digital libraries.
c. Spaces: DL systems and their users interact with one another through the user interface, which is a space between the system and users.
d. Structures: Usability evaluation methods have structure. For example, the cognitive walk-through is structured so that analysts ask questions step by step while examining the DL system. A well-structured user interface design can increase the usability of the system.
e. Streams: N/A


Level of effort required:

a. Class time: 3 hours
b. Course preparation for students: 2 hours / session
i. Mostly associated with readings (see the reading list for students in the Resources section below)
c. Course assignment completion: 3 hours


Relationships with other modules:

a. 5-b: Application software
5-b covers the features and technologies of DL application software. These software packages provide interface templates that can readily be applied when building digital libraries, as well as the architectural infrastructure to manage the sources.
b. 6-a: Info needs, relevance, 6-b: Online info seeking behavior and search strategy
In order to design effective information systems for digital libraries, it is important to learn how users understand systems and perform tasks before developing the user interfaces. Thus, user characteristics and the cognition involved in using digital libraries are covered in 6-a and 6-b. It is recommended that 6-a and 6-b be taught before the current module, 6-d: Interaction Design.
c. 6-e: Information Visualization
6-e discusses advanced features and technologies of interface design, mostly concerned with the visual impact and use of DL sources, while the current module, 6-d, covers the basic processes and skills needed to design and evaluate interfaces.
d. 9-c: DL evaluation, user studies
9-c covers methods for evaluating a digital library as a whole, such as its outcomes, impacts, and benefits. The current module, 6-d, can be considered a subset of the user studies discussed in 9-c, focusing on user-information interaction.

Prerequisite knowledge required:

a. No prerequisite courses are required.
b. No programming or other technical skills are necessary.


Introductory remedial instruction:

a. None

Body of knowledge:


Session 1: Interaction Design & Usability Test

1. Interaction Design in Digital Libraries
a. Visualizing what appears on the screen of a digital library
b. Identifying how users manipulate, search, browse and use objects in a digital library
c. Enhancing effective interaction among the components of digital libraries
d. The interacting design elements of digital libraries
Figure 1: The Cascade of Interactions in the Digital Library Interface (Bates, 2002)
2. The Process of Interaction Design
Lifecycle of Interface Design
Key Principles for User-Centered System Design (Available at: http://www.it.uu.se/research/hci/acsd/KeyPrinciplesPoster-v.1.2en.pdf)
a. Analyze requirements and user needs
b. Design for usability by prototyping
c. Evaluate use in context
d. Feedback: plan the next iteration
3. Identifying users' needs and establishing corresponding requirements
a. User Analysis
i. Scope/definition of user group
ii. User characteristics
  • Physical characteristics
e.g., age, sex, perceptual abilities and handicaps, motor skills and physical disabilities, etc.
  • Knowledge and experience
e.g., level of education, reading level, native language, knowledge of particular (domain-specific) terminology, etc.
  • Computer/IT experience or knowledge
e.g., computer literacy (naive, novice, skilled, expert), level of experience with similar systems, level of experience with other systems
  • Level of experience with the task
  • Psychological characteristics
e.g., attitudes, motivation to use the system
b. Developing user personas
i. Understanding information needs and information seeking behaviors of users
(See, module 6-a and 6-b for detailed information)
ii. Building personas
  • Interviews with real users
Pointers for developing useful personas (Head, 2003, p. 17)
  • Categorizing user types
e.g., Primary, Secondary, and Others
  • Annotating user characteristics according to the categories
c. Specifying User Tasks
i. Understanding typical DL tasks
  • The frequency or timing of the task
  • The complexity and difficulty of the task
  • The relationship of the task to other user tasks
  • The physical environment of task performance
  • The social, organizational and cultural environment of the task
ii. Challenges in developing effective tasks for usability tests of digital libraries (Notess, Kouper, and Swan, 2005)
  • It is necessary to develop and refine tasks iteratively through testing different versions.
  • Real user needs are complex, so it is hard to create representative tasks focusing on testing a particular feature.
  • Unrealistic user tasks can make users hesitant and reluctant to carry them out.
4. Designing an interface
a. Applying the empirical evidence obtained from user analysis to the design
b. Making design decisions in every step of the process (Hutchins, Hollan, & Norman, 1986)
i. Planning phase
  • Designing interfaces in order to achieve user goals
ii. Translation phase
  • Transforming user behavioral intentions to specific actions
iii. Physical actions
  • Representing the planned sequence of actions to interfaces
iv. Assessment
  • Evaluating whether users sense and interpret the interfaces as intended
  • Evaluating whether the outcomes achieve the user goal
c. Constructing a prototype of the interface
i. Prototype: "A limited representation of a design that allows users to interact with it and to explore its suitability" (Rogers & Preece, 2007, p. 241)
ii. Making design decisions concrete by building them into the prototype
d. Considering universal accessibility
i. Physical disability
ii. Internationalization/Cultural differences
iii. Language barriers
iv. Computer/Network Illiteracy
5. Evaluating the design
a. Usability criteria applied to DL design
i. Jakob Nielsen's five usability attributes (Nielsen, 1993)
  • Learnability (easy to learn)
  • Efficiency (efficient to use)
  • Memorability (easy to remember)
  • Errors (a low error rate)
  • Satisfaction (pleasant to use)
ii. Kling and Elliott's usability for digital libraries (Kling & Elliott, 1994)
1. Interface usability (4 attributes from Nielsen (1993))
  • Learnability
  • Efficiency
  • Memorability
  • Errors
2. Organizational usability
  • Accessibility
  • Compatibility
  • Integrability into work practices
  • Social-organizational expertise
iii. User acceptance of digital libraries (Thong, Hong, & Tam, 2002)
1. Interface characteristics
  • Terminology clarity
  • Screen design
  • Navigation
2. Organizational context
  • Relevance of the information systems to users' information needs
  • System accessibility
  • System visibility
3. Individual differences
  • Computer self-efficacy
  • Computer experience
  • Domain knowledge
b. Usability Evaluation by Analysts: Intended for formative evaluation
i. Heuristic Evaluation (Blandford et al., 2004)
  • Quick, cheap, easy, and the most popular evaluation method
  • Working through every page or screen of a system
  • 3-5 analysts are recommended to examine the system (Nielsen, 1994)
  • Revising prototypes based on problems identified and prioritized during the evaluation
(For more detailed information about how to conduct a heuristic evaluation, please go to useit.com at http://www.useit.com/papers/heuristic/.)
ii. Cognitive Walk-through (Wharton, et al., 1994)
  • User-centered perspective in design. A team of analysts walks through tasks, first answering the following questions:
1. Who will be the users?
2. What tasks are to be analyzed?
3. What is the correct action sequence for each task?
4. How is the interface defined?
  • Each analyst evaluates the interface, considering the following questions:
1. Will the user try to achieve the right goal? Does the interface cause confusion about the goal?
2. Will the user notice that the correct action is available?
3. Will the user associate the action with their goal?
4. If the correct action is performed, will the user see that progress is being made towards the goal?
  • The notes from each analyst are compared and analyzed.
  • Revising prototypes based on problems identified and prioritized during the evaluation
iii. Claims Analysis (CA)
  • Claims are "statements about the positive and negative effects of a design on the user within a particular context of use (a 'scenario')" (Blandford, et al., 2004, p. 31)
  • Less structured than Cognitive Walk-through, but more structured than Heuristic Evaluation
  • Creating scenarios first.
  • Analysts walk through the major features of the design while generating positive and negative claims about the design.
  • Analyzing the claims and identifying ways to improve the design in response to them.
c. Usability Evaluation through Usability Testing: intended for formative or summative evaluation
i. Subset of user studies of DL evaluation
(For more detailed information about user studies, go to module 9-c: DL Evaluation, User studies)
ii. Usability Testing (Nielsen, 1993)
  • Test goals, plans and budget
  • Getting test users (novices vs. experts)
  • Selecting test experimenters (facilitators)
  • Stages of a Test: Preparation, Introduction, Running a Test, Debriefing
  • Think-aloud protocols (e.g., constructive interaction, retrospective testing)
  • Lab Setting
Figure 20. Floor plan for a hypothetical, but typical, usability laboratory (Nielsen, 1993, p. 201)
  • Videotaping a user test
  • Screen capturing
iii. Use of the results
  • Interpreting and analyzing collected data (a minimal Python sketch of such an analysis follows this outline)
  • Making recommendations based on data analysis
  • Prioritizing the problems to change
  • Revising prototypes (interfaces)
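
The outline above ends with interpreting and analyzing collected data. Below is a minimal Python sketch, not part of the original module materials, of how raw session logs from a usability test might be summarized against Nielsen's attributes: success (task completion), efficiency (time on task), errors, and satisfaction. All field names and the sample data are illustrative assumptions.

```python
# A minimal sketch (illustrative assumptions throughout) of summarizing
# usability-test session logs against Nielsen's usability attributes.
from dataclasses import dataclass
from statistics import mean


@dataclass
class SessionResult:
    participant: str
    task_id: str
    completed: bool     # task success
    seconds: float      # time on task (efficiency)
    errors: int         # observed slips and mistakes (error rate)
    satisfaction: int   # post-task rating, 1 (worst) to 7 (best)


def summarize(results: list[SessionResult]) -> dict:
    """Aggregate raw session logs into simple usability measures."""
    return {
        "success_rate": mean(1.0 if r.completed else 0.0 for r in results),
        "mean_seconds": mean(r.seconds for r in results),
        "mean_errors": mean(r.errors for r in results),
        "mean_satisfaction": mean(r.satisfaction for r in results),
    }


sessions = [
    SessionResult("P1", "find-earthquake-video", True, 95.0, 1, 6),
    SessionResult("P2", "find-earthquake-video", False, 240.0, 4, 3),
    SessionResult("P3", "find-earthquake-video", True, 130.0, 0, 5),
]
print(summarize(sessions))
```

In practice such a summary would be computed per task and per user group (e.g., novices vs. experts) before prioritizing the problems to change.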

Resources


Reading for Students

i. Bates, M. J. (2002). The Cascade of Interactions in the Digital Library Interface. Information Processing and Management, 38, 381-400.
ii. Borgman, C. L. (2003). Designing digital libraries for usability. In Bishop, A. P., Van House, N., & Buttenfield, B. P. (Eds.), Digital Library Use: Social Practice in Design and Evaluation (pp. 85-118). Cambridge, MA: The MIT Press.


Recommended Readings


1. Interaction Design

i. Arms, W. (2000). Chapter 8: User interface and usability. In Digital Libraries. Cambridge, MA: The MIT Press.
ii. Carroll, J. M., & Rosson, M. B. (1992). Getting around the task-artifact cycle: How to make claims and design by scenario. ACM Transactions on Information Systems, 10(2), 181-212.
iii. Head, A. (2003). Personas: setting the stage for building usable information sites. Online, 27(4), 14-21.
iv. Notess, M., Kouper, I., & Swan, M. B. (2005). Designing effective tasks for digital library user tests: lessons learned. OCLC Systems & Services, 21(4), 300-310.
v. Rogers, Y., & Preece, J. (2007). Interaction Design: Beyond Human-Computer Interaction. New York, NY: John Wiley & Sons.
vi. Shneiderman, B., & Plaisant, C. (2004). Designing the User Interface: Strategies for Effective Human-Computer Interaction (4th ed.). Reading, MA: Addison-Wesley.
vii. Hutchins, E. L., Hollan, J. D., & Norman, D. A. (1986). Direct manipulation interfaces. In Norman, D. A., & Draper, S. W. (eds.), User Centered System Design. Hillsdale, NJ: Lawrence Erlbaum, 87-124.

2. Usability Testing

i. Blandford, A., Keith, S., Connell, I., & Edwards, H. (2004). Analytical usability evaluation for digital libraries: a case study. Proceedings of the Joint ACM/IEEE Conference on Digital Libraries (pp. 27-36).
ii. Buttenfield, B. (1999) Usability evaluation of digital libraries. Science and Technology Libraries, 17(3-4), 39-59.
iii. Keith, S., Blandford, A., Fields, R. & Theng, Y. (2002). An investigation into the application of Claims Analysis to evaluate usability of a digital library interface. In A. Blandford & G. Buchanan (Eds.) Proc. Workshop on Usability of Digital Libraries at JCDL.
iv. Kling, R., & Elliott, M. (1994). Digital library design for usability. Proceedings of the First Annual Conference on the Theory and Practice of Digital Libraries, 146-155.
v. Nielsen, J. (1993). Usability engineering. Boston, MA: Academic Press.
vi. Nielsen, J. (2008a). Accessibility Is Not Enough. Retrieved April 18, 2008 from USEIT Website: http://www.useit.com/alertbox/accessibility.html
vii. Nielsen, J. (2008b). How to Conduct a Heuristic Evaluation. Retrieved April 18, 2008 from USEIT Website: http://www.useit.com/papers/heuristic/heuristic_evaluation.html
viii. Norlin, E. (2002). Usability Testing for Library Web Sites: A Hands On Guide. ALA: Washington, D.C.
ix. Shneiderman, B. (1998). Designing the User Interface: Strategies for Effective Human Computer Interaction (3rd ed.). Reading, MA: Addison-Wesley.
x. Thong, J., Hong, W., & Tam, K. (2002). Understanding user acceptance of digital libraries: What are the roles of interface characteristics, organizational context, and individual differences? International Journal of Human-Computer Studies, 57(3), 215-242.
xi. Wharton, C., Rieman, J., Lewis, C., & Polson, P. (1994). The cognitive walkthrough method: A practitioner's guide. In Nielsen, J., & Mack, R. L. (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons.

3. Think-Aloud Protocols

i. Boren, M. T., & Ramey, J. (2000). Thinking aloud: Reconciling theory and practice. IEEE Transactions on Professional Communication, 43(3), 261-278.
ii. Van Someren, M. W., Barnard, Y., & Sandberg, J. (1994). The Think Aloud Method: A Practical Guide to Modelling Cognitive Processes. London: Academic Press.

Concept map

None

Exercises / Learning activities

1. In-class Exercise I: Developing User Personas
a. This is a group activity. Students should work in groups of 3 or 4. Select a digital library that the group wants to use and evaluate for the exercise. Identify the primary users of the DL (and secondary users, if appropriate). Develop two personas per user type (a novice user and an advanced user).
b. Personas can be designed in various ways, depending on which digital library is evaluated and how the evaluation tasks are designed. Here is an outline of what each persona should include. (The list of items was originally developed by Cooper (1999) and modified by Head (2003).) If appropriate, students can add or delete items from the following list when developing their own personas for this exercise; a minimal Python sketch of such a persona template follows the references below.
i. A name
ii. Age
iii. A Photo
iv. Personal information, including family and home life
v. Work environment
vi. Computer proficiency and comfort level with using the Web
vii. Pet peeves and technical frustrations
viii. Attitudes
ix. Motivation or "trigger" for using a high-tech product
x. Information-seeking habits and favorite resources
xi. Personal and professional goals
xii. Candid quotes
c. After the group activity, each group will present the digital library it targeted and the users its personas represent. The processes of and challenges in developing personas will then be discussed in class.
d. References
i. Cooper, A. (1999). The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity, Indianapolis: Sams.
ii. Head, A. J. (2003). Personas: setting the stage for building usable information sites. Online, 27(4), 14-21.
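
As mentioned in item b above, here is a minimal Python sketch, assuming a machine-readable persona template is wanted, that mirrors the Head (2003) outline. Every field name is an illustrative assumption; groups may add or remove fields just as the exercise allows.

```python
# A persona "template" mirroring the outline above; all fields are illustrative.
from dataclasses import dataclass, field


@dataclass
class Persona:
    name: str
    age: int
    photo_url: str                   # stand-in for "a photo"
    personal_info: str               # family and home life
    work_environment: str
    computer_proficiency: str        # comfort level with using the Web
    pet_peeves: list[str] = field(default_factory=list)
    attitudes: str = ""
    motivation: str = ""             # "trigger" for using a high-tech product
    info_seeking_habits: str = ""
    goals: str = ""                  # personal and professional goals
    quotes: list[str] = field(default_factory=list)
    user_type: str = "primary"       # primary, secondary, or other


# A hypothetical novice user of a video digital library.
novice = Persona(
    name="Dana",
    age=19,
    photo_url="https://example.org/dana.jpg",
    personal_info="First-year undergraduate living on campus",
    work_environment="Campus library and dorm room",
    computer_proficiency="Novice; browses the Web daily, rarely searches databases",
    motivation="Needs video clips for a class presentation",
)
print(novice.name, novice.user_type)
```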
2. In-class Exercise II: Developing user tasks
a. This is a group activity. Form a group with the same students from the persona exercise, and use the same digital library that the group used for developing the user personas. Having created the user personas, the group should already have a general idea of what users do in the digital library. In this exercise, the group has the chance to articulate the expected user activities and to create user tasks pertinent to the digital library.
b. Two tasks should be designed. After creating the tasks, analyze them using the following guideline: write the purpose of the study, instructions for what to do in the context of the digital library, and example tasks. (A minimal sketch of how such tasks might be recorded appears after this exercise.)
c. To illustrate the finished product, here are the tasks used for a study of the Open Video repository:
i. Imagine that you are an emergency response officer in California and that you are developing an online tutorial on how to respond to an earthquake. You would like to illustrate the tutorial with several videos showing the damage that can be caused by an earthquake.
ii. Imagine that you are a geography professor and are developing a presentation for your introductory class on the differing roles of rivers. You'd like to show clips from recent videos (since 1990) of several different rivers.
iii. Imagine that you are a video enthusiast, having studied video production techniques since you were in your teens. You are interested in creating a montage of a selection of the really early films from the Open Video collection that are most popular with users of the site.
iv. Imagine that you are a history professor, teaching a course on the history of technology in the U.S. You want to find some videos that illustrate America's growing obsession with cars/automobiles between 1930 and 1950.
d. After creating the tasks, groups will be paired up and each group will perform the tasks from the other group. After this experience with writing tasks and completing assigned tasks, the class should discuss the processes and challenges in developing, performing and analyzing DL search tasks.
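
As referenced in item b above, here is a minimal Python sketch of how a group might record its benchmark tasks so they can be reused and analyzed across evaluation sessions. The structure, the target_feature field, and the time limit are assumptions for illustration; the example paraphrases task (i) from the Open Video list above.

```python
# Recording benchmark tasks for reuse; structure and fields are assumptions.
from dataclasses import dataclass


@dataclass
class BenchmarkTask:
    task_id: str
    scenario: str             # the "imagine that you are ..." framing
    goal: str                 # what counts as successful completion
    target_feature: str       # the DL feature the task is meant to exercise
    time_limit_seconds: int   # cutoff after which the task is scored a failure


tasks = [
    BenchmarkTask(
        task_id="earthquake-tutorial",
        scenario="Emergency response officer building an online earthquake tutorial",
        goal="Locate several videos showing damage caused by an earthquake",
        target_feature="keyword search and result browsing",
        time_limit_seconds=300,
    ),
]

for t in tasks:
    print(f"{t.task_id}: tests {t.target_feature}, limit {t.time_limit_seconds}s")
```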
3. Homework Assignment: Heuristic Evaluation
a. The instructor will introduce a digital library to the class. The selected digital library should be in the process of being built, e.g., it is available only as a prototype, or has only recently been released. Students will form groups of 2-3 people, and each group will carry out a heuristic evaluation of the digital library.
b. Each group should conduct an evaluation of the selected digital library, identifying the ways in which its design violates one or more of the usability heuristics proposed by Jakob Nielsen:
i. Visibility of system status
ii. Match between system and the real world
iii. User control and freedom
iv. Consistency and standards
v. Error prevention
vi. Recognition rather than recall
vii. Flexibility and efficiency of use
viii. Aesthetic and minimalist design
ix. Help users recognize, diagnose, and recover from errors
x. Help and documentation
(For more detailed information about how to conduct a heuristic evaluation, please go to useit.com at http://www.useit.com/papers/heuristic/.)
c. Each violation of a heuristic that is identified should be briefly described, along with notes about the ways in which the heuristic is violated.
d. After the evaluation, each group will write its results on the board in the classroom to share them with the rest of the class. Compare the results and discuss similar and different findings among groups. Prioritize the problems and make suggestions for how the design's usability could be improved. (A minimal sketch of one way to log and prioritize violations follows below.)
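
Below is a minimal Python sketch of one way a group might log and prioritize its findings: each violation is tied to one of Nielsen's ten heuristics and given a severity rating on Nielsen's 0-4 scale so the most serious problems surface first. The data structure, function name, and example findings are assumptions for illustration.

```python
# Logging heuristic violations with severity ratings (0-4, per Nielsen).
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

findings: list[dict] = []


def log_violation(heuristic_index: int, location: str, note: str, severity: int) -> None:
    """Record one violation; severity runs 0 (not a problem) to 4 (catastrophe)."""
    assert 0 <= heuristic_index < len(NIELSEN_HEURISTICS)
    assert 0 <= severity <= 4
    findings.append({
        "heuristic": NIELSEN_HEURISTICS[heuristic_index],
        "location": location,
        "note": note,
        "severity": severity,
    })


# Hypothetical findings from evaluating a prototype digital library.
log_violation(0, "search results page", "No indicator while results load", 3)
log_violation(9, "advanced search", "No help link explaining query syntax", 2)

# Highest-severity problems first, ready for the group's prioritized list.
for f in sorted(findings, key=lambda f: f["severity"], reverse=True):
    print(f'[{f["severity"]}] {f["heuristic"]} @ {f["location"]}: {f["note"]}')
```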


Evaluation of learning outcomes


Glossary

No glossary terms needed.

Contributors

a. Developers:
i. Sanghee Oh