CSC 4310 Human Computer Interface Note
CSC 4310
Human Computer Interface
2023
Course Contents
Foundations of HCI, Principles of GUI, GUI toolkits;
Human-centred software evaluation and development;
GUI design and programming.
Foundations of HCI
Definition
The following definition is given by the Association for Computing Machinery (ACM)
"Human-computer interaction is a discipline concerned with the design, evaluation and
implementation of interactive computing systems for human use and with the study of major
phenomena surrounding them."
Human-computer interaction is therefore the study of the interaction between people (otherwise called users) and computers. This interaction occurs at the interface of software and hardware, ranging from ordinary computer peripherals to large-scale mechanical systems in aircraft and power plants. HCI can also be regarded as the intersection of computer science, the behavioral sciences, design and several other fields of study.
Overview
Since human-computer interaction studies a human and a machine in conjunction, it draws from
supporting knowledge on both the machine and the human side. On the machine side, techniques
in computer graphics, operating systems, programming languages, and development
environments are relevant. On the human side, communication theory, graphic and industrial
design disciplines, linguistics, social sciences, cognitive psychology, and human performance are
relevant. Engineering and design methods are also relevant.
The multidisciplinary nature of HCI enables people with different backgrounds to contribute to its success. HCI is also sometimes referred to as man-machine interaction (MMI) or computer-human interaction (CHI).
Research
Part of research in human-computer interaction involves exploring easier-to-learn or more efficient interaction techniques for common computing tasks. This includes inventing new techniques and comparing existing ones using the scientific method. Typical research activities include:
1. Designing graphical user interfaces and web interfaces,
2. Developing new design methodologies,
3. Experimenting with new hardware devices,
4. Prototyping new software systems,
5. Exploring new paradigms for interaction, and
6. Developing models and theories of interaction.
Interaction technique
An interaction technique or user interface technique is a combination of input and output
consisting of hardware and software elements that provides a way for computer users to
accomplish a simple task. For example, one can go back to the previously visited page on a Web
browser by either clicking a button, hitting a key, performing a mouse gesture or uttering a
speech command.
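The idea that several interaction techniques can accomplish the same simple task can be sketched as an event-to-command mapping. This is an illustrative sketch only, not any particular toolkit's API; the event descriptors and the bind/dispatch function names are invented for the example.

```python
# Sketch: several input events bound to one logical "go back" command.
# The event descriptors and function names are invented for illustration.

COMMANDS = {}  # maps an input event descriptor to a command name

def bind(event, command):
    """Register an input event as a trigger for a command."""
    COMMANDS[event] = command

def dispatch(event):
    """Return the command (if any) that an incoming event invokes."""
    return COMMANDS.get(event)

# The same simple task -- returning to the previous page -- is reachable
# through four different interaction techniques:
bind("click:back-button", "history.back")
bind("key:Alt+Left", "history.back")
bind("gesture:swipe-right", "history.back")
bind("speech:go back", "history.back")

print(dispatch("key:Alt+Left"))  # every binding invokes the same command
```

Each technique differs in hardware and software, but all translate to the same command, which is what makes them interchangeable styles for one task.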
Interaction styles
Interaction techniques that share the same metaphor or design principles can be seen as
belonging to the same interaction style. Examples are command line and direct manipulation user
interfaces.
More details are provided in subsequent chapters of this guide.
The initial paradigm was batch processing, which signified impersonal computing. The paradigm shift began with time-sharing systems, which signified interactive computing.
Principles of GUI
Survey of Human Computer Interaction Practices
Nature of Interactivity
Early 'remote' computer interaction was carried out through batch processing: punched-card stacks or large data files were prepared, followed by a long wait for the line-printer output. And if the output was not right, the wait continued indefinitely...
But now most computing is truly interactive, with rapid feedback: the user is in control most of the time, with the computer taking over much of the routine thinking. A typical computer system supports interaction through the screen or monitor (output) and the keyboard, mouse or track pad (input). These devices appear in variations across desktop, laptop and mainframe computers and Personal Digital Assistants (PDAs), and they dictate the styles of interaction that the system supports. If we use different devices, the interface will support a different style of interaction. To understand the nature of human-computer interaction, one therefore needs to understand computer systems.
Terms of interaction
Domain: This is the area of work under study, e.g. graphic design.
Goal: This is what you want to achieve, e.g. to create a solid red triangle.
Task: This concerns how you go about doing it, ultimately in terms of operations or actions, e.g. select the fill tool, then click over the triangle.
Donald Norman's model
These are in seven stages, as follows:
1. The user establishes the goal
2. The user formulates intention
3. The user specifies actions at interface
4. The user executes the action
5. The user perceives the system state
6. The user interprets the system state
7. The user evaluates the system state with respect to goal
Norman's model concentrates on the user's view of the interface: an execution and evaluation loop. The stages group as follows:
Goal: The user establishes the goal
Execution: The user formulates intention. The user specifies actions at interface and user
executes the action
Evaluation: The user perceives the system state. The user interprets the system state and the user
evaluates the system state with respect to goal.
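The grouping above can be sketched in code. This is only an illustrative model of Norman's seven stages; the stage names are taken directly from the list above, and the function name is invented.

```python
# Illustrative model of Norman's execution-evaluation cycle.
NORMAN_STAGES = [
    "establish goal",
    "form intention",
    "specify actions",
    "execute actions",
    "perceive system state",
    "interpret system state",
    "evaluate state against goal",
]

def interaction_cycle():
    """Split the seven stages into the goal, the execution phase
    (stages 2-4) and the evaluation phase (stages 5-7)."""
    goal = NORMAN_STAGES[0]
    execution = NORMAN_STAGES[1:4]
    evaluation = NORMAN_STAGES[4:]
    return goal, execution, evaluation

goal, execution, evaluation = interaction_cycle()
print(execution)   # the three execution stages
print(evaluation)  # the three evaluation stages that close the loop
```

The split makes visible the two "gulfs" the model is used to discuss: getting from intention to action (execution) and from system state back to the goal (evaluation).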
Interaction can harbour human errors, which may be slips or mistakes. With a slip, the user understands the system and the goal but formulates or executes the action incorrectly; with a mistake, the user may not even have the right goal! Slips are fixed by better interface design, while mistakes are avoided by helping the user understand the system better.
To analyse such errors, the Abowd and Beale framework is adopted. It is an extension of Norman's model and has four parts, namely:
i. the user,
ii. the input,
iii. the system, and
iv. the output.
Each part has its own unique language, so interaction is a translation between languages. If there are problems in the interaction, there are problems in the translation.
Ergonomics
This considers both the physical aspects of interfaces and the industrial interfaces. Ergonomics is
the study of the physical characteristics of interaction. It is known as human factors. Ergonomics
is good at defining standards and guidelines for constraining the way we design certain aspects
of systems Examples of Ergonomics include: Arrangement of controls and displays such as the
controls grouped according to function, frequency and sequence of use. Surrounding
environment such as the seating arrangements adaptable to cope with all sizes of user, health
issues such as the physical position, environmental conditions (temperature, humidity), lighting,
and noise. Use of colour such as the use of red for warning, green for okay, and awareness of
colour-blindness etc. The user interacts with real world through interface issues, feedback and
delays.
Common Interaction styles
Interaction can be seen as a dialogue between the computer and the user, and some applications have very distinct styles of interaction. Both views are expressed in the following common forms of interface:
• Command line interface
• Menus
• Natural language
• Question and answer, and query dialogue
• Form-fills and spreadsheets
• WIMP
• Point and click
• Three-dimensional interfaces
Menus
A menu is a set of options displayed on the screen. Because the options are visible, menus rely on recognition rather than recall, which makes them easier to use; the option names should therefore be meaningful. Selection is done through numbers, letters, arrow keys, the mouse, or a combination of these, e.g. mouse plus accelerators. The options are often hierarchically grouped, and sensible grouping is needed.
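Hierarchical grouping can be sketched as a nested structure in which selection walks down visible, meaningful names (recognition) rather than memorised commands (recall). The menu contents and command identifiers below are invented for illustration.

```python
# Sketch: a hierarchically grouped menu as a nested dictionary. The
# option names and command identifiers are invented for illustration.

MENU = {
    "File": {"New": "file.new", "Open": "file.open", "Save": "file.save"},
    "Edit": {"Cut": "edit.cut", "Copy": "edit.copy", "Paste": "edit.paste"},
}

def select(path):
    """Walk the hierarchy along a path of visible option names; the
    user recognises each name rather than recalling a command."""
    node = MENU
    for name in path:
        node = node[name]  # a KeyError means the option is not offered
    return node

print(select(["Edit", "Copy"]))  # -> edit.copy
```

Sensible grouping here means each submenu holds related actions, so the user can predict where an option lives without memorising the whole tree.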
Natural language
This is the language familiar to the user. It may take the form of speech recognition or typed natural language. The problems with this kind of interaction are that the language may be vague, ambiguous and hard to recognise. The design solution is to restrict the interface to understanding a subset of the language and picking out key words.
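The "pick out key words" strategy can be sketched as a keyword spotter: instead of parsing free natural language, the interface scans the utterance for a small known vocabulary. The keyword-to-command table below is invented for illustration.

```python
# Sketch of keyword spotting for a natural language interface.
# The vocabulary and command names are invented for illustration.

KEYWORDS = {"open": "cmd.open", "close": "cmd.close", "save": "cmd.save"}

def interpret(utterance):
    """Return the commands whose keywords appear in the utterance,
    ignoring all the words the system does not know."""
    words = utterance.lower().split()
    return [KEYWORDS[w] for w in words if w in KEYWORDS]

print(interpret("Please save this file and close the window"))
# -> ['cmd.save', 'cmd.close']
```

The sketch also shows the weakness of the approach: everything outside the vocabulary (including negations like "don't save") is silently ignored, which is why vagueness and ambiguity remain hard problems.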
Query interfaces
These comprise question-and-answer interfaces in which the user is led through the interaction via a series of questions. Though restricted in functionality, this kind of interface is suitable for novice users and is often used in information systems.
Query languages (e.g. SQL): These are used to retrieve information from a database. They require understanding of the database structure and the language syntax, and hence some expertise.
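A minimal example of a query language at work, using Python's built-in sqlite3 module; the table and its contents are invented for illustration. Note that the user must know both the schema (table and column names) and the SQL syntax.

```python
# Minimal SQL example using Python's built-in sqlite3 module.
# The table name, columns and rows are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE students (name TEXT, course TEXT)")
conn.executemany("INSERT INTO students VALUES (?, ?)",
                 [("Ada", "CSC 4310"), ("Alan", "CSC 2201")])

# Retrieval requires knowing the schema and the SQL syntax:
rows = conn.execute(
    "SELECT name FROM students WHERE course = ?", ("CSC 4310",)
).fetchall()
print(rows)  # -> [('Ada',)]
```

Contrast this with a question-and-answer interface: the query language is far more powerful, but the burden of structure and syntax falls on the user, which is why expertise is required.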
WIMP Interface
This interface comprises Windows, Icons, Menus, and Pointers or Windows, Icons, Mice, and
Pull-down menus! The interface is the default style for the majority of interactive computer systems, especially PCs and desktop machines.
Elements of the WIMP interface: The elements include windows, icons, menus and pointers; in some cases they also include buttons, toolbars, palettes and dialog boxes. All WIMP systems share the same elements (windows, icons, menus, pointers, buttons, etc.), but different window systems behave differently: compare, for example, Macintosh Operating System (MacOS) menus with Windows menus. The combination of the appearance and the behaviour is the 'look and feel'.
Windows: Windows are areas of the screen that behave as if they were independent. They can
contain text or graphics and can be moved or resized. They can overlap and obscure each other,
or can be laid out next to one another (tiled).
Icons: Icons are small pictures or images that represent objects in the interface, often windows or actions. A window can be 'iconised', that is, closed down to its icon; these small representations allow many windows to remain accessible. Icons can be many and various, ranging from highly stylised images to realistic representations.
Menus: These are choices of operations or services offered on the screen. The required option is selected with the pointer. However, menus take a lot of screen space; this problem is partly solved by pop-up menus that appear only when needed.
Kinds of menus: The menu bar normally sits at the top of the screen, and the menu drags down from it:
1. Pull-down menu - mouse hold and drag down menu.
2. Drop-down menu - mouse click reveals menu.
3. Fall-down menu - mouse just moves over bar!
Cascading menus: These have a hierarchical menu structure in which a menu selection opens a new menu, and so on ad infinitum.
Keyboard accelerators: These are key combinations with the same effect as a menu item. They operate in two modes:
1. active when the menu is open - usually the first letter, and
2. active when the menu is closed - usually Ctrl + letter.
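The two accelerator modes can be sketched as follows; the menu items, command names and key descriptors are invented for illustration.

```python
# Sketch of the two keyboard-accelerator modes. Menu items, commands
# and key descriptors are invented for illustration.

MENU_ITEMS = {"Open": "file.open", "Print": "file.print", "Save": "file.save"}

def accelerate(key, menu_open):
    """Resolve a keypress to a menu command, depending on the mode."""
    if menu_open:
        # Mode 1: the first letter of a visible item selects it.
        for name, cmd in MENU_ITEMS.items():
            if name.lower().startswith(key.lower()):
                return cmd
    elif key.startswith("Ctrl+"):
        # Mode 2: Ctrl + letter works even with the menu closed.
        letter = key.split("+")[1]
        for name, cmd in MENU_ITEMS.items():
            if name.lower().startswith(letter.lower()):
                return cmd
    return None

print(accelerate("s", menu_open=True))        # -> file.save
print(accelerate("Ctrl+P", menu_open=False))  # -> file.print
```

The sketch makes the design trade-off visible: first-letter selection is easy to learn but only works while the menu is showing, whereas Ctrl-combinations must be memorised but save the trip through the menu entirely.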
Menus design issues: In order to design an effective menu, the following issues should be
considered:
• which kind to use
• what to include in menus at all
• words to use (action or description)
• how to group items
• choice of keyboard accelerators
Palettes and tear-off menus: Palettes are little windows of actions, shown or hidden via a menu option; an example is the available shapes in a drawing package. In tear-off and pin-up menus, the menu 'tears off' to become a palette.
Pointers: Pointers are important WIMP-style components used to point at and select items. They are activated by the mouse, track pad, joystick, trackball, cursor keys or keyboard shortcuts, and come in a wide variety of graphical images.
Point and click interfaces: Point and click interfaces are used in multimedia, web browsers, and
hypertext. You just click something such as icons, text links or location on map. It requires
minimal typing.
Scrollbars: Scrollbars allow the user to move the contents of the window up and down or from
side to side.
Buttons: A button is an individual and isolated region within a display that can be selected to invoke an action. Special kinds exist: radio buttons, for a set of mutually exclusive choices, and check boxes, for a set of non-exclusive choices.
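The two selection models can be sketched as small classes: a radio group keeps exactly one active choice, while a check-box group keeps an arbitrary subset. The class and option names are invented, not taken from a real toolkit.

```python
# Sketch of the two button selection models. Class and option names
# are invented for illustration, not from a specific toolkit.

class RadioGroup:
    """Mutually exclusive choices: exactly one option is selected."""
    def __init__(self, options):
        self.options = options
        self.selected = options[0]

    def select(self, option):
        self.selected = option  # choosing one deselects the rest

class CheckBoxGroup:
    """Non-exclusive choices: any subset may be selected."""
    def __init__(self, options):
        self.options = options
        self.selected = set()

    def toggle(self, option):
        self.selected ^= {option}  # clicking again deselects

size = RadioGroup(["small", "medium", "large"])
size.select("large")
print(size.selected)  # -> large

styles = CheckBoxGroup(["bold", "italic", "underline"])
styles.toggle("bold")
styles.toggle("italic")
print(sorted(styles.selected))  # -> ['bold', 'italic']
```

The data types tell the story: a radio group's state is a single value, a check-box group's state is a set.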
Toolbars: These are long lines of icons giving fast access to common actions. They are often customizable: you can choose which toolbars to see and what options appear on them.
Dialogue boxes: These are information windows that pop up to inform of an important event or
requested information, for example when saving a file, a dialogue box is displayed to allow the
user to specify the filename and location. Once the file is saved, the box disappears.
The interactivity of dialogue boxes: Because a dialogue box captures the user's attention while it is displayed, its appearance and behaviour contribute strongly to the look and feel.
Other types of interaction styles are speech-driven interfaces: The development of this kind of interface is not yet perfect and accurate, though it is rapidly improving. An example is a speech-driven dialogue for an airline reservation: the system keeps the vocabulary restricted so that 'yes' and 'no' can be recognised reliably, and it reflects its understanding back to the user:
"You want a ticket from New York to Boston?"
Context Analysis: Context analysis includes understanding the technical, environmental and
social settings where the information systems will be used. It examines whether and how the
interaction between physical and social environment and the physiological and psychological
characteristics of the user would impact users interacting with the system. There are four aspects
in Context Analysis: physical context, technical context, organizational context, and social and
cultural context. Overall, context analysis can provide ideas for design factors such as metaphor
creation, selection and patterns of communications between users and the system.
Physical context: This considers where the tasks are carried out, what entities and resources are implicated in task operation, and what physical structures and entities are necessary to understand observed task actions. For example, an ATM can be used in a mall, outside a bank office, or in a night club. These environments provide different levels of lighting, crowdedness and noisiness; thus the legibility of the screen, the use of audible devices for input or output, or even the size of the working space (to prevent people nearby from seeing the screen) could be designed differently.
Technical context: This considers the technology infrastructure: platforms, hardware and system software, and wired or wireless network connections. For example, an E-commerce website may be designed to allow access only to people with certain browser versions, or to allow small-screen devices such as PDAs or mobile phones to access it.
Organizational context: Organizational context may play different roles in internal and
external situations. For an organizational information system to be used by the
organization's own employees, organizational context analysis answers questions such as:
o What is the larger system where this information system is embedded?
o What are the interactions with other entities in the organization?
o What are the organizational policies or practice that may affect individual's
attitude and behavior towards using the system?
For example, assuming that Lotus Notes is used by an organization as a communication
and collaboration tool, management may depend on using the tool to set up meetings by
checking employees' calendars on mutually available time slots. The effectiveness of
setting up meetings depends on whether employees use the tool, and how they use it. The
whether and how questions can be enforced by organizational policies.
Social and cultural context: What are the social or cultural factors that may affect user
attitudes and eventual use of the information system? In an E-Commerce website
example, the website can be accessed from all over the world. It is thus a design consideration whether the website allows access by people of any language and cultural background who can pay by credit card with foreign-currency exchange, or whether it is only accessible to people who speak certain languages and are from certain cultures.
Interactions are also affected by other social and organizational contexts, as follows:
• By other people: A desire to impress, competition among stakeholders, and fear of failure
from management
• Motivation from management as against fear, allegiance, ambition, self-satisfaction that
exist among employees
• Existing inadequate systems that may cause frustration and lack of motivation.
The organizational, social and cultural context in which humans interact with IT is largely the
result of the broad adoption of IT by organizations and society to support organizational
functions and goals and to enhance society's development. For example, organizational
efficiency may be expected due to redesign of workflows among critical business units that is
affected by the implemented IT; satisfaction and retention of customers/clients are anticipated
due to accurate and fast information gathering and presentations, to name a few. Some of the
organizational or societal impacts may not be tangible or directly attributed to HCI
considerations. This assertion is in line with the issues of determining IT values in organizations
and societies. While each of these HCI concerns may have its own importance in different
situations in relation to human motivation, it would be helpful for designers to see an overview
picture of the potential HCI concerns and goals. The purpose of this picture is not to force every
IT to be compliant with all the HCI concerns, but to provide an overall framework so that
designers can use it as a roadmap and to apply it according to different situations.
Multi-Sensory Systems
Here, more than one sensory channel is involved in the interaction, as in sounds, text, hypertext, animation, video, gestures and vision. Such systems are used in a range of applications, particularly for users with special needs and in virtual reality. The components of multi-sensory systems are speech, non-speech sounds and handwriting, together with their applications and principles.
Usable Senses: The five senses (sight, sound, touch, taste and smell) are used by us every day, and each is important on its own. Together, they provide a fuller interaction with the natural world. Ideally, computers could make use of all the available senses, but in practice this is impossible because computers rarely offer such rich interaction. We can use sight, sound, and sometimes touch, but we cannot yet use taste or smell.
Simple terminologies used to describe speech: The basic atomic units of speech are called phonemes, and there are about 40 of them. Phonemes sound slightly different depending on the context they are in; these larger, context-dependent units are the allophones.
Allophones: These are all the sounds in the language; there are between 120 and 130 of them, and they are formed into morphemes. Morphemes are the smallest units of language that have meaning.
Prosody: This is the alteration in tone and quality: variations in emphasis, stress, pauses and pitch that impart more meaning to sentences.
Co-articulation: This is the effect of context on the sound; it transforms phonemes into allophones. Syntax is the term used for the structure of sentences, while semantics is the term used for the meaning of sentences.
Problems in Speech Recognition: Different people speak differently: accent, intonation, stress, idiom, volume, etc. all differ. The syntax of semantically similar sentences may also vary, and background noise can interfere. People often say "ummm....." and "errr......", and words alone are not enough - semantics are needed as well. Understanding a sentence requires intelligence, because the context of the utterance often has to be known, as well as information about the subject and the speaker. For example, even if "Errr..... I, um, don't like this" is recognised, it is a fairly useless piece of information on its own.
Speech Synthesis
This is a generation of speech. It is useful because of its natural and familiar way of
receiving information. It is successful in certain constrained applications when the user
has few alternatives and is particularly motivated to overcome problems. However, it has
its own problems, similar to those of speech recognition, particularly in prosody. Additional problems can arise from intrusion, creating the need for headphones, particularly due to noise in the workplace. Its transient nature is also a problem, since speech is harder to review and browse than text. Examples occur in screen readers that read the textual display to the user, e.g. as utilised by visually impaired people, and in warning signals of spoken information sometimes presented to pilots whose visual and haptic skills are already fully occupied while flying.
Sounds
Non-Speech Sounds: These are bongs, bangs, squeaks, clicks etc. that are commonly used for warnings and alarms. Key clicks, for example, reduce typing mistakes. Non-speech sound is also useful in video games, which become uninteresting without it.
Families of earcons: Earcons are structured sequences of notes used to represent actions or objects in the interface. Similar types of earcons represent similar classes of action or similar objects; the family of "errors", for instance, would contain syntax and operating-system errors. Earcons are easily grouped and refined due to their compositional and hierarchical nature, but they are harder to associate with the interface task since there is no natural mapping.
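The compositional, hierarchical nature of earcons can be sketched by building family members from a shared motif. The note sequences below are invented purely for illustration.

```python
# Sketch: an earcon family built compositionally from a shared motif.
# The note sequences are invented for illustration.

ERROR_MOTIF = ["C4", "A3"]  # motif shared by every "error" earcon

EARCONS = {
    "error.syntax": ERROR_MOTIF + ["F3"],  # each member refines the motif
    "error.os":     ERROR_MOTIF + ["D3"],
}

def same_family(a, b):
    """Two earcons belong to one family if they share the class motif."""
    prefix = len(ERROR_MOTIF)
    return EARCONS[a][:prefix] == EARCONS[b][:prefix]

print(same_family("error.syntax", "error.os"))  # -> True
```

Because members are composed from a common prefix, a listener can learn to hear "some kind of error" before identifying which one, which is exactly the grouping-and-refinement property the notes describe.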
Touch recognition also includes information on shape, texture, resistance, temperature and comparative spatial factors. Examples of touch-recognition technologies include electronic Braille displays and force-feedback devices, e.g. the Phantom, which conveys resistance and texture.
Gesture technology: This can be found in various applications, such as gestural input (e.g. "put that there") and sign language. The technology comprises data gloves and position-sensing devices, such as those in the MIT Media Room. Gesture provides the benefit of a natural form of interaction by pointing, and it enhances communication between signing and non-signing users. The problems with gesture interaction are that it is user-dependent, owing to variation between users, and that issues of co-articulation also arise.