Assignment On CSC4212 Human Computer Interface (HCI) Abdul Murtala Mahmud UG17/COMS/2006
ASSIGNMENT ON CSC4212
HUMAN COMPUTER INTERFACE (HCI)
ABDUL MURTALA MAHMUD
UG17/COMS/2006
ABSTRACT
The introduction of smart virtual assistants (VAs) and corresponding smart devices
brought a new degree of freedom to our everyday lives. Voice-controlled and
Internet-connected devices allow intuitive device controlling and monitoring from
all around the globe and define a new era of human–machine interaction. Although
VAs are especially successful in home automation, they also show great potential as
artificial intelligence-driven laboratory assistants. Possible applications include
stepwise reading of standard operating procedures (SOPs) and recipes, recitation of
chemical substance or reaction parameters to a control, and readout of laboratory
devices and sensors. In this study, we present a retrofitting approach to make
standard laboratory instruments part of the Internet of Things (IoT). We established
a voice user interface (VUI) for controlling those devices and reading out specific
device data. A benchmark of the established infrastructure showed a high mean
accuracy (95% ± 3.62) of speech command recognition and reveals high potential
for future applications of a VUI within the laboratory. Our approach shows the
general applicability of commercially available VAs as laboratory assistants and
might be of special interest to researchers with physical impairments or low vision.
The developed solution enables a hands-free device control, which is a crucial
advantage within the daily laboratory routine.
Not long ago, the concept of a virtual assistant (VA) was something one would have
surely imagined as belonging to the territory of science fiction, where artificial
intelligence (AI)-driven entities support comic superheroes saving the day and help
TV spaceship crews explore places "no man has gone before." In recent years, natural
language processing (NLP)—the
foundation for voice-based VAs—has become more and more sophisticated and
reliable.1 Lately, many big information technology (IT) companies have started to
work on VAs, which use speech recognition technology, AI, and speech synthesis
in order to understand and answer questions and execute specific tasks.2 Today
smart personal assistants are commercially available and have found their way into
our homes and pockets. When Apple released one of the first widely available VAs,
Siri, to the public in late 2011, the masses were thrilled, although privacy and
security concerns were raised immediately. Soon other smart assistants were
released: Google’s Google Assistant, Microsoft’s Cortana, Amazon’s Alexa, and
Samsung’s Bixby, just to name the most popular.
Also, a skill for assistance in chemistry labs was developed recently. “Helix” is an
Amazon Echo-based skill that can look up chemical reaction information and
chemical-associated data.10 While the introduction of VAs into natural sciences just
recently gained the interest of the scientific community, which is most probably due
to the recent emergence of the technology and software, other innovative automation
concepts and smart applications found their way into laboratories some time ago.
Smart devices are already used to support scientists within their experiments.
Current research topics include the use of smartphones, tablets, and smartglasses as
inexpensive substitutions for some laboratory devices, for documentation tasks, and
for data accession and analysis.11–13 Novel automation efforts comprise the fully
automated design, execution, and evaluation of experiments, which could lead to a
tremendous efficiency enhancement of lab work in the near future.14,15 The digital
transformation of science has just started, but it seems to hold great potential to ease
the work in all parts of scientific research.
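The device-control pattern described above—recognized speech commands mapped to instrument actions and readouts—can be sketched in a few lines. This is a minimal, hypothetical illustration: the device names, command phrases, and `handle_command` function are illustrative assumptions, not part of the cited VUI system.

```python
# Hypothetical sketch: dispatching recognized voice commands to retrofitted
# lab devices. Device names, commands, and state fields are illustrative only.

DEVICES = {
    "magnetic stirrer": {"rpm": 300, "power": "off"},
    "heating block": {"temperature_c": 37.0, "power": "on"},
}

def handle_command(transcript: str) -> str:
    """Map a recognized utterance to a device action or a readout."""
    text = transcript.lower()
    for name, state in DEVICES.items():
        if name not in text:
            continue
        if "turn on" in text:
            state["power"] = "on"
            return f"{name} switched on"
        if "turn off" in text:
            state["power"] = "off"
            return f"{name} switched off"
        if "status" in text or "read" in text:
            readout = ", ".join(f"{k}={v}" for k, v in state.items())
            return f"{name}: {readout}"
    return "Sorry, I did not recognize that device or command."
```

In a real deployment, the transcript would come from the assistant's speech-recognition service and the state changes would be forwarded to the instruments over the IoT link; the keyword matching here simply stands in for that pipeline.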
INTRODUCTION
If current trends continue, it is likely that the web browser will become the only
widely used user interface. Web applications will become the predominant software.
Should this happen, user interface design, implementation and evaluation skills can
become more focussed and effective. Some of the benefits current browser user
interfaces provide are discussed in the context of web application tools produced by
the author and supported by examples. The software architecture of the Web brings
special HCI demands, and the user design experts of the future will require training
in this architecture. This evolution is scrutinised in terms of the new web services
that will become available. Recent trends in this direction are presented, and future
trends explored, with supporting evidence taken from a range of applications. The
influence of the Web, with now a long history of user experience,
Intelligent systems that learn interactively from their end-users are quickly becoming
widespread. Until recently, this progress has been fueled mostly by advances in
machine learning; however, more and more researchers are realizing the importance
of studying users of these systems. In this article we promote this approach and
demonstrate how it can result in better user experiences and more effective learning
systems. We present a number of case studies that characterize the impact of
interactivity, demonstrate ways in which some existing systems fail to account for
the user, and explore new ways for learning systems to interact with their users. We
argue that the design process for interactive machine learning systems should
involve users at all stages: explorations that reveal human interaction patterns and
inspire novel interaction methods, as well as refinement stages to tune details of the
interface and choose among alternatives. After giving a glimpse of the progress that
has been made so far, we discuss the challenges that we face in moving the field
forward.
While the presented case studies paint a broad picture of recent research in user
interaction with interactive machine learning, this article does not exhaustively
survey the literature in this space. Rather, these case studies are selected to highlight
the role and importance of the user within the interactive machine-learning process,
serving as an introduction to the topic and a vehicle for considering this body of
research altogether. We conclude this article with a discussion of the current state of
the field, identifying opportunities and open challenges for future interactive
machine-learning research.
BODY
The increased interaction between users and learning systems in interactive machine
learning necessitates an increased understanding of how end-user involvement
affects the learning process. In this section, we present case studies illustrating how
such an understanding can ultimately lead to better-informed system designs. First,
we present case studies demonstrating how people may violate assumptions made
by traditional machine learners, resulting in unexpected outcomes and user
frustration. Next, we present case studies indicating that people may want to interact
with machine-learning systems in richer ways than anticipated, suggesting new input
and output capabilities. Finally, we present case studies that experiment with
increasing transparency about how machine-learning systems work, finding that
such transparency can improve the user experience in some scenarios, as well as the
accuracy of resulting models.
Users Are People, Not Oracles
Active learning is a
machine-learning paradigm in which the learner chooses the examples from which
it will learn (Settles 2010). These examples are selected from a pool of unlabeled
samples based on some selection criterion (for example, samples for which the
learner has maximum uncertainty). For each selected sample the learner queries an
oracle to request a label. This method has had success in accelerating learning (that
is, requiring fewer labels to reach a target accuracy) in applications like text
classification and object recognition, where oracles are often paid to provide labels
over a long period of time. However, Cakmak and colleagues (2010) discovered that
when applied to interactive settings, such as a person teaching a task to a robot by
example, active learning can cause several problems. Cakmak’s study (figure 3)
found that the constant stream of questions from the robot during the interaction was
perceived as imbalanced and annoying. The stream of questions also led to a decline
in the user’s mental model of how the robot learned, causing some participants to
“turn their brain off” or “lose track of what they were teaching” (according to their
self-report) (Cakmak, Chao, and Thomaz 2010). Guillory and Bilmes (2011)
reported similar findings for an active movie recommendation system they
developed for Netflix. These studies reveal that users are not necessarily willing to
be simple oracles (that is, repeatedly telling the computer whether it is right or
wrong), breaking a fundamental assumption of active learning. Instead, these
systems need to account for human factors such as interruptibility or frustration
when employing methods like active learning.
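The uncertainty-sampling loop described above—the learner repeatedly picks the pool sample it is least sure about and asks the oracle for its label—can be sketched minimally. This is a toy illustration under stated assumptions: a 1-D threshold classifier stands in for a real learner, and the oracle is simply a labeling function supplied by the caller.

```python
# Minimal uncertainty-sampling active-learning loop (illustrative sketch).
# A toy 1-D threshold classifier stands in for a real learner; the "oracle"
# is any callable returning the true label of a queried sample.

def train_threshold(labeled):
    """Fit a decision boundary midway between the two observed classes."""
    zeros = [x for x, y in labeled if y == 0]
    ones = [x for x, y in labeled if y == 1]
    return (max(zeros) + min(ones)) / 2.0

def uncertainty(x, threshold):
    """Samples closer to the boundary are more uncertain."""
    return -abs(x - threshold)

def active_learn(pool, oracle, seed, n_queries):
    labeled = list(seed)   # (sample, label) pairs
    pool = list(pool)      # unlabeled samples
    for _ in range(n_queries):
        t = train_threshold(labeled)
        # Query the pool sample the learner is most uncertain about.
        x = max(pool, key=lambda s: uncertainty(s, t))
        pool.remove(x)
        labeled.append((x, oracle(x)))
    return train_threshold(labeled)

# True concept: label is 1 iff x >= 5.
oracle = lambda x: int(x >= 5)
seed = [(0.0, 0), (10.0, 1)]
pool = [1.0, 3.0, 4.0, 4.5, 5.5, 6.0, 8.0]
threshold = active_learn(pool, oracle, seed, n_queries=4)
```

After four queries the learned threshold sits near the true boundary at 5 without labeling the whole pool—exactly the label efficiency active learning promises. The human-factors findings above arise because, in interactive settings, each `oracle(x)` call is a question posed to a person.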
A user interface software tool helps developers design and implement the user
interface. Research on past tools has had enormous impact on today's developers—
virtually all applications today are built using some form of user interface tool. In
this article, we consider cases of both success and failure in past user interface tools.
From these cases we extract a set of themes which can serve as lessons for future
work. Using these themes, past tools can be characterized by what aspects of the user
interface they addressed, their threshold and ceiling, what path of least resistance
they offer, how predictable they are to use, and whether they addressed a target that
became irrelevant. We believe the lessons of these past themes are particularly
important now, because increasingly rapid technological changes are likely to
significantly change user interfaces. We are at the dawn of an era where user
interfaces are about to break out of the “desktop” box where they have been stuck
for the past 15 years. The next millennium will open with an increasing diversity of
user interfaces on an increasing diversity of computerized devices. These devices
include hand-held personal digital assistants (PDAs), cell phones, pagers,
computerized pens, computerized notepads, and various kinds of desk- and
wall-sized computers, as well as devices in everyday objects (such as those mounted
on refrigerators, or even embedded in truck tires). The increased connectivity of
computers, initially evidenced by the World Wide Web, but spreading also with
technologies such as personal-area networks, will also have a profound effect on the
user interface to computers. Another important force will be recognition-based user
interfaces, especially speech, and camera-based vision systems. Other changes we
see are an increasing need for 3D and end-user customization, programming, and
scripting. All of these changes will require significant support from the underlying
user interface software tools.
CONCLUSION
REFERENCES
Austerjost, J., Porr, M., Riedel, N., Geier, D., Becker, T., Scheper, T., Marquard,
D., Lindner, P. and Beutel, S., 2018. Introducing a virtual assistant to the lab: A
voice user interface for the intuitive control of laboratory instruments. SLAS
Technology: Translating Life Sciences Innovation, 23(5), pp.476-482.
Dudley, J.J. and Kristensson, P.O., 2018. A review of user interface design for
interactive machine learning. ACM Transactions on Interactive Intelligent
Systems (TiiS), 8(2), pp.1-37.
Gillies, M., Fiebrink, R., Tanaka, A., Garcia, J., Bevilacqua, F., Heloir, A.,
Nunnari, F., Mackay, W., Amershi, S., Lee, B. and d'Alessandro, N., 2016, May.
Human-centred machine learning. In Proceedings of the 2016 CHI conference
extended abstracts on human factors in computing systems (pp. 3558-3565).
Rees, M.J., 2002, January. Evolving the browser towards a standard user
interface architecture. In ACM International Conference Proceeding Series (Vol.
20, pp. 1-7).