Ubiquitous Computing
In the early days of personal computing, the computer was a standalone desktop machine that a person had to interact with directly. But in the 1980s, a new vision for computing was emerging.
What if computing could be woven into the everyday environment so seamlessly that it became ubiquitous? The term ubiquitous comes from the Latin for “everywhere,” and today this idea of pervasive computing is no longer far-fetched.
We have come a long way since CMU grad students connected their Coke machine to the ARPANET in 1982, years before the terms “ubiquitous computing” and the “internet of things” (IoT) were coined in 1988 and 1999, respectively. Today we blend technology into everyday objects in many ways: adding sensors to make things context-aware, connecting a wide variety of devices to the internet, and using AI and machine learning to create personalized experiences. As technical advances bring more “smart” items into our lives (wearable health trackers, smart homes and smart cities, to name a few), expect continued work in the related HCI areas of user experience design, data privacy and ethics.
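For readers curious what “context-aware” looks like at the code level, here is a minimal, hypothetical sketch in Python: a simulated sensor package (ambient light plus occupancy) feeding a simple rule that decides whether a smart light should be on. All names, values, and thresholds are illustrative and not drawn from any particular CMU project.

```python
import random
import time
from dataclasses import dataclass


@dataclass
class SensorReading:
    """One snapshot from a (simulated) embedded sensor package."""
    ambient_lux: float    # ambient light level in lux
    room_occupied: bool   # e.g., from a PIR motion sensor


def read_sensors() -> SensorReading:
    # Hypothetical stand-in for real hardware drivers; values are randomized.
    return SensorReading(
        ambient_lux=random.uniform(0, 500),
        room_occupied=random.random() < 0.5,
    )


def decide_light_state(reading: SensorReading, lux_threshold: float = 150.0) -> bool:
    # The "context-aware" rule: light on only when the room is occupied and dark.
    return reading.room_occupied and reading.ambient_lux < lux_threshold


if __name__ == "__main__":
    for _ in range(5):
        reading = read_sensors()
        state = "ON" if decide_light_state(reading) else "OFF"
        print(f"{reading} -> light {state}")
        time.sleep(0.5)
```

Real deployments replace the simulated readings with device drivers and typically publish decisions over a network protocol, but the core idea is the same: sensed context drives behavior without explicit user input.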
Students who want to learn more about ubiquitous computing and HCI might be interested in the following CMU courses:
Auptimize
PROJECT: Spatial audio in Extended Reality (XR) provides users with better awareness of where virtual elements are placed, and efficiently guides them to events such as ...
PrISM: Procedural Interaction from Sensing Module
PROJECT: There are numerous complex tasks in everyday life, from cooking to medical self-care, that involve a series of atomic steps. Properly executing these step...
MineXR
PROJECT: Extended Reality (XR) interfaces offer engaging user experiences, but their effective design requires a nua...
Thermal Camera Senses Breathing To Improve Exercise Calorie Estimates
NEWS: Any fitness buff will tell you that the estimates of calories burned made by smartphones, smartwatches and other wearable devices vary wildly. That's because these devices lack the sensors...
HCII at UIST 2023
NEWS: The 2023 ACM Symposium on User Interface Software and Technology (UIST) was held in San Francisco, California from October 29 to November 1, 2023. ...
Smartwatches Could One Day Help Diagnose ADHD in Children
NEWS: Smartwatch sensor technology developed by Carnegie Mellon University researchers could help doctors make more accurate diagnoses of attention deficit hyperactivity disorder (ADHD)...
RealityReplay
PROJECT: Humans easily miss events in their surroundings due to limited short-term memory and field of view. This happens, for example, while watching an instructo...
Harrison Earns Lasting Impact Award for Turning Everyday Surfaces Into Touch Screens
NEWS: Good technology takes time to get right. When widespread use of touch screens skyrocketed around 2007, for example, the research had been underway for roughly 50 years...
User Preference for Navigation Instructions in Mixed Reality
PROJECT: Current solutions for providing navigation instructions to users who are walking are mostly limited to 2D maps on smartphones and voice-based instructions...
Towards Understanding Diminished Reality
PROJECT: Diminished reality (DR) refers to the concept of removing content from a user's visual environment. While its implementation is becoming feasible, it is s...
SemanticAdapt
PROJECT: We present an optimization-based approach that automatically adapts Mixed Reality (MR) interfaces to different physical environments. Current MR layouts, ...
ReCompFig
PROJECT: From creating input devices to rendering tangible information, the field of HCI is interested in using kinematic mechanisms to create human-computer inter...