Ubiquitous Computing

In the early days of personal computing, the computer was a standalone desktop machine that a person had to interact with directly. But in the 1980s, a new vision for computing was emerging.

What if computing could be integrated into the everyday environment so seamlessly that it becomes ubiquitous? The term ubiquitous comes from the Latin ubique, meaning “everywhere,” and today this idea of pervasive computing is no longer far-fetched.

We have come a long way since CMU grad students connected their Coke machine to ARPANET in 1982, years before the terms “ubiquitous computing” and the “Internet of Things” (IoT) were coined in 1988 and 1999, respectively. Now we blend technology into everyday objects in many ways: adding sensors to make things context-aware, connecting a wide variety of devices to the internet, and using AI and machine learning to create personalized experiences. And as technical advances bring more “smart” items into our lives (wearable health trackers, smart homes and smart cities, to name a few), expect continued work in the related HCI areas of user experience design, data privacy and ethics.

Students who want to learn more about ubiquitous computing and HCI might be interested in the following HCII projects and news stories: 

  • Auptimize

    PROJECT

    Spatial audio in Extended Reality (XR) provides users with better awareness of where virtual elements are placed, and efficiently guides them to events such as ...

  • PrISM: Procedural Interaction from Sensing Module

    PROJECT

    There are numerous complex tasks in everyday life, from cooking to medical self-care, that involve a series of atomic steps. Properly executing these step...

  • MineXR

    PROJECT

    Extended Reality (XR) interfaces offer engaging user experiences, but their effective design requires a nua...

  • Thermal Camera Senses Breathing To Improve Exercise Calorie Estimates

    NEWS

    New work by researchers at CMU and the Indian Institute of Technology (IIT) Gandhinagar shows that adding an inexpensive thermal camera to wearable devices could substantially improve how accurately they estimate calories burned. Any fitness buff will tell you that these estimates made by smartphones, smartwatches and other wearable devices vary wildly. That's because these devices lack the sensors...

  • HCII at UIST 2023

    NEWS

    The 2023 ACM Symposium on User Interface Software and Technology (UIST) was held in San Francisco, California from October 29 to November 1, 2023. ...

  • Smartwatches Could One Day Help Diagnose ADHD in Children

    NEWS

    Smartwatch sensor technology developed by Carnegie Mellon University researchers could help doctors make more accurate diagnoses of attention deficit hyperactivity disorder (ADHD)....

  • RealityReplay

    PROJECT

    Humans easily miss events in their surroundings due to limited short-term memory and field of view. This happens, for example, while watching an instructo...

  • Harrison Earns Lasting Impact Award for Turning Everyday Surfaces Into Touch Screens

    NEWS

    HCII faculty member Chris Harrison earned the UIST Lasting Impact Award for his 2011 work on OmniTouch, a wearable system that turns everyday surfaces into an interactive screen. Good technology takes time to get right. When widespread use of touch screens skyrocketed around 2007, for example, the research had been underway for roughly 50 years....

  • User Preference for Navigation Instructions in Mixed Reality

    PROJECT

    Current solutions for providing navigation instructions to users who are walking are mostly limited to 2D maps on smartphones and voice-based instructions...

  • Towards Understanding Diminished Reality

    PROJECT

    Diminished reality (DR) refers to the concept of removing content from a user's visual environment. While its implementation is becoming feasible, it is s...

  • SemanticAdapt

    PROJECT

    We present an optimization-based approach that automatically adapts Mixed Reality (MR) interfaces to different physical environments. Current MR layouts, ...

  • ReCompFig

    PROJECT

    From creating input devices to rendering tangible information, the field of HCI is interested in using kinematic mechanisms to create human-computer inter...