Gauntlets of Levitation, Living Desktops, and More: Video Highlights From the 2016 Human-Computer Interaction Conference

The 2016 Computer-Human Interaction Conference (CHI, which is pronounced “kai,” like the Greek letter) is taking place this week in San Jose, Calif. The conference is all about how interaction technology is advancing, and how it will shape the ways in which we experience our environment (and each other). Really, this is just a complicated way of saying that the conference provides a great excuse for researchers to explore new and crazy ways of using computers, and some of the stuff they’ve come up with will blow your mind. Think power tools that tell you what to do with them, or a couch that’s also a huge touch controller, or projection-augmented 3-D printing on your skin, or gloves that let you levitate objects: all of these are functional prototypes that researchers described at this year’s conference.
You can watch all 281 video previews here, or you can have a look at this baker’s dozen of videos that we’ve hand-selected for overall future-ness, technical bewonderment, transformational potential, and generalized weirditude.

GauntLev: A Wearable to Manipulate Free-floating Objects

Acoustic levitation can be used to trap small objects in ultrasonic fields, and by building ultrasonic emitter arrays into wearables, you can make yourself a “Gauntlet of Levitation” as well as a “Sonic Screwdriver:”
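
If you’re curious how an array of little speakers can hold a bead in midair, the core trick is timing: delay each emitter so that all of the ultrasonic waves arrive in phase at one focal point. Here’s a rough, hypothetical Python sketch of that calculation; the frequency and array layout are assumptions, and this is a toy illustration, not GauntLev’s code.

```python
# Hypothetical sketch (not GauntLev's code): choose a phase delay for each
# emitter of an ultrasonic phased array so that all waves arrive in phase at
# one focal point -- the basic building block of acoustic levitation traps.
import math

SPEED_OF_SOUND = 343.0   # m/s in air
FREQUENCY = 40_000.0     # Hz; a common ultrasonic transducer frequency (assumed)

def focus_phases(emitters, focal_point):
    """Return one phase offset (radians) per emitter position (x, y, z in meters)."""
    phases = []
    for emitter in emitters:
        distance = math.dist(emitter, focal_point)
        # Delay each emitter by its travel time to the focus so the
        # wavefronts add constructively there.
        phase = (-2.0 * math.pi * FREQUENCY * distance / SPEED_OF_SOUND) % (2.0 * math.pi)
        phases.append(phase)
    return phases

# Example: a 4 x 4 grid of emitters at 1 cm pitch, focusing 5 cm above the array.
grid = [(0.01 * i, 0.01 * j, 0.0) for i in range(4) for j in range(4)]
print(focus_phases(grid, (0.015, 0.015, 0.05)))
```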


Haptic Retargeting: Dynamic Repurposing of Passive Haptics for Enhanced Virtual Reality

The best way to achieve a satisfying experience manipulating an object in virtual reality is to be manipulating a similar “proxy” object in reality at the same time. If you want to manipulate many virtual objects, you’d need many proxy objects, unless you can use haptic retargeting to manipulate users into thinking they’re interacting with many objects when there’s actually just one:

Read the paper here.
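
The trick, roughly, is to quietly warp where your virtual hand is drawn: while your real hand travels to the one physical prop, your virtual hand appears to land on whichever virtual object you picked. Here’s a toy Python sketch of that kind of hand redirection, using our own simplified blending rule rather than the implementation from the paper.

```python
# Toy sketch of hand redirection for haptic retargeting: render the virtual
# hand progressively offset from the real hand, so the virtual hand reaches
# the selected virtual object while the real hand reaches the one physical
# prop. The blending rule here is a simplification, not the paper's method.
import math

def warped_hand(real_hand, reach_start, physical_prop, virtual_target):
    """Return where to draw the virtual hand for the current real hand position."""
    # Progress of the reach: 0 at the starting pose, 1 when touching the prop.
    total = math.dist(reach_start, physical_prop)
    progress = 1.0 if total == 0 else max(0.0, min(1.0, 1.0 - math.dist(real_hand, physical_prop) / total))

    # Full offset that maps "touching the prop" onto "touching the virtual target".
    offset = tuple(vt - pp for vt, pp in zip(virtual_target, physical_prop))

    # Apply a growing fraction of that offset as the hand gets closer to the prop.
    return tuple(rh + progress * o for rh, o in zip(real_hand, offset))

# Example: prop at the table center, selected virtual cube 20 cm to its right,
# hand halfway through the reach.
print(warped_hand(real_hand=(0.0, 0.0, 0.3),
                  reach_start=(0.0, 0.0, 0.6),
                  physical_prop=(0.0, 0.0, 0.0),
                  virtual_target=(0.2, 0.0, 0.0)))
```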

SATURNO: a Shadow-Pushing Lamp for Better Focusing and Reading

With a single light source, shadows cast by your body can get in the way of whatever you’re doing. The SATURNO lamp can track your motions, and dynamically adjust its light output to reduce or eliminate shadows. It can do some other cool stuff, too:

The extended abstract is available here.
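
To give a flavor of the “adjust the light to kill the shadow” idea, here’s a hypothetical Python sketch: it projects a tracked hand onto the desk from each of several light sources and dims any source whose shadow would land on the work area. The multi-source setup and the dimming rule are our assumptions for illustration, not how SATURNO itself works.

```python
# Hypothetical illustration of "adjust the light output to reduce shadows":
# with several light sources over a desk and a tracked hand, dim any source
# whose shadow of the hand would fall on the work area, and keep the rest at
# full brightness. The setup and the dimming rule are assumptions, not
# SATURNO's actual mechanism.

def shadow_on_desk(lamp, hand):
    """Project the hand point onto the desk plane (z = 0) along the lamp-to-hand ray."""
    lx, ly, lz = lamp
    hx, hy, hz = hand
    if lz <= hz:          # lamp is not above the hand: no shadow cast onto the desk
        return None
    t = lz / (lz - hz)    # parameter where the ray from the lamp through the hand hits z = 0
    return (lx + t * (hx - lx), ly + t * (hy - ly))

def brightness_levels(lamps, hand, work_area):
    """Return a 0..1 brightness per lamp; lamps that shadow the work area get dimmed."""
    (x_min, y_min), (x_max, y_max) = work_area
    levels = []
    for lamp in lamps:
        shadow = shadow_on_desk(lamp, hand)
        shadowed = (shadow is not None
                    and x_min <= shadow[0] <= x_max
                    and y_min <= shadow[1] <= y_max)
        levels.append(0.2 if shadowed else 1.0)
    return levels

# Example: two lamps flanking an A4-sized reading area, hand hovering over the page.
lamps = [(-0.3, 0.0, 0.5), (0.3, 0.0, 0.5)]
page = ((-0.105, -0.148), (0.105, 0.148))
print(brightness_levels(lamps, hand=(0.05, 0.0, 0.1), work_area=page))   # -> [1.0, 0.2]
```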

TactileVR: Integrating Physical Toys into Learn and Play Virtual Reality Experiences

Kids are already absorbed in virtual worlds, and TactileVR takes the experience to its logical conclusion by allowing physical objects to co-exist with virtually augmented versions of themselves. It looks like a lot of fun to play with, even if it’s slightly creepy to watch:

The extended abstract is available here.

Drill Sergeant: Supporting Physical Construction Projects through an Ecosystem of Augmented Tools

I love these prototypes of power tools that use real-time interactive feedback to walk you through building things. Instead of reading the instructions, measuring twice, and cutting once, now you can just skip straight to the cutting step because your tools are doing the rest for you:

Read the paper here.

bioSync: Synchronize Kinesthetic Experience among People

Using paired sensors and electrodes, bioSync allows you to “jack into” another person, such that when sensors detect them moving their muscles, electrodes force you to move yours, or vice versa. It’s intended to replicate the experience of having a neuromuscular disease, and it certainly would never be used for anything else:

Read the paper here.
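
The signal path is simple to picture: sense muscle activity on one person, and if it rises above a noise threshold, drive a proportional stimulation on the other. Here’s a minimal conceptual loop in Python with stand-in device functions; read_emg and apply_ems are made up for illustration, and this is not bioSync’s code.

```python
# Minimal conceptual sketch of the sense-and-stimulate relay described above.
# read_emg() and apply_ems() are made-up stand-ins for the sensor and electrode
# hardware; this is an illustration of the signal path, not bioSync's code.
import random
import time

ACTIVATION_THRESHOLD = 0.6   # ignore low-level noise from the sensing side
GAIN = 0.8                   # scale sensed activity into stimulation intensity

def read_emg():
    """Stand-in for an EMG reading of one person's muscle activity (0..1)."""
    return random.uniform(0.0, 1.0)

def apply_ems(intensity):
    """Stand-in for driving the paired person's electrodes (clamped to 0..1)."""
    print(f"stimulate paired user at {min(max(intensity, 0.0), 1.0):.2f}")

# A few iterations of the sense -> threshold -> stimulate loop.
for _ in range(10):
    activity = read_emg()
    if activity > ACTIVATION_THRESHOLD:
        apply_ems(GAIN * activity)
    time.sleep(0.05)         # placeholder update rate (~20 Hz)
```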

Embodied Interactions for Novel Immersive Presentational Experiences

PowerPoint presentation a little dry? By putting yourself inside of your own presentation as an avatar, you can interact with the content of your slides in real time with gestures as your audience giggles at you:

Read the paper here.

FlexTiles: A Flexible, Stretchable, Pressure-Sensitive Input Sensor

Having a couch that is also a soft and comfortable touch input device based on a scalable textile sensor seems like a fantastic idea:
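
If you’re wondering how a patch of fabric turns into touch input, one simple approach is to read the grid of pressure values and take the pressure-weighted centroid as the touch point. Here’s a toy Python sketch of that idea; it’s our own illustration, not the sensing pipeline from the FlexTiles paper.

```python
# Toy illustration of turning a grid of fabric pressure readings into a touch
# point by taking the pressure-weighted centroid. This is our own sketch of
# the idea, not the sensing pipeline from the FlexTiles paper.

def touch_centroid(pressure_grid, threshold=0.1):
    """Return (row, col) of the pressure-weighted centroid, or None if nothing is pressed."""
    total = row_acc = col_acc = 0.0
    for r, row in enumerate(pressure_grid):
        for c, pressure in enumerate(row):
            if pressure >= threshold:
                total += pressure
                row_acc += pressure * r
                col_acc += pressure * c
    if total == 0.0:
        return None
    return (row_acc / total, col_acc / total)

# Example: a press near the lower-right corner of a 4 x 4 patch of sensors.
frame = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.3, 0.5],
    [0.0, 0.0, 0.4, 0.8],
]
print(touch_centroid(frame))   # -> (2.6, 2.65)
```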


HotFlex: Post-print Customization of 3-D Prints Using Embedded State Change

3-D printing lets you physicalize designs very quickly, but if you don’t get things just right, it still takes iterations to end up with something that you’re happy with. By embedding heating elements into 3-D printed structures, the user can soften, remold, and reharden 3-D printed objects whenever they like.
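
The workflow boils down to a soften-reshape-harden cycle: heat the embedded element until the zone passes its softening temperature, hold it there while you remold the part, then let it cool and set. Here’s a toy Python simulation of that cycle; the temperatures and the fake heater model are invented for illustration, not taken from the HotFlex prototype.

```python
# Toy simulation of the soften -> reshape -> harden cycle. The temperatures
# and the fake heater model are invented for illustration; the real embedded
# heaters and materials are what the HotFlex prototype provides.

SOFTENING_TEMP_C = 60.0   # assumed temperature at which the printed zone gets pliable
SET_TEMP_C = 35.0         # assumed temperature at which the new shape is fixed

class FakeHeaterZone:
    """Stand-in for one embedded heating element plus its temperature sensor."""
    def __init__(self):
        self.temp_c = 22.0
        self.heating = False

    def set_heater(self, on):
        self.heating = on

    def read_temperature(self):
        # Crude model: warm up while the heater is on, drift back toward room temp when off.
        self.temp_c = max(22.0, self.temp_c + (4.0 if self.heating else -3.0))
        return self.temp_c

def soften_and_harden(zone, hold_steps=5):
    zone.set_heater(True)                         # 1. soften the zone
    while zone.read_temperature() < SOFTENING_TEMP_C:
        pass
    for _ in range(hold_steps):                   # 2. keep it soft while the user remolds it
        zone.read_temperature()
    zone.set_heater(False)                        # 3. cool down and lock in the new shape
    while zone.read_temperature() > SET_TEMP_C:
        pass
    print("zone hardened in its new shape")

soften_and_harden(FakeHeaterZone())
```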


One-Dimensional Handwriting: Inputting Letters and Words on Smart Glasses

What’s the minimum amount of input that it takes to write? One dimension (swiping back and forth) is apparently enough, and for input on something like Google Glass, that’s all you get:

Read the paper here.
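
To see why one dimension can be enough, imagine each letter encoded as a short sequence of left and right swipes, with a pause between letters. Here’s a toy Python decoder for that idea; the code table below is invented (Morse-style) for illustration and is not the encoding from the paper.

```python
# Toy decoder for the "one dimension is enough" idea: each letter is entered
# as a short sequence of left (L) and right (R) swipes, with a pause between
# letters. The code table below is invented (Morse-style) for illustration;
# it is not the encoding from the paper.

SWIPE_CODES = {
    "RL": "a",
    "LRRR": "b",
    "LRLR": "c",
    "LRR": "d",
    "R": "e",
    # ...a full table would cover the rest of the alphabet
}

def decode(swipe_groups):
    """Decode groups of swipes (strings of 'L'/'R', split at pauses) into text."""
    return "".join(SWIPE_CODES.get(group, "?") for group in swipe_groups)

# Example: three letters, each entered as one burst of swipes.
print(decode(["RL", "LRR", "R"]))   # -> "ade"
```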

Haptic Edge Display for Mobile Tactile Interaction

The most common way to hold your phone is by the edges, which presents an interaction opportunity through the use of an array of piezo actuators:

Read more from Sungjune Jang at MIT.
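
As a concrete (and entirely hypothetical) example of what you might do with a row of edge pins, here’s a Python sketch that renders your scroll position as raised actuators along the edge. The pin count and the 0-to-1 drive model are our assumptions, not the prototype’s API.

```python
# Entirely hypothetical sketch: render your scroll position as a raised region
# of pins along the phone's edge. The pin count and the 0..1 "height" drive
# model are assumptions, not the prototype's API.

NUM_PINS = 16   # assumed number of actuators along one edge

def scrollbar_pin_heights(viewport_top, viewport_height, document_height):
    """Return a 0..1 target height per pin; pins over the visible region are raised."""
    start = viewport_top / document_height
    end = (viewport_top + viewport_height) / document_height
    heights = []
    for pin in range(NUM_PINS):
        pin_pos = (pin + 0.5) / NUM_PINS   # pin center along the edge, 0 = top, 1 = bottom
        heights.append(1.0 if start <= pin_pos <= end else 0.0)
    return heights

# Example: viewing the second quarter of a long page.
print(scrollbar_pin_heights(viewport_top=1000, viewport_height=1000, document_height=4000))
```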

ExoSkin: On-Body Fabrication

In case the only thing stopping you from 3-D printing on your own body was a lack of confidence in your ability to freehand draw:

Read more from Autodesk.

LivingDesktop: Augmenting Desktop Workstation with Actuated Devices

What if your keyboard, mouse, and monitor could autonomously move themselves around your desktop? How cool would that be? Exactly this cool:
