Micro Elastic Pouch Motors: Elastically Deformable and Miniaturized Soft Actuators Using Liquid-to-Gas Phase Change
(IEEE RA-L 2021, RoboSoft 2021)

We proposed Micro Elastic Pouch Motors, largely deformable, miniaturized soft actuators made of an elastic rubber pouch filled with a low-boiling-point liquid. When the liquid reaches 34 ˚C, it evaporates and the whole structure inflates. Thanks to the proposed fabrication method, we can make a miniaturized pouch approximately 5 mm in diameter with a thin rubber membrane; the pouch can inflate to more than 86 times its initial volume and generate a maximum force of approximately 20 N.

CoVR: Co-located Virtual Reality Experience Sharing for Facilitating Joint Attention via Projected User-Perspective View
(SIGGRAPH Asia 2020 ETech)

VR experience sharing between users wearing head-mounted displays (HMD users) and users not wearing HMDs (non-HMD users) is a promising approach to bridge the experience gap between them. We proposed “CoVR,” a co-located VR sharing system for HMD and non-HMD users via projected user-perspective images, using an HMD equipped with a head-mounted focus-free projector. We introduce a design methodology for displaying images that considers the viewpoint and perspective of the images presented to HMD and non-HMD users, augment the images with additional information, and demonstrate applications.

Fully Flexible Liquid-to-gas Phase Change Actuators with Integrated Liquid Metal Heaters
(JJAP 2021, MNC 2020)

We proposed a small liquid-to-gas phase change actuator with an integrated liquid metal heater. A low-boiling-point liquid and a liquid metal heater were encapsulated in a nylon-polyethylene bladder using wire molding. Owing to the materials used, the proposed soft actuator is highly flexible and durable against bending.

HaPouch: A Miniaturized, Soft, and Wearable Haptic Display Device Using a Liquid-to-gas Phase Change Actuator
(UIST 2020 Poster, EuroHaptics 2020 Poster, Demo)

We proposed HaPouch, a wearable haptic display that uses a liquid-to-gas phase change actuator and a Peltier device to reduce the size of the entire system. A low-boiling-point liquid is encapsulated in the actuator’s flexible pouch, and the vaporization of the liquid, which inflates the actuator, is controlled by the external Peltier device. We implemented temperature and pressure sensors to monitor the pressure inside the pouch; these measurements will contribute to controlling the generated force.

HaptoMapping: Visuo-Haptic Augmented Reality by Embedding User-Imperceptible Tactile Display Control Signals in a Projected Image
(EuroHaptics 2020, SIGGRAPH Asia 2020 ETech)

We proposed HaptoMapping, a novel projection-based visuo-haptic augmented reality (VHAR) system that can employ various visuo-haptic combinations and present consistent visuo-haptic sensations on physical surfaces. HaptoMapping comprises a high-speed projector and wearable haptic displays; the system controls the haptic displays via control signals embedded in the projected images, imperceptible to the user, using a pixel-level visible light communication technique.

Head Orientation Control of Projection Area for Projected Virtual Hand Interface on Wheelchair
(SICE JCMSI 2021, SICE 2020)

We proposed a wheelchair system enhanced with a projected virtual hand that allows the user to control the projection area with their head orientation. The proposed system measures the user’s current head orientation and the distance between the user and the projection surface. It then sets suitable pan and tilt angles for the projector, considering its positional relationship with the projection plane. Because users can move the projection area simply by turning their head, this operation can be performed simultaneously with operating the virtual hand using their hands.
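The pan/tilt computation can be illustrated with a minimal geometric sketch, assuming a flat vertical projection surface and a pan-tilt projector; all function and parameter names here are ours for illustration, not from the paper.

```python
import math

def projector_pan_tilt(head_yaw, head_pitch, user_dist,
                       proj_offset_x, proj_offset_y, proj_dist):
    """Aim the projector at the point on a flat wall that the user's
    head orientation points to (simplified pinhole-style geometry).

    head_yaw, head_pitch : user's head angles in radians (0 = facing wall)
    user_dist            : distance from the user's head to the wall
    proj_offset_x/y      : projector position relative to the user
                           (horizontal along the wall / vertical)
    proj_dist            : projector's distance from the wall
    """
    # Point on the wall along the user's head direction
    target_x = user_dist * math.tan(head_yaw)
    target_y = user_dist * math.tan(head_pitch)
    # Angles the projector must turn to hit the same point
    pan = math.atan2(target_x - proj_offset_x, proj_dist)
    tilt = math.atan2(target_y - proj_offset_y, proj_dist)
    return pan, tilt
```

When the projector is co-located with the user, the computed pan and tilt simply equal the head yaw and pitch; the offsets matter once the projector is mounted elsewhere on the wheelchair.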

Laser Pouch Motors: Selective and Wireless Activation of Soft Actuators by Laser-powered Liquid-to-gas Phase Change
(IEEE RA-L 2020, RoboSoft 2020)

We proposed a method to wirelessly drive multiple soft actuators by laser projection. Laser projection enables both wireless energy supply and selection of the target actuators, so no additional components such as electronic circuits or batteries are needed, enabling a simple and scalable implementation of multiple soft actuators. We verified that our method can activate a group of mobile soft robots simultaneously and individually while tracking the actuator positions.

IlluminatedFocus: Vision Augmentation using Spatial Defocusing via Focal Sweep Eyeglasses and High-Speed Projection
(IEEE TVCG 2020, IEEE VR 2020)

Aiming at realizing novel vision augmentation experiences, this paper proposes the IlluminatedFocus technique, which spatially defocuses real-world appearances regardless of the distance from the user’s eyes to observed real objects. With the proposed technique, a part of a real object in an image appears blurred, while the fine details of the other part at the same distance remain visible. We apply Electrically Focus-Tunable Lenses (ETL) as eyeglasses and a synchronized high-speed projector as illumination for a real scene.

Modifying Texture Perception with Pseudo-Haptic Feedback for a Projected Virtual Hand Interface
(IEEE Access 2020, World Haptics 2019 Poster, Demo)

A projected virtual hand interface is a promising approach for body augmentation because it can extend a user’s reach in daily life without the need to wear a device. Although users can manipulate a projected virtual hand as if it were their own hand and can interact with distant objects through it, they cannot feel the sensation of touch when the projected virtual hand is overlaid on a real object. We proposed a novel pseudo-haptic feedback framework to provide users with the tactile texture of objects without the use of haptic devices.

NavigaTorch: Projection-based Robot Control Interface using High-speed Handheld Projector
(SIGGRAPH Asia 2019 ETech)

We proposed “NavigaTorch”, a projection-based robot control interface that enables the user to operate a robot quickly and intuitively. The user can control and navigate a robot from the third-person viewpoint using the projected video as visual feedback. We achieved flickerless image feedback and a quick response in robot operation by developing a handheld pixel-level visible light communication (PVLC) projector. Our contribution is the development of a robot control interface based on high-speed projection technology and the exploration of the design methodology of projection-based robot control using a handheld projector.

Imperceptible AR Markers for Near-screen Mobile Interaction
(IEEE Access 2019)

Mobile interactions with display screens have gained attention in the advertising and gaming industries as well as in human–computer interaction research. However, visible markers such as QR codes or AR markers interfere with the display content; this problem is critical for localization over a wide range of interactions, because fewer markers result in lower reliability and accuracy. To address this problem, we proposed an easy-to-install localization method that uses an array of AR markers made imperceptible to the human eye through chromaticity vibration at 30 Hz.

PILC Projector: Image Projection with Pixel-level Infrared Light Communication
(IEEE Access 2019, IEEE VR 2019 Poster)

Invisible-light communication between a projector and multiple devices on the projection plane enables human–computer interaction applications such as movable displays and pointing devices. We proposed a pixel-level infrared light communication (PILC) projector that can project rich, invisible position-dependent information on high-contrast, full-color visible images. Although previous methods for such communication sacrificed the image quality, the amount of information, or other features, our method meets all the requirements by adding an infrared light source to a full-color DLP projector.

Dynamic PVLC: Pixel-level Visible Light Communication Projector with Interactive Update of Images and Data
(ITE TMTA 2019, IDW 2018)

We previously studied methods leveraging pixel-level visible light communication (PVLC), which embeds information imperceptible to human eyes in each pixel of an image. In previous PVLC systems, the PC’s computational load and the amount of data transferred between the PC and the projector were excessive because the PC executed both the video and data encoding processes; thus, it was impossible to achieve both high-dynamic-range images and dynamic updates of the images and data. We proposed a dynamic PVLC system that offers high video quality and interactively updates the PVLC information through hardware-based encoding.
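The core PVLC idea of hiding per-pixel data in a high-speed flicker can be sketched in a few lines; this is a toy illustration under our own assumptions, not the authors’ encoder, and it treats a single grayscale pixel carrying one bit.

```python
def pvlc_subframes(pixel, bit, amplitude=8):
    """Toy PVLC encoder for one pixel (illustrative sketch only).

    The data bit selects the phase of a small brightness flicker:
    bit 1 emits (pixel + a, pixel - a), bit 0 emits (pixel - a, pixel + a).
    Alternated far above the flicker-fusion rate, the pair averages back
    to `pixel`, so viewers see the plain image, while a photodetector
    sampling the two subframes separately recovers the bit.
    """
    delta = amplitude if bit else -amplitude
    clamp = lambda v: max(0, min(255, v))
    return clamp(pixel + delta), clamp(pixel - delta)

def pvlc_decode(sample1, sample2):
    """Receiver side: the bit is the sign of the subframe difference."""
    return 1 if sample1 > sample2 else 0
```

In a real system this modulation runs at thousands of frames per second on a DLP projector, and the hardware-encoding contribution of this work moves the subframe generation off the PC.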

Touchable Wall: Easy-to-Install Touch-Operated Large-Screen Projection System using Active Acoustic Sensing
(ISS 2018 Demo)

Small and inexpensive portable projectors have recently become common for presentations. For convenience, it is desirable to point directly at the screen and control the pointer intuitively without gripping or wearing any device. Therefore, this study proposes a new touch-operated large-screen projection system using acoustic vibration sensing and a projector-camera system. We apply a continuous inaudible-range signal to a surface through an actuator and detect changes in the surface’s elastic compliance as changes in acoustic resonance.

Projection-based Control for Mobile Robots Cooperating with Displayed Images
(TVRSJ2020, UbiComp 2018 Doctoral Colloquium)

Collaborative control systems that combine digital images and multiple robots have attracted increasing attention for robot environments that display information to users. We proposed a projection-based robot control method that enables such systems, along with applications in mixed reality and user interfaces. In this paper, we described the requirements of a robot environment for displaying information to users, related work on projection-based robot control systems, and robot control methods using a velocity vector field as applied in projection-based robot control.

Imperceptible Color Vibration for Screen-Camera Communication via 2D Binary Pattern
(ITE TMTA 2020, UIST 2018 Demo)

Communication between screens and cameras has attracted attention as a ubiquitous information source. Embedding matrix barcodes into displayed images using imperceptible color vibration has previously been proposed. Because the barcodes are imperceptible, the visual experience is maintained, and the approach can be implemented on almost any display and camera. Herein, we describe a refined modulation protocol and retrieval procedure that take device characteristics into account, such as the display’s gamma response and the smartphone’s rolling shutter.

Screen-Device Interaction by Embedding Pixel-by-Pixel Data into Digital Images Using Imperceptible Color Vibration
(CHI 2017 LBW)

In addition to displaying images, graphic display devices can be configured to transmit information to nearby computing devices, to facilitate human interaction with the image. However, some existing methods require the use of visible markers, which impairs the visual experience, while others require specially modified projectors. To solve these problems, we propose a novel method that employs imperceptible color vibration to embed pixel-by-pixel data into images on an ordinary LCD display; this is achieved by fixing the luminance and vibrating only the chromaticity of the color of each pixel.
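The fixed-luminance chromaticity vibration can be sketched as follows, using BT.601 YCbCr as an assumed working color space; the paper’s actual modulation protocol, offsets, and color handling may differ.

```python
def color_vibration_pair(r, g, b, bit, delta=6):
    """Toy sketch of imperceptible color vibration for one pixel.

    Produces two frames with identical luminance Y but opposite
    chromaticity (Cb) offsets; the sign of the offset carries the bit.
    Alternated above the flicker-fusion rate, the pair looks like the
    original color, while a camera sampling single frames recovers
    the bit from the offset's sign.
    """
    # RGB -> YCbCr (ITU-R BT.601 full-range coefficients)
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128

    def ycbcr_to_rgb(y, cb, cr):
        rr = y + 1.402 * (cr - 128)
        gg = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
        bb = y + 1.772 * (cb - 128)
        return tuple(min(255, max(0, round(v))) for v in (rr, gg, bb))

    sign = 1 if bit else -1
    # Same Y in both frames; only Cb moves, in opposite directions
    return (ycbcr_to_rgb(y, cb + sign * delta, cr),
            ycbcr_to_rgb(y, cb - sign * delta, cr))
```

Because luminance is held constant and the two chromaticities are symmetric about the original color, the time-averaged appearance matches the displayed image even though each individual frame is slightly tinted.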

Phones on Wheels: Exploring Interaction for Smartphones with Kinetic Capabilities
(UIST 2016 Demo)

This study introduced novel interactions and applications using smartphones with kinetic capabilities. We developed an accessory module with robot wheels for a smartphone. With this module, the smartphone can move in a linear direction or rotate with sufficient power. The module also includes rotary encoders, allowing us to use the wheels as an input modality. We demonstrated a series of novel interactions for mobile devices with kinetic capabilities.

Sensible Shadow: Tactile Feedback from Your Own Shadow
(AH 2016)

Sensible Shadow is a new shadow interface system that provides tactile feedback from a user’s shadow to the physical body. When users obstruct the projected image with their bodies, wearable photoreactive tactile displays receive the light information, decode it, and present tactile sensations. The proposed system requires no complex sensing, complicated setup, or communication systems; it can deliver high-speed tactile feedback using information received directly from the projected light.

Phygital Field: An Integrated Field with Physical Robots and Digital Images using Projection-based Localization and Control Method
(SICE JCMSI 2018, IEEE/SICE SII 2016, SIGGRAPH Asia 2016 ETech)

Collaboration between computer graphics and multiple robots has attracted increasing attention in several fields. However, realizing a responsive control system for a large number of mobile robots without complicated settings while avoiding the system load problem is not trivial. We proposed a novel system, called “Phygital Field,” for the localization and control of multiple mobile robots. Utilizing pixel-level visible light communication technology, our system can project two types of information in the same location: visible images for humans and data patterns for mobile robots.

Reconfigurable PVLC: Reconfigurable Pixel-level Visible Light Communication with Light Source Control
(TVRSJ 2016)

Pixel-level visible light communication (PVLC) is a method that embeds metadata imperceptible to human eyes in projected images through high-speed flickering. However, previous projection systems that utilize PVLC can update neither the invisible data nor the visible images in real time due to technical limitations. To solve this problem, we propose a reconfigurable pixel-level visible light communication (RPVLC) system that can update the data and images dynamically, reconfigure the trade-off between image frame rate and resolution, and control a full-color LED light source.