Arata Jingu | 神宮 亜良太

I am a third-year Ph.D. student at the Human-Computer Interaction Lab at Saarland University, advised by Prof. Dr. Jürgen Steimle.

My research aims to create 3D virtual objects with the same visuo-haptic characteristics as the real world. My work has been published and exhibited at top-tier venues (e.g., ACM CHI/UIST, Laval Virtual) and featured in mainstream media (e.g., Yahoo Finance, Business Today, Engadget, hackster.io, and Communications of the ACM). I am a recipient of the Google PhD Fellowship 2024.

CV | Google Scholar | X | Email: jingu@cs.uni-saarland.de

Main Publications

Shaping Compliance: Inducing Haptic Illusion of Compliance in Different Shapes with Electrotactile Grains

Arata Jingu, Nihar Sabnis, Paul Strohmeier, Jürgen Steimle
In Proc. CHI'24 (full paper)

Rendering Softness in Different Shapes with a Thin Haptic Device.

Paper | Video

Double-Sided Tactile Interactions for Grasping in Virtual Reality

Arata Jingu, Anusha Withana, Jürgen Steimle
In Proc. UIST'23 (full paper)
Best Demo Honorable Mention (People’s Choice, top 7%)

Haptic Interaction Paradigm for the Pinched Fingers.

Paper | Video

LipIO: Enabling Lips as both Input and Output Surface

Arata Jingu, Yudai Tanaka, Pedro Lopes
In Proc. CHI'23 (full paper)

Lips as Touch I/O Surface.

Paper | Video | Talk

LipNotif: Use of Lips as a Non-Contact Tactile Notification Interface Based on Ultrasonic Tactile Presentation

Arata Jingu, Takaaki Kamigaki, Masahiro Fujiwara, Yasutoshi Makino, Hiroyuki Shinoda
In Proc. UIST'21 (full paper)

Lips as Notification Receiver.

Paper | Video | Talk

Tactile Perception Characteristics of Lips Stimulated by Airborne Ultrasound

Arata Jingu, Masahiro Fujiwara, Yasutoshi Makino, Hiroyuki Shinoda
In Proc. WHC'21 (full paper)

Lip Perception in Ultrasound Haptics (The first paper on non-contact haptic feedback to the lips).

Paper | Talk

More Publications

Dynamic Iris Authentication by High-speed Gaze and Focus Control

Tomohiro Sueishi, Arata Jingu, Shoji Yachida, Michiaki Inoue, Yuka Ogino, Masatoshi Ishikawa
In Proc. SII'21 (short paper)

Iris Authentication for Remotely Moving Eyes.

Paper

Awards

Funding

Perspectives

Lips as Interface

My three papers (CHI'23, UIST'21, WHC'21) explored "Lips as Interface". Lips have high potential as an I/O interface: they are dexterous, inherent to all humans, highly sensitive to touch, electrically conductive, valley-shaped so they can focus ultrasonic waves, close to other intraoral parts, and easy to access from the outside environment. Just as our fingers became a touch I/O interface with today's touchscreens, we might witness how our lips evolve in the following decades.

Exhibitions

Be in"tree"sted in | きになるき

International Virtual Reality Contest 2019, Tokyo (Laval Virtual Award, Unity Award)
Laval Virtual 2020, Virtual

A VR experience of becoming a tree and living through the four seasons.
(w/ three co-creators, my part: VR/Server)

Paper | Video

Autonomous Shadow | 自律する影

IIIExhibition 2018 "Dest-logy REBUILD", Tokyo

An installation in which your shadow moves autonomously.
(w/ two co-creators, my part: Python/OpenCV/shadow image processing/OSC)

Video

Leaplat

The 69th Komaba Festival (2018), Tokyo

A Splatoon-like AR battle game.
(my part: Unity/Design)

Video