BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Tokyo
X-LIC-LOCATION:Asia/Tokyo
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:JST
DTSTART:18871231T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20250110T023314Z
LOCATION:Hall E\, E Block\, Level B2
DTSTART;TZID=Asia/Tokyo:20241205T100000
DTEND;TZID=Asia/Tokyo:20241205T170000
UID:siggraphasia_SIGGRAPH Asia 2024_sess190@linklings.com
SUMMARY:Extended Reality (XR)
DESCRIPTION:XR\n\nStep into the dynamic realm of Extended Reality (XR) at SIGGRAPH Asia 2024 in Tokyo. XR, comprising augmented (AR), mixed (MR), and virtual reality (VR), blurs the boundaries between the physical and virtual worlds. Delve into the latest advancements in XR technology, research, and content creation, offering unparalleled experiences that push the limits of innovation.\nEmbark on an immersive journey through two captivating categories: XR Demo and XR Theater. In the XR Demo, witness state-of-the-art demonstrations that unveil novel experiences, seamlessly integrating touch, sound, and other sensory modalities. Engage with groundbreaking applications and artifacts that showcase the transformative power of XR.\n\nMeanwhile, in the XR Theater, experience the magic of immersive cinematic storytelling and narrative exploration. Transport yourself into captivating virtual environments, where rich narratives and stunning visuals come to life. Whether seated or standing, these experiences promise to captivate your senses and ignite your imagination.\n\nAt SIGGRAPH Asia 2024, the XR program serves as a hub for innovation, creativity, and collaboration, offering attendees an exclusive glimpse into the future of XR. Join us as we embark on an extraordinary adventure, where technology meets imagination, and possibilities are limitless.\n\nGenerative Terrain Fast Prototyping in Virtual Reality with Freehand Interface\n\nWe provide an interactive VR-based terrain generation system that utilizes VR features like hand tracking and gesture detection for users to draw mid-air strokes to design landscapes. A Conditional Generative Adversarial Network is used to add realistic details to the terrain. The system is suit...\n\n\nYushen Hu (New York University)\n---------------------\nMeiMeiRoRo\n\n"MeiMeiRoRo" is an interactive experience in a nested maze that presents two consciousnesses: active and passive. The inclination applied to the miniature maze is fed back as the user's body inclination. This allows the two consciousnesses to arise.\n\n\nOsuke Funabiki (Meijo University)\n---------------------\nRe-Touch: A VR Experience for Enhancing Autobiographical Memory Recall Through Haptic and Affective Feedback\n\nRe-Touch enhances autobiographical memory recall using VR, haptic feedback, and affective computing, adapting to users' emotional states. By monitoring physiological responses like PPG and EDA, it adjusts haptic feedback and the virtual environment's richness, providing personalized, immersive exper...\n\n\nTamil Selvan Gunasekaran (The University of Auckland, Empathic Computing Lab)\n---------------------\nSignal Journey: Sense of Connection in a Two-Player VR Game Through a Proximity-Based Interactive Experience\n\nSignal Journey is a co-dependent VR game designed to offer a sense of connection between two players by tracing and augmenting their virtual distance. Positioned at opposite ends of the virtual environment, the players exchange audio and visual signals to locate each other until they come into close...\n\n\nDayoung Lee (Sogang University)\n---------------------
Reyeal: Implicit Gaze-based Interaction for Creating Personal Landscape Painting in Cinematic Virtual Reality\n\nReyeal is an implicit gaze-based interaction for cinematic virtual reality that allows users to create their own immersive landscape painting experience. Reyeal eliminates the need for specific target objects and relies on invisible objects, resulting in a compelling experience for every user.\n\n\nMiguel Ying Jie Then (National Taiwan University)\n---------------------\nZenbu Koko - A mixed reality platform for inward contemplation\n\nZenbu Koko is an XR platform that presents a seamless immersive meditative experience inspired by the yogic tradition. Thanks to its multisensory environments and its tool to reconstruct and observe the participants' bodies from different perspectives, it leads the user across a short journey f...\n\n\nLoup Vuarnesson (All Here SA), Richard Nelson (KKAA), and Erkin Bek (All Here SA)\n---------------------\nTheater of Future Puppetry: An Immersive Puppeteering Experience Based on Hands Tracking and Gestures Recognition\n\nThis research presents an advanced gesture recognition system called Theater of Future Puppetry. Unlike systems that only use hand tracking and hand skeleton manipulation of puppets, this system can recognize specific gestures used in traditional Budaixi (Taiwanese glove puppetry) and interact with ...\n\n\nChun-Cheng Hsu (National Yang Ming Chiao Tung University/Institute of Applied Arts)\n---------------------\nSlime Hand XR: Distinct Illusory Skin Deformation in HMD Space\n\nSlime Hand XR is an interactive HMD system that induces illusory skin deformation in VR. Participants view slime in the HMD's view and feel their skin stretch like the slime. Virtual suspension maintains the illusion. In a brief experiment with children, 56 of 63 participants reported strong skin def...\n\n\nKenri Kodaka (Nagoya City University)\n---------------------\nNecomimi illusion: Generating Ownership of Cat Ears through Haptic Feedback via Hair\n\nIn this demonstration, we developed a device that creates a sense of ownership over imaginary body parts, specifically cat ears. We achieve this through haptic feedback to the head via hair. Participants transform into an avatar with cat ears and experience having their imaginary cat ears stroked.\n\n\nHiroo Yamamura\n---------------------\nLocation-Based Artifact Installation Interaction\n\nOur project utilizes spatial location technology to create interactive installations that engage with digital projections of artifacts, commemorating World War II artifact protection efforts led by the Palace Museum in China. It offers a new, immersive way to experience history through visual, audit...\n\n\nSixue Zhang (Beijing Normal University)\n---------------------\nFinger Painting in VR: Multi-Dynamic Gestural Input for VR Painting\n\nThis study explores a multi-dynamic gestural interaction system using VR headsets and wearable devices to enhance VR painting with pressure control, gesture recognition, and haptic feedback. Testing demonstrated novel visual expressions through hand gestures, suggesting the potential for more...\n\n\nRosina Yuan\n---------------------
Into the Womb -I want to be born again-\n\n"Into the Womb" is VR content about rebirth, reflecting the author's disability, trauma, inability to connect emotionally, and challenges and struggles with love. It starts from the perspective of a woman with developmental disabilities in Tokyo, returns to the fetus, and offers new beginnings for all...\n\n\nShoko Kimura (Aoyama Gakuin University, Nagoya Institute of Technology); Ayaka Fujii and Kenichi Ito; Rihito Tsuboi (Game Sound Creator); and Yoshinori Natsume (Nagoya Institute of Technology)\n---------------------\nNatureBlendVR: Hybrid Space Interactive Experience For Emotional Regulation And Cognition Improvement\n\nNatureBlendVR is an immersive interactive system designed to boost emotional well-being and enhance cognitive functions linked to the therapeutic practice of forest bathing. This experience seamlessly integrates extended reality (XR) technology with interactive bio-responsive physical elements, crea...\n\n\nKinga Skiers (Keio University Graduate School of Media Design)\n---------------------\nGravField: Live-coding Bodies through Mixed Reality\n\nGravField (short for "Gravitational Field") is a live-coding intercorporeal improvisation system within a collocated Mixed Reality (MR) environment. It explores the dynamic interplay between bodies, mediated through live-coded MR audiovisual affordances. Inspired by contact improvisation techniques,...\n\n\nBotao Hu (Reality Design Lab)\n---------------------\nTranscendental Chakra: A Multi-Sensory Meditation Spiritual Journey to Enhance Self-Awareness Based on VR\n\nChakra, from Hinduism, represents luminous wheels reflecting self-awareness. Chakra-based meditation supports well-being, but visual and tangible feedback for mindfulness is unexplored. Transcendental Chakra, a VR experience, uses guided audio, visuals, and vibrotactile feedback to help beginners vi...\n\n\nDanyang Peng (Keio University Graduate School of Media Design)\n---------------------\nA Collaborative Multimodal XR Physical Design Environment\n\nOur collaborative XR system integrates physical and virtual design spaces, using video passthrough to superimpose visual modifications on physical objects. With features such as multimodal inputs, real-time physical object tracking, and object-based 3D annotation, it speeds up design iterations for ...\n\n\nKen Perlin (New York University)\n---------------------\nPetroller: Altering the Virtual Reality Controller with an Attachable Prop-based Haptic for Embodied Virtual Companion\n\nWe present "Petroller," an attachable haptic device for VR controllers that simulates virtual companions, allowing users to hug, pat, and exercise, promoting relaxation, emotional healing, and physical exercise.\n\n\nTa-Wei Liu, Pei-Cih Zeng, Guan-Yi Lu, Kuan-Ning Chang, Jih Hsuan Peng, and Ping-Hsuan Han (National Taipei University of Technology)\n---------------------\nWireless Vibrant Virtual Walker: Wireless Foot Vibration Device for Virtual Walking Experience\n\nOur wireless foot vibration device enhances virtual walking by using vibrators and pressure sensors for realistic control and feedback. It allows users to navigate virtual environments with slight foot pressure, avoiding physical constraints, and is versatile for use in standing, seated, or lying po...\n\n\nJunya Nakamura (Toyohashi University of Technology)\n---------------------
ExpressiveWorld: Detachable Expressions for VR Collaboration\n\nThis project aims to enhance VR collaboration by allowing users to customize the emotion representations of their collaborating partner to improve emotional understanding. It uses 3D head models, emojis, visual effects, and vibrotactile feedback to simulate five discrete emotions, enabling users to ...\n\n\nTheophilus Teo (University of South Australia)\n---------------------\nDragravity: Wearable, Reconfigurable, Dynamic Weight-Shifting Waterbags to Enhance Full-Body Weight Sensations\n\nDragravity is a wearable, reconfigurable waterbag that serves as a dynamic weight-shifting haptic proxy. Waterbags can be worn on body parts with elastic bands to distribute weight, enhancing the user's full-body weight sensation to enable more application possibilities with the haptic proxy, which can simulate the weight sens...\n\n\nChi-Yu Lin (National Taipei University of Technology)\n---------------------\nEchoVision: Experiencing Bat Echolocation via Mixed Reality\n\nEchoVision is an interactive mixed reality art experience that allows participants to experience the umwelt of bats by simulating echolocation using a custom-designed, handheld, bat-shaped, optical see-through mixed reality mask. This work promotes an ecocentric design and fosters understanding betw...\n\n\nJiabao Li (University of Texas at Austin)\n\nRegistration Category: Enhanced Access, Exhibit & Experience Access, Experience Hall Exhibitor, Full Access, Full Access Supporter, Trade Exhibitor
END:VEVENT
END:VCALENDAR