  • Polytechnique Montreal

    In conventional mixed reality, we are used to having our real-life surroundings augmented with elements from the digital world. The idea behind our project is to reverse that dynamic: we want to create an experience in which the digital world is augmented by real-life elements. To achieve this vision, we designed a two-person experience involving one person in the real world and one person in virtual reality, respectively called the RWU (Real World User) and the VRU (Virtual Reality User). Within a setting and story that remain to be finalized, the VRU must complete a set of challenges and puzzles in a short allotted time. Doomed to certain death without help, the VRU is lucky enough to have an accomplice to guide them through the experience: the RWU can send objects from the real world into the virtual realm using a special device they control (see the sketch after this entry).

    Anne-Sophie Clayer

    22 February 2024
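The blurb above does not explain how objects travel from the RWU's device to the VRU's headset, so the following is only a minimal illustrative sketch, not the team's implementation. It assumes a hypothetical setup in which the RWU's device sends a JSON "spawn object" message to the VR application over a local TCP socket; the address, message fields, and object identifiers are all assumptions made for illustration.

```python
import json
import socket

# Hypothetical bridge: the RWU's device tells the VR application to spawn an
# object in the virtual scene. Address, message schema, and object identifiers
# are illustrative assumptions, not the project's actual protocol.

VR_APP_ADDRESS = ("127.0.0.1", 5005)  # assumed address of the VRU's application


def send_object_to_vr(object_id: str, position: tuple) -> None:
    """Send a 'spawn object' message from the real world into the virtual realm."""
    message = {
        "type": "spawn_object",
        "object_id": object_id,      # which real-world object the RWU selected
        "position": list(position),  # where to place it in the virtual scene
    }
    with socket.create_connection(VR_APP_ADDRESS) as conn:
        conn.sendall(json.dumps(message).encode("utf-8") + b"\n")


if __name__ == "__main__":
    # Example: the RWU presses a button on their device to send a key to the VRU.
    send_object_to_vr("key", (1.0, 0.0, 2.5))
```

On the VR side, a matching listener would parse each incoming line and instantiate the corresponding asset in the scene.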
  • SociaPol

    The Sociapol platform is a unique web3-based Metaverse solution that primarily offers gaming and socialization activities, enabling users to engage in multiple activities and experiences using their personalized NFT 3D avatars. Sociapol can be described as a futuristic platform that allows users to earn financial and social rewards in the Metaverse by playing games, designing NFTs, creating valuable content, buying land space, engaging in routine activities and so on.

    Anne-Sophie Clayer

    21 February 2024
  • TARSIOPTICS

    Tarsioptics, a deep-tech startup and SCOPTIQUE spin-off, leads in VR/XR optics with compact, lightweight, wide-field systems. Leveraging Fresnel technology, we focus on developing intellectual property and providing R&D services for high-value applications. Aligned with industry giants like Magic Leap and Waveoptics, Tarsioptics aims to revolutionize optical technology across the entertainment, aeronautics, and automotive sectors, emerging as a beacon of innovation in the optical industry.

    Tarsioptics presents a pioneering XR optics prototype that leverages Fresnel technology to deliver a 210° field of view in a compact 2-inch design. The technology adapts to various optical systems, including binoculars, VR/XR glasses, and cameras, with applications spanning the leisure, aeronautics, medical, and energy sectors. Ideal for industry leaders like Meta and HTC, as well as new players in consumer electronics, our solutions empower clients to create innovative, impactful products for global markets.

    Alizée Gautier

    21 February 2024
  • V.RTU

    V.RTU is on a mission to transform how we interact with technology by integrating the sense of touch into digital spaces. Leveraging advanced haptic interfaces for XR, we’re combining neuroscience and AI to create realistic tactile sensations, enriching user experiences across online shopping, professional training, gaming and beyond. Our technology promises to make digital interactions more immersive and intuitive, opening new possibilities for sensory engagement in the technological realm.

    HAPTIFY, our SaaS for 3D and 2D developers, enables tactile feedback creation across various development platforms, using AI for realistic sensations. V.RTACT, our ergonomic wearable tech, offers physical texture and force experiences in VR/MR. Both products are powered by AI-driven haptic generation, inspired by neuroscience research, and aim to revolutionize digital interaction.

    Alizée Gautier

    21 February 2024
  • MotionXP

    A one-man, four-month-old company that makes the dynamic-simulator dream affordable. When simulation meets motion!

    A complete dynamic motion simulator at home: be Maverick in Top Gun, Luke in Star Wars, or Michel Vaillant in your own garage.

    Alizée Gautier

    21 February 2024
  • Dexr

    admin

    21 February 2024
    Education / Training, Industry
  • Moverse

    Moverse is a start-up with a mission to accelerate 3D animation workflows and create a one-stop shop for end-to-end 3D animation needs. It is building an ecosystem of 3D animation tools at the forefront of innovation to reduce the resources needed by 80-90% across the various production stages, from motion capture to ready-to-use 3D animation assets.

    Alizée Gautier

    21 February 2024
  • ARVISION GAMES

    ARVision Games is a technology company founded with the aim of innovating in AR multi-user interaction, fusing AR, AI, and gaming technologies to deliver a unique and immersive user experience.

    The first AR gaming platform offering a real AR multi-user experience. A sort of Netflix for AR Games.

    Alizée Gautier

    21 February 2024
  • Multisensory Signal Analysis and Enhancement Lab (MuSAE Lab)

    At the MuSAE Lab, we conduct research within the broad topics of human-machine interfaces/interactions (HMIs) and their applications. With expertise in signal processing and machine learning, we develop tools to optimize the human, machine, and interaction aspects of HMIs, focusing on multisensory immersive experiences, affective computing, brain-computer interfaces, and neuroergonomics.

    Alizée Gautier

    21 February 2024
  • College of Image Arts and Sciences, Ritsumeikan University

    The MR BLS Rescue Trainer is a training system that lets users experience a realistic, physically grounded simulation of basic life support (BLS) procedures, including CPR (cardiopulmonary resuscitation) and the use of an AED (automated external defibrillator), by applying physical mixed reality with partially tangible components. The system consists of an HMD, a human chest model for CPR training, an AED simulator, and a PC; the chest model and the AED simulator are integrated with a microcontroller and connected to the PC. The chest model is placed on the floor, and the user wears the HMD, kneels beside it, and performs the life-saving actions according to the detailed instructions displayed on the HMD (a minimal data-flow sketch follows this entry).

    Anne-Sophie Clayer

    21 February 2024
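The description above names the hardware pieces but not how the chest model and AED simulator report to the PC, so the following is only a minimal sketch under assumed conditions: a USB serial link (via pyserial), a simple line-based protocol such as "compression,<depth_mm>" or "aed,shock", and an assumed port name. None of these details come from the exhibitor, and the print calls merely stand in for feedback that would be rendered on the HMD.

```python
import serial  # pyserial; an assumed USB serial link to the trainer's microcontroller

# Hypothetical feedback loop: the PC reads chest-compression and AED events from
# the microcontroller and would forward guidance to the HMD application. Port
# name, baud rate, and line format are illustrative assumptions only.

PORT = "/dev/ttyUSB0"
BAUD = 115200


def run_feedback_loop() -> None:
    with serial.Serial(PORT, BAUD, timeout=1.0) as link:
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue  # read timed out with no data; keep polling
            kind, _, value = line.partition(",")
            if kind == "compression":
                depth_mm = float(value)
                # Adult CPR guidelines recommend roughly 5-6 cm of compression depth.
                verdict = "good" if 50.0 <= depth_mm <= 60.0 else "adjust depth"
                print(f"compression {depth_mm:.0f} mm -> {verdict}")  # stand-in for HMD feedback
            elif kind == "aed":
                print(f"AED event: {value}")  # e.g. pads placed, shock delivered


if __name__ == "__main__":
    run_feedback_loop()
```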