Research projects
R&D Lab
Video games are about much more than entertainment. The topics developed in this area focus on the human element, exploring innovative projects that meet the needs of business, science and society.
These projects require specific skills rooted in the school’s identity, particularly in:
Because of the interdisciplinary nature of this axis, which extends to fields such as mental health, educational innovation, defense, ecology, culture and heritage, the projects carried out are oriented towards practical, concrete applications and involve collaborations with our academic and industrial partners.
Keywords: 3D visualisation, e-learning, computer graphics, rendering, 3D modelling, virtual reality, UI & UX, emotions.
Projects
The 3D visualisation of scientific data in a wide range of fields (sustainable development, medical, industrial, archaeological, etc.) makes it easier to understand complex phenomena, improves the presentation of models and aids decision-making, particularly in immersive, scripted environments.
With this in mind, the ISART team is collaborating with various multidisciplinary university laboratories specialising in urban modelling (UMR ESPACE), artificial intelligence (LIP6) and sustainable development (IMREDD)…
Context
Digital twins are essential for the development of Smart Cities. They allow for the simulation of urban life in a neighborhood or city and the scenario creation of various events (natural disasters, fires, crowd control, etc.).
Such digital twins also facilitate political and urban planning decision-making. But the main issue at the heart of Smart City initiatives is ultimately the human aspect.
Philosophical and ethical aspects, safety and well-being, and changes in behavior and usage are also taken into account in smart cities to build a more humane urban future. The societal challenge: how can we develop a smart and resilient territory in the face of environmental challenges?
Topic
The objective of this project is to develop a 3D virtual environment offering an urban model of the ecosystem of the Parc Méridia district in the city of Nice using the real-time Unreal Engine.
Two main types of simulations are performed: air pollution and noise pollution. Several parameters are taken into account in this simulation, such as occlusion and sound reflection by buildings, variable emission rates of different types of air pollutants depending on vehicle speed, and consideration of wind in pollutant dispersion.
The main sources of pollution are moving vehicles and certain buildings, as well as potential crowds of pedestrians. Vegetation acts as a pollutant fixer.
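To make this combination of parameters concrete, here is a deliberately minimal, engine-agnostic Python sketch (not the project's actual model): vehicles deposit pollutant on a 2D grid according to a speed-dependent emission curve, the field is advected by a uniform wind, and vegetation cells absorb part of it. The emission curve, grid resolution, wrap-around advection and absorption factor are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration: speed-dependent vehicle emissions deposited on a
# 2D grid, then dispersed by a uniform wind and partly absorbed by vegetation.
GRID = 100          # grid cells per side (cell size is an assumption)
DT = 1.0            # time step in seconds

def emission_rate(speed_kmh: float) -> float:
    """Toy emission curve: congestion and high speeds both emit more (assumption)."""
    return 1.0 + 0.002 * (speed_kmh - 50.0) ** 2

def step(conc, vehicles, wind, vegetation):
    """Advance the pollutant field by one time step."""
    # 1) Emissions from vehicles, weighted by their speed.
    for x, y, speed in vehicles:
        conc[int(y), int(x)] += emission_rate(speed) * DT
    # 2) Wind advection: shift the field by the wind vector (toroidal wrap for simplicity).
    dx, dy = int(round(wind[0] * DT)), int(round(wind[1] * DT))
    conc = np.roll(np.roll(conc, dy, axis=0), dx, axis=1)
    # 3) Vegetation fixes part of the pollutant where it is present.
    conc *= np.where(vegetation, 0.90, 1.0)
    return conc

conc = np.zeros((GRID, GRID))
vegetation = np.zeros((GRID, GRID), dtype=bool)
vegetation[40:60, 40:60] = True                      # a park in the middle
vehicles = [(10.0, 50.0, 30.0), (70.0, 20.0, 90.0)]  # (x, y, speed in km/h)
for _ in range(60):
    conc = step(conc, vehicles, wind=(1.0, 0.0), vegetation=vegetation)
print("peak concentration:", conc.max())
```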
Other project highlights:
The results of this project will enable the EPA (public development agency) to take into account various criteria and constraints in their developments.
Keywords: Urban planning assistance, scientific visualization, real-time, 3D modeling, pollution.
Partner: IMREDD – Université Côte d’Azur
Years: 2023-2024 (ongoing project)
Context
The EMC2 project, “The Evolutive Meshed Compact City. A pragmatic transition path to the 15-minute city for European metropolitan peripheries”, is a European collaborative project led by G. Fusco of UMR ESPACE, bringing together various universities, research centers, and development and urban planning agencies.
The EMC2 project is part of an urban development approach that aims to create more liveable, safer, and more environmentally friendly cities.
Unlike the original model, which focuses on large metropolises, the EMC2 project analyses the 15-minute city in peri-urban areas, particularly for cities articulated around a main axis, linear or Y-shaped, where every pedestrian should be within a 15-minute walk of this main street.
Topic
This study focuses on calculating visibility in the 15-minute city in peri-urban areas as defined above.
This visibility calculation is based on Jane Jacobs’ “Eyes on the Street” theory. Urban planners and researchers in this field generally work with 3D modeling tools from the ArcGIS suite developed by Esri.
CityEngine is a widely used tool for exploiting geographic data. It can compute individual 3D sightlines or produce 2D visibility maps, but not general 3D visibility analyses. To address this limitation, it was proposed to use a video game engine, in our case Unreal Engine.
This solution allows 3D models to be imported and sightlines to be detected using ray-tracing collisions, a technique native to video game engines and heavily optimized there.
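As a simplified, engine-agnostic sketch of that idea (the project itself relies on Unreal Engine's collision ray tracing), the example below tests whether an observer can see a target point by intersecting the sightline with axis-aligned building boxes; the scene data and the slab-test implementation are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned building volume (minimal stand-in for real 3D geometry)."""
    min_pt: tuple  # (x, y, z)
    max_pt: tuple  # (x, y, z)

def segment_hits_box(p0, p1, box, eps=1e-9):
    """Slab test: does the segment p0 -> p1 intersect the box?"""
    t_min, t_max = 0.0, 1.0
    for axis in range(3):
        d = p1[axis] - p0[axis]
        lo, hi = box.min_pt[axis], box.max_pt[axis]
        if abs(d) < eps:
            if p0[axis] < lo or p0[axis] > hi:
                return False
        else:
            t0, t1 = (lo - p0[axis]) / d, (hi - p0[axis]) / d
            if t0 > t1:
                t0, t1 = t1, t0
            t_min, t_max = max(t_min, t0), min(t_max, t1)
            if t_min > t_max:
                return False
    return True

def has_line_of_sight(observer, target, buildings):
    """True if no building blocks the straight line between observer and target."""
    return not any(segment_hits_box(observer, target, b) for b in buildings)

# Hypothetical scene: one building between an observer at eye level and a point across it.
buildings = [Box((10, -5, 0), (20, 5, 15))]
print(has_line_of_sight((0, 0, 1.7), (30, 0, 1.7), buildings))  # False: blocked by the building
print(has_line_of_sight((0, 0, 1.7), (0, 30, 1.7), buildings))  # True: clear view along the street
```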
Keywords: Urban planning, 15-minute city, Twin cities, 3D visualization
Partner: ESPACE, Université Côte d’Azur
Years: 2024-2025 (ongoing project)
Context
Faced with the environmental crisis, how can we measure the impact of each new decision or policy on the “Earth system,” comprised of human societies, the climate, and biodiversity?
Today, there is no solution that can scientifically determine which individual and collective behaviors are the most harmful or beneficial. A team from the Sorbonne University Computer Science Laboratory (LIP6) created the TerraNeon project to fill this gap by developing a decision-making tool based on a systemic multi-agent approach (derived from collective artificial intelligence).
This tool can model and simulate:
To reap the benefits of TerraNeon, it is crucial to raise awareness among all stakeholders in society about the impact of their actions on the climate.
This can be achieved using ImmerSim, a 3D visualization tool (in the form of a video game), which will provide 3D immersion of the results of multi-agent simulations.
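To give a rough sense of what a systemic multi-agent simulation looks like (a deliberately minimal sketch, not TerraNeon's actual model), the example below lets agents choose between behaviours that feed a shared environmental indicator, which in turn influences their subsequent choices; every parameter here is an illustrative assumption.

```python
import random

# Minimal agent-based sketch: each agent chooses a "sober" or carbon-intensive
# behaviour; choices raise a shared pollution level, which in turn nudges agents
# toward sobriety. Parameters are illustrative, not calibrated.
class Agent:
    def __init__(self):
        self.sober = random.random() < 0.2   # initial share of sober agents (assumption)

    def act(self, pollution):
        # The more degraded the environment, the more likely the agent adapts.
        if not self.sober and random.random() < min(1.0, pollution / 100.0):
            self.sober = True
        return 0.1 if self.sober else 1.0    # individual emission this step

def simulate(n_agents=1000, steps=50):
    random.seed(0)
    agents = [Agent() for _ in range(n_agents)]
    pollution = 0.0
    for t in range(steps):
        emitted = sum(a.act(pollution) for a in agents)
        pollution = 0.95 * pollution + emitted / n_agents  # decay plus new emissions
        if t % 10 == 0:
            sober_share = sum(a.sober for a in agents) / n_agents
            print(f"step {t:2d}  pollution={pollution:5.2f}  sober={sober_share:.0%}")

simulate()
```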
Topic
This project involves developing a 3D environment—in Unity—with a global (planetary scale) and local (human scale) view, similar to “Google Earth.” The user/player will thus be able to better understand the impact of a given political decision regarding climate change on the environment and biodiversity by walking across the globe, in cities and countryside, at different time scales.
The temporal datasets will be sourced from TerraNeon. The scenery elements in the virtual universe are procedurally generated to dynamically account for climate impact based on the data from the simulation.
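As a hedged illustration of that procedural idea (not ImmerSim's actual pipeline), the sketch below derives a vegetation density for each terrain tile from a stand-in climate series, so the scenery regenerates deterministically whenever the simulated year changes; the climate function and the density rule are assumptions made for the example.

```python
import random

# Illustrative only: scenery density derived from climate variables coming from
# a simulation (here faked), regenerated for the requested year.
def fake_climate(year: int):
    """Stand-in for simulation output: (mean temperature in °C, annual rainfall in mm)."""
    warming = 0.03 * (year - 2020)
    return 14.0 + warming, max(0.0, 700.0 - 8.0 * warming * (year - 2020))

def vegetation_density(temp_c: float, rain_mm: float) -> float:
    """Toy rule: vegetation thrives with rain, suffers above a temperature threshold."""
    heat_stress = max(0.0, temp_c - 16.0) * 0.15
    return max(0.0, min(1.0, rain_mm / 900.0 - heat_stress))

def generate_tile(tile_seed: int, year: int, trees_max: int = 50):
    """Procedurally place trees on one terrain tile for a given simulation year."""
    temp, rain = fake_climate(year)
    density = vegetation_density(temp, rain)
    rng = random.Random(tile_seed + year)          # deterministic per tile and year
    n_trees = int(trees_max * density)
    return [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(n_trees)]

for year in (2025, 2050, 2100):
    trees = generate_tile(tile_seed=42, year=year)
    print(year, "->", len(trees), "trees on the tile")
```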
Keywords: scientific visualization, real-time, 3D modeling, procedural generation, multi-agent system
Partner: LIP6, Sorbonne University
Years: 2023-2024 (ongoing project)
Context
3D visualization of scientific data in various fields (medical, industrial, archaeological, etc.) facilitates the understanding of complex phenomena, improves the presentation of models, and enhances research results.
The ISART school and the Institute of Computational and Data Sciences (ISCD) at Sorbonne University (SU) are promoting interdisciplinary research by combining their respective expertise in real-time 3D rendering techniques and data science.
Topic
As part of this partnership, various topics were explored, using the real-time Unreal Engine to visualize scientific data in the fields of biomechanics, archaeology, art, and history, among others. Projects completed include:
Keywords: 3D scientific visualization, scientific computing, real-time, culture and heritage.
Partner: ISCD, Sorbonne University
Years: since 2021 (ongoing project)
Video game technologies play an important role in mental health due to their immersive nature, design, storytelling, emotions, etc. Video games can help individuals develop their ability to relax, manage their emotions, and treat cognitive disorders such as anxiety and depression. In addition, they serve to raise awareness about mental health issues.
With this in mind, since 2021, the ISART laboratory has been working in collaboration with the Cognition Behaviour Technology Lab (COBTEK) at the Université Côte d’Azur to define and carry out multidisciplinary research projects that harness the potential of video games to improve individuals’ mental health.
Teaching interpersonal skills, communication, social skills, and appropriate behaviour to healthcare students often proves difficult using traditional methods.
To address this challenge, the TeachModvr application was developed in Unity3D as an interactive tool using AR visual and interactive components to facilitate the teaching and assessment of relational basics during a clinical consultation with the patient.
The app, which is available free of charge, currently offers two scenarios (360° omnidirectional videos) focused on psychological dimensions. Two cognitive disorders in patients have been addressed: memory disorders and language disorders.
Further details are available in the IEEE CoG 2022 publication.
The application can be downloaded at:
Keywords: E-learning, e-health, caregiver-patient relationship, Unity3D
Year: 2021 (project completed)
The frequency of depressive and anxiety symptoms among students has increased as a result of the COVID-19 pandemic (distance learning, lockdown, isolation, etc.). Furthermore, emotional regulation methods are rarely taught in France. In response, TeachMod RE, developed using Unity3D, offers a virtual 3D universe in which learners embody a student avatar in the first person to increase immersion in the game. The main scenario focuses on the pressure generated by exams, explored through various situations.
Several multiple-choice questions are displayed to the student in order to study their emotional regulation strategies (emotional suppression or cognitive reappraisal). The course of events varies depending on the user’s previous choices. Theoretical information based on emotional regulation is also provided. More details are available in the IEEE CoG 2023 publication.
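To illustrate the kind of branching and strategy tallying described above (a hypothetical structure, not the actual TeachMod RE content), the sketch below models each multiple-choice question as a scene whose answers are tagged with a regulation strategy and point to the next scene.

```python
from dataclasses import dataclass, field

# Hypothetical data model for a branching exam-stress scenario: each choice is
# tagged with the regulation strategy it reflects and leads to a next scene.
@dataclass
class Choice:
    text: str
    strategy: str      # "cognitive_reappraisal" or "emotional_suppression"
    next_scene: str

@dataclass
class Scene:
    prompt: str
    choices: list = field(default_factory=list)

SCENES = {
    "exam_start": Scene(
        "The exam starts and your mind goes blank.",
        [Choice("Remind yourself one exam does not define you.", "cognitive_reappraisal", "question_2"),
         Choice("Hide the stress and push through.", "emotional_suppression", "question_2")]),
    "question_2": Scene(
        "You notice other students already writing.",
        [Choice("They may simply be on an easier question.", "cognitive_reappraisal", "end"),
         Choice("Ignore the knot in your stomach.", "emotional_suppression", "end")]),
    "end": Scene("The exam is over.", []),
}

def run(answers):
    """Play a sequence of answer indices and tally the strategies used."""
    tally = {"cognitive_reappraisal": 0, "emotional_suppression": 0}
    scene_id = "exam_start"
    for idx in answers:
        choice = SCENES[scene_id].choices[idx]
        tally[choice.strategy] += 1
        scene_id = choice.next_scene
    return tally

print(run([0, 1]))  # -> {'cognitive_reappraisal': 1, 'emotional_suppression': 1}
```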
The application can be downloaded at:
Keywords: educational game, Unity3D, emotion regulation, scripting, virtual environment, immersion and interaction
Year: 2023 (project completed)
Context
Patients with cognitive impairments often have difficulty recognising emotions and interacting socially. To assess them, healthcare professionals typically use static photographs or actors.
Topic
This project is part of a Franco-Italian collaboration focusing on immersive technologies applied to the study of cognitive disorders.
We use ultra-realistic avatars called Metahuman to verify whether they can constitute a clinically validated database of emotional expressions.
Development is based on Unreal Engine 5.5.
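As a sketch of how such a validation could be scored (the trial data and the agreement threshold are purely illustrative, not the study protocol), the example below tallies how often participants correctly label the emotion displayed by each avatar clip.

```python
from collections import defaultdict

# Hypothetical scoring sketch: per-emotion recognition rate for avatar clips.
# Each trial is (emotion shown by the avatar clip, emotion named by the participant).
trials = [
    ("joy", "joy"), ("joy", "joy"), ("joy", "surprise"),
    ("anger", "anger"), ("anger", "disgust"),
    ("sadness", "sadness"), ("sadness", "sadness"),
]

def recognition_rates(trials):
    shown_count = defaultdict(int)
    correct_count = defaultdict(int)
    for shown, answered in trials:
        shown_count[shown] += 1
        correct_count[shown] += int(shown == answered)
    return {e: correct_count[e] / shown_count[e] for e in shown_count}

for emotion, rate in recognition_rates(trials).items():
    # A clip might be kept in the validated database only above some agreement
    # threshold (the 70% here is purely illustrative).
    print(f"{emotion:8s} {rate:.0%} {'keep' if rate >= 0.7 else 'review'}")
```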
Keywords: emotion recognition, affective computing, Metahuman, Unreal Engine
Partner: CoBTeK, Université Côte d’Azur
Years: 2024/2025 (ongoing project)
Context
Haptic feedback enriches the virtual reality (VR) game experience by engaging multiple senses (visual, auditory, tactile), providing a rich multisensory immersion.
At ISIR, a CoVR arena was developed where users can move around and interact with physical objects without the use of controllers. However, players immersed in 360° tend to become distracted and may wander anywhere in the 3D virtual world.
This presents challenges for game design, especially when space and physical objects are limited, as is the case with CoVR. How can we optimize the use of space and interaction with the few available physical elements of the CoVR arena to deliver immersive, playful, and captivating experiences that fully exploit the sense of touch?
Topic
The main objective of this project is to design and implement a VR game in Unity3D that leverages the unique capabilities of the CoVR platform to provide a multi-sensory experience. A robotic system carrying physical objects (buttons, steering wheel, shelf, etc.) uses a prediction of the user’s trajectory in the arena to let them interact with these objects.
Additionally, users can experience various forces through CoVR’s human-computer interaction features, such as pulling, pushing, blocking, touching, moving, etc.
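As a rough, engine-agnostic illustration of the trajectory-prediction idea (the linear model, the prediction horizon and the station layout are assumptions, not CoVR's actual controller), the sketch below extrapolates the user's recent positions to decide which prop station the robotic system should serve next.

```python
# Hypothetical sketch: linear extrapolation of the user's recent positions to
# pick the prop station they are most likely heading towards.
def predict_position(samples, horizon_s=1.5):
    """samples: list of (t, x, y); returns the extrapolated (x, y) after horizon_s."""
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = max(t1 - t0, 1e-6)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return x1 + vx * horizon_s, y1 + vy * horizon_s

def closest_station(point, stations):
    """Pick the prop station (name -> (x, y)) nearest to the predicted point."""
    px, py = point
    return min(stations, key=lambda s: (stations[s][0] - px) ** 2 + (stations[s][1] - py) ** 2)

stations = {"button_panel": (0.0, 3.0), "steering_wheel": (3.0, 0.0), "shelf": (-3.0, 0.0)}
recent = [(0.0, 0.0, 0.0), (0.5, 0.6, 0.1)]        # the user walks roughly towards +x
target = closest_station(predict_position(recent), stations)
print("move prop towards:", target)                # -> steering_wheel
```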
Given the technical constraints, the first scenario of the game is relatively simple and lasts between 5 and 15 minutes. It is inspired by the concept of “Fort-Boyard”: players explore different planets by traveling aboard a space capsule in search of mysterious artifacts. Each planet visited corresponds to a game level and has its own challenges and game mechanics.
The 3D universes vary from one planet to another in order to offer a diversity of settings and renewed immersion. This adventure is modular and can adapt to several situations (single player, accompanied by friends, etc.).
In addition, the modularity of the game architecture makes it easy to add new experiences and scenarios, allowing the game to be developed gradually.
Keywords: virtual reality, haptic feedback, human-machine interaction, robotics
Partner: ISIR, Sorbonne University
Years: 2023/2025 (completed project)
Real-time technologies derived from video games offer powerful platforms for interactive simulation and immersive training for defence purposes (military doctors, firefighters, soldiers, police officers, etc.).
ISART carries out numerous innovative projects, drawing on its expertise in video game engineering, in collaboration with public institutions such as the Military Academy of Saint-Cyr Coëtquidan, the Military Academy of the National Gendarmerie, the Armed Forces Biomedical Research Institute (IRBA), the Val-de-Grâce School, and the Paris Fire Brigade, among others.
Context
Previous work on a Tetris game has highlighted the effect of music synchronization on player performance. Desynchronizing the tempo in the game appears to help players. Other studies (on other games) report the opposite result: in those cases, music synchronization helps the player. However, those studies synchronized the sound to the player’s intended actions and not to “stressful” gameplay events (falling pieces, enemy appearance, etc.).
Study hypothesis: Does sound synchronization affect players differently depending on the gameplay element used as a reference?
In other words, we would like to determine whether a sound synchronized to the player’s actions has a different effect than a sound synchronized to gameplay events.
Topic
To address this issue, the ISART research team worked with the AI for Behavioral Engineering (AIPIC) team at the Coëtquidan Research Center. The teams developed a PC shooter game using the Unity engine, offering a multi-level 3D military universe. The player takes on the role of a first-person sniper soldier who must target enemies at specific locations to score points.
The music is synchronized either with the player’s actions or with gameplay events.
The Wwise tool was used to design and generate the soundtrack and integrate it into the game.
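To make the two synchronization conditions concrete (a simplified sketch; the study's audio logic was built with Wwise), the example below quantizes a sound trigger to the next beat of a fixed-tempo track, whether the trigger comes from a player action or from a gameplay event; the tempo and timestamps are illustrative.

```python
# Simplified sketch of beat quantization: a sound requested at time t is delayed
# to the next beat of a fixed-tempo track. The same function can be driven by
# player actions (condition A) or by gameplay events (condition B).
BPM = 120.0
BEAT = 60.0 / BPM   # seconds per beat

def next_beat(t: float) -> float:
    """Return the first beat time at or after t."""
    beats_elapsed = int(t // BEAT)
    beat_time = beats_elapsed * BEAT
    return beat_time if abs(beat_time - t) < 1e-9 else beat_time + BEAT

def schedule_cue(trigger_time: float, source: str) -> float:
    """Quantize a cue coming from a player action or a gameplay event."""
    play_at = next_beat(trigger_time)
    print(f"{source:12s} trigger={trigger_time:6.3f}s  plays at {play_at:6.3f}s")
    return play_at

schedule_cue(3.10, "player_shot")   # condition A: synced to the player's action
schedule_cue(3.10, "enemy_spawn")   # condition B: synced to a gameplay event
```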
Partner: Saint-Cyr Special Military School CreC
Years: 2022-2023