New VR music platform enables real-time virtual performances
The technology lets musicians play together in sync, without lag, using lifelike avatars, a smartphone, and a VR headset.

A new platform lets musicians rehearse and perform together in virtual reality, with real-time syncing and expressive avatars. (CREDIT: ARME Project)
Musicians across the world are now a step closer to feeling the real-time thrill of performing together—without being in the same room. A new technology is changing how music is made, practiced, and taught by turning video recordings into lifelike virtual performances.
This breakthrough blends the rhythm of live music with the power of virtual reality, delivering a vivid and personal music experience that reaches beyond physical boundaries.
A New Way to Play Music Together
Joint Active Music Sessions, known as JAMS, is a virtual music platform designed to make remote performances feel real. Developed by a team of researchers at the University of Birmingham, the tool allows musicians to rehearse, perform, and teach in virtual spaces using avatars. These digital characters mimic real human movement and expressiveness, letting artists interact just as they would in a live setting.
To join a session, a musician needs only a smartphone to record themselves and a virtual reality headset. The software then transforms the video into an avatar that plays in perfect sync with another user. Musicians can see each other’s actions, such as the tilt of a violinist’s bow or a crucial hand gesture, at exactly the right moment.
This kind of precise coordination is crucial. In live music, the slightest delay between players can break rhythm and focus. That delay, called latency, can cause problems at as little as 10 milliseconds. Because JAMS generates its avatars from recordings and synchronizes them in software, it sidesteps that issue entirely.
“Latency is the delay between a sound production and when it reaches the listener,” says Dr. Max Di Luca, one of the platform’s creators at the University of Birmingham. “Performers can start to feel the effects of latency as low as 10 milliseconds, throwing them off-beat, breaking their concentration, or distracting them from the technical aspects of playing.”
With JAMS, every beat hits at the right time, every glance and motion lands naturally, and every session feels like an in-person gathering. The experience is smooth, without interruption or lag.
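To make that 10-millisecond figure concrete, here is a minimal, hypothetical Python sketch (not code from JAMS) that compares a live player’s note onsets against an avatar’s and flags any gap wider than the threshold Dr. Di Luca describes. All names and timing values are invented for illustration.

```python
# Illustrative sketch only: checks whether timing offsets between two
# performers' note onsets stay within the ~10 ms window the article cites
# as the point where latency becomes perceptible. Values are hypothetical.

LATENCY_THRESHOLD_S = 0.010  # ~10 ms perceptibility threshold quoted in the article

def max_offset(live_onsets, avatar_onsets):
    """Largest absolute timing gap between matched note onsets, in seconds."""
    return max(abs(a - b) for a, b in zip(live_onsets, avatar_onsets))

# Hypothetical onset times (seconds) for eight matched notes.
live   = [0.000, 0.500, 1.000, 1.500, 2.000, 2.500, 3.000, 3.500]
avatar = [0.002, 0.503, 0.998, 1.504, 2.001, 2.497, 3.003, 3.496]

offset = max_offset(live, avatar)
print(f"Worst-case offset: {offset * 1000:.1f} ms")
print("In sync" if offset < LATENCY_THRESHOLD_S else "Perceptible lag")
```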
Capturing the Feeling of Playing Live
JAMS goes far beyond simple video sharing. It uses avatars that respond in real time and give performers the visual cues that matter. This includes small but important elements, like eye contact or shared body language, that help musicians stay in sync and communicate while playing. That feeling of shared presence is central to music.
Unlike a traditional video call, the VR headset delivers a highly immersive setting, placing musicians in a shared virtual space where their avatars interact as if they were on stage or in a rehearsal room. Even positioning faces at eye level strengthens the connection between performers.
This visual accuracy doesn’t just add to the realism—it also boosts learning. Watching a skilled musician’s movements closely can help less experienced players improve faster. “You can adapt the avatar that other people play with, or learn to play better through practice with a maestro,” says Dr. Di Luca.
Building a Social Network for Musicians
The vision behind JAMS includes more than just virtual practice sessions. It aims to create a digital space where musicians can connect, perform, teach, and grow as a community. It could become something like a music-focused version of Spotify or Myspace, where people meet to make music, share their work, and even perform for larger virtual audiences.
The platform is made with musicians in mind—whether they are seasoned professionals or just starting out. This sense of inclusiveness is one of JAMS' strengths. It gives beginners a space to improve while also offering advanced players tools for professional performance.
The avatars aren’t just for show—they are built using a special algorithm created during the Augmented Reality Music Ensemble (ARME) project. This project united experts from several areas, including psychology, computer science, engineering, music, sport science, and math.
Together, they built a model that could capture the complex timing and movement of real musicians. The result is a responsive avatar that can match the physical actions of its human counterpart with surprising accuracy.
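As a rough illustration of the kind of adaptive timing model that ensemble-synchronization research often uses, the hypothetical Python sketch below has an avatar nudge each upcoming beat toward its partner’s last onset, a simple linear phase-correction rule. It is not the ARME algorithm, and the correction gain and tempo values are made up.

```python
# Toy phase-correction follower: the avatar schedules its next beat by
# correcting part of the timing error against its partner's last beat.
# NOT the ARME/JAMS model; gain and tempo are hypothetical.

ALPHA = 0.5    # fraction of the timing error corrected each beat (hypothetical)
PERIOD = 0.5   # nominal beat period in seconds, i.e. 120 BPM (hypothetical)

partner = [0.00, 0.49, 0.97, 1.45, 1.93]   # partner's onsets, rushing slightly
avatar = [0.00]                            # avatar starts on the first beat

for n in range(len(partner) - 1):
    asynchrony = avatar[n] - partner[n]                   # positive = avatar late
    avatar.append(avatar[n] + PERIOD - ALPHA * asynchrony)  # corrected next onset

for p, v in zip(partner, avatar):
    print(f"partner {p:5.2f}s  avatar {v:5.2f}s  gap {(v - p) * 1000:6.1f} ms")
```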
From Practice Room to Big Opportunities
JAMS isn’t only a practice tool; it opens doors to many other uses. It can be adapted for lip-syncing and dubbing in TV and film. It also gathers performance data that can be used to create digital “twins” of musicians. These digital versions can be licensed for games, animations, or virtual concerts, giving artists more ways to share and profit from their work.
This could also change how music rights are handled. Digital avatars performing real pieces might allow music to be shared across platforms in new ways, offering publishers more control and more options for distribution.
There’s also potential for educational growth. Music teachers can use JAMS to create practice partners for their students, helping them improve timing, technique, and musical expression. Students can interact with avatars modeled after famous performers, making learning more exciting and relatable.
A Step Toward the Future of Music
The strength of JAMS lies in its ability to recreate the emotional and technical experience of live music. By removing delay, capturing body language, and offering an interactive virtual world, it helps musicians feel truly connected—no matter how far apart they are.
This platform is not just a tech project. It’s a movement toward a future where people make and share music in ways never before possible. Whether it’s through remote jam sessions, virtual teaching, or large-scale digital concerts, JAMS is reshaping the music world.
As Dr. Di Luca explains, “We’re aiming to bring the magic of playing music in person to the virtual world.”
Note: The article above was provided by The Brighter Side of News.

Mac Oliveau
Science & Technology Writer | AI and Robotics Reporter
Mac Oliveau is a Los Angeles–based science and technology journalist for The Brighter Side of News, an online publication focused on uplifting, transformative stories from around the globe. Passionate about spotlighting groundbreaking discoveries and innovations, Mac covers a broad spectrum of topics—from medical breakthroughs and artificial intelligence to green tech and archeology. With a talent for making complex science clear and compelling, they connect readers to the advancements shaping a brighter, more hopeful future.