The Use of AR and VR to Change the Future of Music

Written by WiNSS Scholars Matt DeCesare & Homer Wang

Music is an experience. Consumers and makers of music now demand increasingly immersive and interactive experiences, which has prompted an urgent call for innovative designs, models, and technologies. These innovations can take many forms, including online programs, offline services, and physical products, but in this article we focus only on digital features that existing or future mobile applications could consider implementing with technologies like Augmented Reality (AR) and Virtual Reality (VR).

Some of the trends in music streaming and related technologies are being driven by the demands of mediated-and-live music. While live music is bound to its situational, real-time presentation context and its engaging elements, mediated music refers more commonly to the channels and means of distribution through which sounds are transmitted over various media, generally not in real time and often with significant production processing. The newer genre of mediated-and-live music aspires to combine the advantages of both, reaching a broader and younger audience while maintaining an immersive and interactive experience. This trend lends itself well to new technological integration, such as AR and VR, which can enhance the experience of music by targeting consumers who crave the “best of both worlds” (live and mediated). Social media is often used alongside music streaming: mediated streaming of festival music increases around the festival period, live festivals direct the flow of audiences to social media and streaming services, and post-hoc evaluation generates ongoing revenue streams for recorded music (Danielsen & Kjus, 2019).

Streaming technology has appeared in multiple forms and has expanded over the past decade, helping to increase user interaction and immersion with music. The best-known streaming services in the United States, Spotify and Apple Music, primarily provide subscribers with a wide range of streamable music. Both services also offer music video access and hold exclusive deals with certain artists that give users access to additional content. Spotify provides more social features as well: it allows users to create shared playlists with friends and shows a stream of activity from a user’s friends, such as what a person has listened to recently. Other apps, such as NetEase, China’s premier streaming platform, and Gaana, India’s most popular streaming platform, are also helping to push streaming technology forward.

While most publicity in music technology goes toward promoting services like Apple Music and Spotify, other lesser-known services are pushing the industry into VR and AR. Some of these companies have used virtual reality to integrate music with gaming. Intone (https://www.wearvr.com/apps/intone) lets users control their surroundings by making noise while wearing a virtual reality headset: the user looks at different areas on the screen, and those areas are manipulated based on the type of noise the user makes. Rock Band, a popular game that allows players to perform well-known music on instrument-like toys, has also released a version compatible with Oculus’ VR devices. Audioshield is another game in a similar vein, in which players hit incoming notes with their hands to play a song of their choice.

In addition to the above examples, other companies are using VR and AR technology to support music promotion, discovery, and creation. The best example of this occurs on YouTube, where popular artists have begun to post music videos with 360-degree, scannable environments. Pop artist The Weeknd’s “The Hills remix feat. Eminem (A Virtual Reality Experience)” is one of the best-known examples; the video currently has over 2.5 million views. Other popular VR videos include Marshmello x Crankdat – “Falling To Pieces,” with 1.9 million views, and Gorillaz – “Saturnz Barz (Spirit House),” with 15 million views. Clearly, VR technology has given artists a unique way to share their content with fans. Another company pushing the envelope is VRTIFY, which focuses on connecting artists, venues, and music labels with their audiences using holographic, VR, and AR technologies. Current partners of VRTIFY include Lollapalooza, an American music festival with a maximum attendance of 400,000, and heavy metal band Metallica.

Virtual Reality and the Brain

Dr. Giuseppe Riva has explored the capabilities of VR technology and how it might provide a stimulating experience via the Predictive Coding hypothesis of the brain. The Predictive Coding model suggests that the brain maintains an internal model of the body and the surrounding environment in order to reduce prediction error and real-world surprises. Building this model involves reactivating multimodal neural networks that have produced the expected effect before; thus, simulating a particular experience can involve reactivating groups of neurons whose summation recreates that experience. VR technology operates in a way that closely parallels this hypothesis: it creates an explorable simulated world and maintains a model of the player’s body and the space around the player, exactly as Predictive Coding proposes the brain does. Since, according to Predictive Coding, human experience is effectively a summation of these simulated experiences, there is reason to believe that VR can alter bodily experience through virtual environments designed to evoke particular sensations (Riva et al., 2019).
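To make the prediction-error idea concrete, here is a minimal illustrative sketch, not a model from Riva et al. (2019), in which all values, names, and the learning rate are hypothetical. It shows how an internal prediction can be nudged toward incoming sensory input so that the “surprise” shrinks over time:

```python
# Minimal, illustrative sketch of the prediction-error idea behind Predictive Coding.
# All values and names are hypothetical; this is not a model from Riva et al. (2019).

def update_internal_model(prediction: float, sensory_input: float,
                          learning_rate: float = 0.3) -> float:
    """Nudge the internal model's prediction toward what was actually sensed."""
    prediction_error = sensory_input - prediction  # the "surprise"
    return prediction + learning_rate * prediction_error

# Example: the internal model expects a sound at loudness 0.2, but hears 1.0.
prediction = 0.2
for step in range(5):
    prediction = update_internal_model(prediction, sensory_input=1.0)
    print(f"step {step + 1}: prediction = {prediction:.2f}")  # error shrinks each step
```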

Virtual Reality Designs for Music

The potential for VR to completely change the way music is both consumed and created is enormous. By simulating intense concert environments or placing users in a recording studio, VR can provide a fully immersive music experience from someone’s living room.

The first design we propose is a complementary concert service. If, according to Predictive Coding, human experience is an internal simulation of the body and its surroundings, a correctly applied VR experience of a concert might give someone the exact feeling of being front row at their favorite artist’s concert from anywhere in the world. We propose that 360-degree sensors and cameras be placed in the front row of live concerts, allowing consumers to live-stream the concert with a VR headset and experience the action in real time. This, of course, could only happen with sign-off from the artist whose concert is being streamed. Why would any artist agree to take fans out of their crowd and back to their living rooms? The experience would not be free; it would be available only to ticket holders, just like the live concert itself. Take Billy Joel, whose concert at the Hard Rock Hotel and Casino in Hollywood, FL on January 10th, 2020, sold front-row tickets for $811. Rather than selling only a small quantity of these front-row tickets, additional front-row VR tickets could be offered at a cheaper price, say $75. This would generate revenue for the Hard Rock Hotel and Billy Joel while allowing fans who may not have the time or money to attend in person to still get an authentic experience. To make the experience more authentic and social, everyone who purchases one of these VR tickets could be placed in a live virtual chat room with other VR ticket holders as they watch the concert. Now that social distancing orders are in effect and large-audience events are on hold (at least for now), this could be the future of immersive music experiences for even more reasons.
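As a rough back-of-the-envelope sketch of the economics, the $811 in-person price and $75 VR price come from the example above, while the number of VR tickets sold is purely a hypothetical assumption:

```python
# Back-of-the-envelope estimate of added revenue from virtual front-row tickets.
# The $811 in-person price and $75 VR price come from the example above;
# the VR ticket count is a hypothetical assumption, since virtual "seats"
# are not limited by the physical venue.

in_person_front_row_price = 811        # USD, Billy Joel at the Hard Rock (example above)
vr_front_row_price = 75                # USD, proposed VR ticket price
hypothetical_vr_tickets_sold = 2_000   # assumption only

added_revenue = vr_front_row_price * hypothetical_vr_tickets_sold
print(f"Additional revenue from VR tickets: ${added_revenue:,}")  # $150,000
print(f"Equivalent in-person front-row seats: "
      f"{added_revenue // in_person_front_row_price}")            # about 184 seats' worth
```

Under these assumed numbers, the virtual tier adds revenue comparable to roughly 184 extra front-row seats without displacing a single in-person ticket.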

The ideas above could be combined with AR technology for an even more immersive experience. In addition to attending the concert in VR, AR capabilities could provide animated effects such as fireworks going off on stage, changes in weather conditions, or even changes to the artist on stage. Take Travis Scott concerts as an example: Travis Scott is known for building amusement-park rides on stage, such as a roller coaster that stretches over the crowd. VR ticket holders at a Travis Scott concert might not only see the roller coaster but experience riding it, along with an entirely AR-generated amusement park in full swing surrounding the concert.

Lastly, VR technology could be used for music creation. Rather than having to purchase the expensive instruments, speakers, and other equipment needed to record music, a creator could use a VR program that simulates this setup through a virtual recording studio. Similar to the VR games mentioned above, such as Rock Band, where players are given toy instruments, a VR recording studio could give someone trying to make music the full capabilities of a real one. This type of technology could completely change the way music is made.

Augmented Reality Designs for Music

So how could AR affect our musical experience? Professor Anne Danielsen at the University of Oslo and her colleague Inger Helseth have expressed concern about the diminishing sense of liveness onstage as mediated technologies have become quintessential for presenting music that stimulates both visual and auditory perception (Danielsen & Helseth, 2016). Any discrepancy between these two sensory dimensions can be mitigated by attending to the genre and context of the music being performed; otherwise, the visual effects have to direct the audience’s attention back to the auditory ones. In simpler terms, something like AR could not only enhance the experience and fun of the music being presented but also mobilize additional senses, such as touch.

Apps that associate themselves closely with live music events, such as mobile ticketing agencies and info-display apps created for specific theaters and events, could add real-time, venue- and event-related AR. This feature would allow people who have the app to scan their surroundings and make certain mascots and special effects appear on their mobile device before, during, and after the show. This could direct the audience’s focus to the event or the venue related to the music being presented, and it could also draw the audience’s interest toward useful information. For instance, when the activated device is pointed at the stage before the show, a Star Wars-style opening-crawl introduction of the show could be presented to the audience. An additional feature that could enhance the user’s music experience through AR would be to let users record videos and pictures through an AR lens in the app, with special effects appearing alongside the actual performance (e.g., snowflakes floating in the air at a Christmas concert).
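One way such an app might decide which overlay to render is a simple lookup keyed on the event phase and the anchor the camera recognizes. The sketch below is hypothetical; the phase names, anchors, and effect names are assumptions, not features of any existing ticketing app:

```python
# Hypothetical sketch: choose an AR overlay from the event phase and the object
# the camera is pointed at. All phase, anchor, and effect names are made up.

AR_OVERLAYS = {
    ("pre_show", "stage"): "scrolling_show_introduction",   # opening-crawl style intro
    ("during_show", "stage"): "venue_mascot_animation",
    ("during_show", "crowd"): "snowflake_particle_effect",  # e.g. a Christmas concert
    ("post_show", "exit_sign"): "thank_you_and_merch_info",
}

def pick_overlay(event_phase: str, scanned_anchor: str) -> str:
    """Return the overlay to render, or a default when nothing matches."""
    return AR_OVERLAYS.get((event_phase, scanned_anchor), "no_overlay")

print(pick_overlay("pre_show", "stage"))     # scrolling_show_introduction
print(pick_overlay("during_show", "crowd"))  # snowflake_particle_effect
```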

Second, for music creators, whether amateur or professional, apps could integrate AR as an ideation tool. For instance, based on the color schemes and patterns of the creator’s surroundings, the app could generate a tailored range of sounds using computer vision algorithms. The AR display on the user’s mobile device could show the recommended notes; if the user touches them, those notes would play, allowing users to immerse themselves in a world of origination and imagination. Similarly, other traits of the real production setting, including the weather and the time of day, could be taken into account by the app when suggesting notes. As creativity boosts productivity, apps capable of adding an AR interface should aim to increase musical creativity by allowing an open flow of ideas, interactions, and possibilities.
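A minimal sketch of the color-to-notes idea might look like the following. The hue thresholds and the mapping from “warm” or “cool” colors to particular scales are arbitrary assumptions for illustration, not a real product feature:

```python
# Hypothetical sketch: map the dominant color of the creator's surroundings to a
# suggested set of notes. Hue ranges and scale choices are arbitrary assumptions.
import colorsys

SCALES = {
    "warm": ["C4", "D4", "E4", "G4", "A4"],     # major pentatonic for reds/oranges
    "cool": ["A3", "C4", "D4", "E4", "G4"],     # minor pentatonic for blues/greens
    "neutral": ["C4", "D4", "F4", "G4", "Bb4"],
}

def suggest_notes(r: int, g: int, b: int) -> list[str]:
    """Suggest notes from the dominant color of a camera frame (0-255 RGB)."""
    hue, _lightness, saturation = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    if saturation < 0.15:                        # washed-out scene -> neutral scale
        return SCALES["neutral"]
    return SCALES["warm"] if hue < 0.17 or hue > 0.83 else SCALES["cool"]

print(suggest_notes(220, 90, 40))   # warm sunset tones -> major pentatonic
print(suggest_notes(60, 120, 200))  # cool blue room    -> minor pentatonic
```

In a full app, the same lookup could also be conditioned on weather or time of day, as described above.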

Third, music entertainment apps that target end consumers could employ AR alongside object-identification technology to increase audience engagement. Take Spotify as an example of a service that could add such a feature: if you point your device’s camera at an object, say a window, the app could layer the sound of glass shattering over the song being played, and the AR display would animate that additional sound effect. This kind of natural-sound incorporation could apply to a variety of household or workplace objects, animals, symbols, or even people.
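At its core, that object-to-sound pairing could be a small lookup fed by any off-the-shelf object-recognition model. The labels, file names, and placeholder detection function below are hypothetical; this is not an existing Spotify feature:

```python
# Hypothetical sketch: layer an extra sound effect over the current track based on
# what an object-recognition model says the camera sees. Labels and file names are
# made up; detect_object stands in for any off-the-shelf computer-vision model.

OBJECT_SOUNDS = {
    "window": "glass_shatter.wav",
    "dog": "dog_bark.wav",
    "rain": "rain_ambience.wav",
    "coffee_cup": "ceramic_clink.wav",
}

def detect_object(camera_frame) -> str:
    """Placeholder for a real object-recognition model."""
    return "window"

def sound_effect_for(camera_frame) -> str | None:
    """Return the sound effect to layer over the current song, if any."""
    label = detect_object(camera_frame)
    return OBJECT_SOUNDS.get(label)

print(sound_effect_for(camera_frame=None))  # glass_shatter.wav
```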

A Conclusion

We strongly believe that VR and AR technology has the potential to entirely alter the way music is streamed, discovered, and created. Existing services, festivals, and venues, such as the ticketing app StubHub, streaming services like YouTube, Spotify, and Apple Music, and AR-capable communication apps like Snapchat, could serve as partners and launching pads from which our proposed ideas could be promoted around the world.

References

Danielsen, A. & Helseth, I. (2016). Mediated immediacy: The relationship between auditory and visual dimensions of live performance in contemporary technology-based popular music. Rock Music Studies, 3(1): 24-40. doi:10.1080/19401159.2015.1126986.

Danielsen, A., & Kjus, Y. (2019). The mediated festival: Live music as trigger of streaming and social media engagement. Convergence: The International Journal of Research into New Media Technologies, 25(4): 714-734. doi:10.1177/1354856517721808.

Riva, G., et al. (2019). Neuroscience of virtual reality: From virtual exposure to embodied medicine. Cyberpsychology, Behavior, and Social Networking, 22(1): 82-96. doi:10.1089/cyber.2017.29099.gri.


Matt DeCesare (W’21) is concentrating in Business Analytics and Behavioral Economics and minoring in Data Science. He is from Marlton, NJ.


Homer Wang (W’22) is concentrating in Finance and Marketing & Operations Management. He is also pursuing a minor in Cognitive Science and a master’s in Computer and Information Technology. He is from China and Canada.