EPSRC-funded PhD on XR Musical Interactions

Posted on 16 March 2024 by Hyunkook Lee

Enhancing Immersive Experience in Extended Reality (XR) Musical Interactions

We are looking for strong candidates to apply for a fully-funded PhD project on “Enhancing Immersive Experience in Extended Reality (XR) Musical Interactions”, which will be supervised by Prof Hyunkook Lee and Dr Duke Gledhill at the University of Huddersfield. This is part of the EPSRC doctoral training programme.

This call is open to UK applicants only.
Applicants should be of outstanding quality and exceptionally motivated. The studentships are funded for 3 years (a tax-free stipend starting at £19,237 for 2024/25), subject to satisfactory annual performance and progression review.

Please note that there are more projects than funded studentships available, so this is a competitive application process which will include an interview. Shortlisted candidates will be contacted for an interview in person or via Teams. After the interview, the most outstanding applicants will be offered a studentship.

Application Details

  • Complete the Expression of Interest Form 2024.
  • Provide copies of transcripts and certificates of all relevant academic and/or professional qualifications.
  • Provide references from two individuals.

Queries about the application process are welcomed. These, along with completed forms and all relevant documents, should be submitted via email to pgrscholarships@hud.ac.uk by the closing date, which is 12 noon on Friday 12th April 2024.

If you are interested in applying and have any questions, please contact Prof Hyunkook Lee at h.lee@hud.ac.uk.


Project Introduction

Extended Reality (XR) systems enable users to interact with virtual beings and/or objects superimposed onto the real and/or virtual worlds. This project aims to (i) create an innovative XR system utilising advanced 3D virtual acoustic and visual simulation technologies, allowing users to seamlessly sing or play a musical instrument alongside real/virtual musicians integrated into the XR world, and (ii) explore the impact of acoustic and visual congruencies within the system on the user's perceived self and social presence, as well as various psychological and biomedical attributes.

Project Details

Extended Reality (XR) is a transformative technology that encompasses Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), offering a spectrum of immersive experiences that blend the real world with digitally simulated virtual worlds. Spatial computing technologies for XR are advancing rapidly, and XR applications find utility in diverse industries such as gaming, music, education, and healthcare. One major advantage of XR technologies is their ability to enhance social presence by seamlessly merging virtual and physical worlds, fostering shared experiences, and facilitating more immersive and interactive communication between individuals.

This PhD project centres on the theme of interactive musical performance in XR environments. Achieving a truly realistic and immersive experience necessitates plausible simulation not only of visual elements in the virtual world but also of acoustic features from either the virtual (in VR/MR) or real world (in AR). Despite progress in VR, the possibilities presented by AR and MR systems for musical interaction applications remain largely unexplored. Furthermore, no research to date has examined the relative importance of accuracy in visual and acoustic simulations, and of their congruency, in enhancing the sense of social presence, well-being, and other psychological attributes, highlighting the need for in-depth research in this area.

The project is multidisciplinary, involving audio and visual computing, sound engineering, and psychophysical and physiological measurements. The aims of this project are as follows:

(i) To develop an XR system that enables users to collaboratively create music with virtual counterparts, whether they be other users or virtual musicians seamlessly integrated into the real or virtual world. 

(ii) To explore the threshold of accuracy required in visual and acoustic simulations to deliver a highly immersive experience in interactive XR musical performances. 

(iii) To examine and compare the immersive experiences offered by VR, AR, and MR, evaluating their relative effectiveness in providing users with a heightened sense of engagement and presence. 

This project has the potential to create significant societal impact. The ability to sing or play music together with virtual counterparts will transform musical collaboration and education, while the sense of social presence such collaborations provide also has the potential to benefit mental health, for example by reducing stress or alleviating loneliness.

Entry Requirements

Qualification: a taught MSc degree (Distinction), an MSc by Research degree, or a BSc degree with First-class honours in a relevant subject area, including computer science, game engineering, visual computing, audio engineering, and electrical/electronic engineering.

Skills: programming for XR content creation (C++), statistical analysis and signal processing tools (e.g. R, Matlab, Python), a games engine (preferably Unreal Engine), and a digital audio workstation.

Experience: The ideal candidate will possess prior experience in programming game content using Unreal Engine and/or in conducting or assisting with research projects in a related field.

Knowledge: The candidate should have a solid grounding in game programming, 3D visual computing, and audio/sound engineering.
