The quest for efficient text input without a traditional keyboard remains a significant hurdle in the XR headset industry, a bottleneck in unlocking various productivity possibilities. Researchers have now compiled an extensive catalog of existing text entry methods, highlighting their advantages and limitations. By offering this catalog for free, they are setting the stage for innovators to craft more advanced techniques.
This effort is spearheaded by Massimiliano Di Luca, who heads the VR Lab at the University of Birmingham in the UK. A seasoned figure in both psychology and computer science, Di Luca previously played a pivotal role at Meta, particularly in the realms of hand input and haptics for VR. His recent partnership with industry collaborators earned recognition at the ACM SIGCHI 2025 awards for contributions to the development of Android XR’s interaction framework, establishing key input methods and interaction guidelines for XR systems.
As immersive experiences continue to evolve, effective text input remains a critical obstacle to seamless VR and AR interaction. Whether it’s shooting off emails from virtual offices or logging into the metaverse for social interactions, efficient text input is crucial for the overall functionality of extended reality (XR) applications.
To tackle this head-on, my team at the University of Birmingham, in collaboration with researchers from institutions including the University of Copenhagen, Arizona State University, the Max Planck Institute for Intelligent Systems, Northwestern University, and Google, has developed the XR TEXT Trove. This research initiative meticulously catalogs over 170 text entry techniques designed specifically for XR applications. The Trove functions as a well-organized repository, guiding the selection and analysis of text input methods from both academic and industrial sources.
Within this initiative, techniques are systematically categorized using 32 codes covering 13 interaction attributes like Input Device, Body Part for input, Concurrency, and Haptic Feedback Modality. Additionally, it includes 14 performance metrics such as Words Per Minute (WPM) and Total Error Rate (TER), offering a thorough snapshot of the current landscape of XR text entry techniques.
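To illustrate how such a coded catalog can be queried, here is a minimal sketch in Python. The field names, entries, and values below are hypothetical examples for demonstration, not the Trove's actual schema or data:

```python
from dataclasses import dataclass

@dataclass
class TechniqueEntry:
    """One hypothetical catalog row: interaction attributes plus performance metrics."""
    name: str
    input_device: str      # e.g. "hands", "controller", "gaze"
    body_part: str         # body part used for input
    haptic_feedback: bool  # whether the technique provides haptic feedback
    wpm: float             # Words Per Minute
    ter: float             # Total Error Rate (fraction of erroneous characters)

# Made-up example entries for demonstration only.
catalog = [
    TechniqueEntry("mid-air tap keyboard", "hands", "index fingers", False, 15.0, 0.08),
    TechniqueEntry("surface multi-finger", "hands", "all fingers", True, 40.0, 0.05),
    TechniqueEntry("controller raycast", "controller", "wrist", True, 12.0, 0.04),
]

def fastest_with_haptics(entries):
    """Return the highest-WPM technique among those offering haptic feedback."""
    candidates = [e for e in entries if e.haptic_feedback]
    return max(candidates, key=lambda e: e.wpm)

print(fastest_with_haptics(catalog).name)  # surface multi-finger
```

Cross-referencing attribute codes with performance metrics in this way is what lets the database surface patterns such as the speed findings discussed below.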
Our findings highlight several critical insights. Most significantly, the performance of text input is largely dictated by the number of input elements—whether they be fingers, controllers, or other character-selecting devices. Multi-finger typing is the only input method that approaches the speed and efficiency of traditional touch-typing on keyboards. As illustrated in the graphs, each additional input element enhances user speed by approximately 5 WPM.
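To make that trend concrete, here is a minimal sketch of the roughly linear relationship between input elements and typing speed. The slope reflects the ~5 WPM figure above, but the baseline value is illustrative, not a fitted parameter from the paper:

```python
def predicted_wpm(num_input_elements, base_wpm=8.0, gain_per_element=5.0):
    """Rough linear model: each input element beyond the first
    adds ~5 WPM. base_wpm is a hypothetical one-element baseline."""
    return base_wpm + gain_per_element * (num_input_elements - 1)

# One finger vs. ten fingers under this illustrative model:
print(predicted_wpm(1))   # 8.0
print(predicted_wpm(10))  # 53.0
```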
Haptic feedback, the use of external surfaces, and a focus on fingertip-only visualization emerged as favorable strategies for enhancing typing performance. Typing on physical surfaces, rather than in mid-air, not only boosts comfort but also enhances efficiency by reducing continual muscle strain, which can lead to discomfort known as Gorilla Arm Syndrome.
Interestingly, despite various innovations, nothing has yet fully usurped the traditional keyboard, which still achieves the highest typing speeds, likely because most users have already climbed its steep learning curve. We anticipate that breakthroughs in reducing physical typing distances using AI and machine learning might lead to faster typing speeds in VR scenarios. XR is still searching for its equivalent of the smartphone’s transformative ‘swipe typing.’
In essence, the XR Text Trove’s extensive analysis marks a significant milestone in understanding text input in virtual and augmented reality. By offering a comprehensive, searchable database, we provide a pivotal resource for researchers and developers charting the course for more effective and user-friendly text inputs in our immersive future.
As detailed in our paper, this initiative stands to substantially benefit the XR community: “To advance XR research and design in this domain, we offer the database and its associated tool via the XR TEXT Trove website.” The complete paper is set for presentation at the upcoming ACM CHI conference in Yokohama, Japan.
Moreover, several team members are also behind the creation of the Locomotion Vault, an endeavor that similarly collates VR locomotion techniques, aiming to give researchers and designers a head start in refining and advancing those methods.