
Development of a new immersive virtual reality (VR) headset-based dexterity training for patients with multiple sclerosis: Clinical and technical aspects



Impaired manual dexterity is frequent and disabling in patients with multiple sclerosis (MS), affecting activities of daily living and quality of life.


To develop a new immersive virtual-reality (VR) headset-based dexterity training to improve impaired manual dexterity in persons with MS (pwMS) while being feasible and usable in a home-based setting.


The training intervention was tailored to the specific group of pwMS by implementing a simple and intuitive application with regard to hardware and software. To be efficacious, the training intervention covers the main functions of the hands and arm relevant for use in everyday life.


Taking clinical, feasibility, and usability as well as technical aspects with regard to hardware and software into account, six different training exercises using hand tracking technology were developed on the Meta Quest 2 using Unity.


We report the developmental process of a new immersive VR headset-based dexterity training for pwMS, implementing clinical and technical aspects. Good feasibility, usability, and patient satisfaction were already shown in a feasibility study, qualifying this training intervention for further efficacy trials.


Multiple sclerosis (MS) is a chronic inflammatory disease of the central nervous system and the most common cause of non-traumatic disability in young adults in western countries [1]. Despite increasing therapeutic options to ameliorate the disease course, most patients suffer from persistent neurological deficits over time.

Impaired manual dexterity is a frequent and relevant handicap that independently impairs activities of daily living (ADL) and quality of life (QoL). Additionally, it is associated with loss of work and the need for care [2, 3, 4]. Manual dexterity is preserved longer in the disease course compared to walking capabilities, leading to a different meaningfulness throughout the disease [2, 5]. In early stages of the disease, arm-hand function is important, among other things, for activities such as driving a car or pursuing creative activities. In later stages, with lower extremity functions usually worsening first, manual dexterity is important to maintain daily functions such as dressing and eating, using a wheelchair, or performing intermittent catheterization [5, 6]. Therefore, manual dexterity is an important target for neurorehabilitation to improve ADL and QoL in persons with MS (pwMS) [4].

Manual dexterity is usually trained with physical or occupational therapy in a low-frequency outpatient setting using traditional methods such as “hands-on techniques” [7]. In a European survey in MS, accessibility to physical therapy was poor, with access to outpatient therapy varying from 34% to 41.3% and inpatient therapy varying from 17.4% to 28.5% [8]. Furthermore, accessibility differed significantly amongst regions and the overall frequency of use was low (32.7%) [7, 8].

With recent technological innovations such as the development of applications (apps) for mobile phones, tablets, as well as virtual reality (VR) headsets, new treatment options arise that can be used on their own or as a supplement to classical therapies for various medical conditions [9]. Mobile devices are cost-effective and can be used independently, enabling high-frequency home-based training compared to traditional methods or complex, expensive approaches requiring inpatient treatment [10, 11]. This allows treating patients who otherwise have no access to therapies due to limited mobility, lack of therapy in the region, travel costs, or lack of time, which are the main reasons not to participate in classical therapies [7].

In this regard, we previously performed home-based research projects in MS with conventional training methods and tablet app-based dexterity trainings [12, 13, 14, 15, 16]. However, training with VR headsets offers additional promising therapeutic options, and initial trials were previously performed [17].

The aim of this project was therefore to develop a new immersive VR headset-based dexterity training to improve impaired manual dexterity in pwMS while being feasible and usable in a home-based setting. We previously showed good feasibility, usability, and patient satisfaction of the training intervention in a feasibility study [18]. In the current paper, we describe the developmental process of the training intervention with regard to clinical and technical aspects.

2. Materials and methods

The training intervention was developed by the corresponding author and the start-ups “12 Parsec” (Oberfeld 3, 6037 Root, Switzerland), “Holonautic AG” (Felmis-Allee 11, 6048 Horw, Switzerland) and “Westhive” (Hardturmstrasse 161, 8005 Zürich, Switzerland). The corresponding author is a neurologist (clinician) and defined and specified the clinical aspects and requirements regarding clinical meaningfulness, feasibility, and efficacy of the training intervention. Within this process, key movements of the hand and arm were defined that had to be covered in the training intervention. The technical realization was carried out by Holonautic AG and 12 Parsec in close cooperation with the corresponding author.

Clinical aspects and requirements regarding the development of the training intervention

The major goal of hand functioning is to manipulate objects and use tools. Hand use in humans can be divided broadly into tasks requiring the use of multiple digits simultaneously in a grasp (multidigit grasping) and the use of individual movements in which one digit moves considerably more than the other digits (individual finger movements). The latter is unique to humans compared to animals (including monkeys), leading to an extensive range of grasp possibilities [19].

Multidigit grasping, the most common behavioral use of the hand, entails simultaneous motion of multiple digits. Movements that close the fingers around an object in a coordinated fashion start before contact. During reaching movements directed at objects, the fingers gradually preshape the entire hand to approximate the object contours as the hand approaches [19].

Individual finger movements are performed during multidigit grasping as well, enabling the hand to conform to a specific object shape and permitting some fingers to be lifted off the object while maintaining a stable grasp. However, for fine motor tasks such as tying a knot or manipulating small objects, finger movements are individuated considerably more.

Here, the thumb and the index finger play a major role, having the greatest degree of independence for such tasks, whereas the middle and ring fingers have the lowest. The thumb is a unique feature of humans (and higher primates), related to its position on the hand allowing circumduction. This facilitates opposition of the thumb to the digits, which is required for all useful prehension of the hand.

Seven maneuvers make up most hand functions for object manipulation and tool use in daily life, as summarized in detail by Duncan et al. [20]:

  • 1. The precision pinch;

  • 2. The oppositional pinch (subterminal pinch);

  • 3. The key pinch;

  • 4. The chuck grip;

  • 5. The hook grip;

  • 6. The power grasp;

  • 7. The span grasp.

Wrist movements and positioning, flexion-extension and pronation-supination movements of the elbow, and shoulder movements are mandatory to achieve precise and efficacious hand function as well. They enable the hand to approach objects adequately and tackle them from the right position. In addition, a stable wrist is needed to perform a power grip [21].

In order to develop a training intervention that improves manual dexterity and its related ADL and QoL in pwMS, the most important hand and arm functions illustrated above were integrated into the different training exercises, as described in more detail below.

Feasibility/usability aspects and requirements regarding the development of the training intervention

Manual dexterity is usually preserved longer in the disease course compared to lower extremity functions [2]. PwMS with impaired manual dexterity are therefore likely to be older, have more disability, have progressive disease courses, need walking aids, and perform poorer in static and dynamic balance tests, all factors strongly associated with risk of falls [22]. In addition, the chance of having relevant cognitive deficits is greater in this patient group [23]. These aspects differ from healthy people, for whom, for example, games are programmed, and must be taken into account in terms of good feasibility and usability of the training intervention. This implies easy handling of the device (hardware) and the training intervention itself (software), which should be as simple and intuitive as possible. In addition, instructions and help should be provided, as described in more detail below.

Motion sickness (also called “kinetosis”) classically occurs on land, in the air, or at sea [24]. However, modern simulation systems such as VR can induce motion sickness as well, then called, among other terms, “simulator-” or “cybersickness” [24, 25, 26]. Motion sickness results from an intersensory conflict between the vestibular, visual, and proprioceptive systems under conditions of movement and is typically triggered by low-frequency vertical, lateral, angular, or rotary motion [27]. Clinically, motion sickness presents, among other symptoms, with malaise, blurred vision, non-vertiginous dizziness, drowsiness, nausea, and vomiting [24, 27]. Motion sickness may therefore restrict or prevent the use of VR devices as treatment tools, especially in pwMS with additional bodily or cognitive deficits.

It is therefore important to develop the training intervention such that motion sickness is avoided as far as possible. To prevent postural instability, falls, and injuries, it should ideally be performed in a seated position.

Hardware determination

To find the appropriate hardware for use in the therapeutic field of MS, a comparative method (requirement matrix) was used regarding clinical, feasibility and usability requirements as well as technical possibilities and current technical limitations. To establish the requirement matrix, main criteria and the corresponding subtopics were classified from all parties (medicine, development, technology) as follows:

  • 1. Complexity of operations

    • How complex is the operation of the device?

    • What is required to reach operating status as quickly as possible?

    • How high does the user’s technical understanding need to be to reach operation mode?

    • Does the device run alone (standalone) or are subsystems required?

  • 2. Usability of the device

    • According to which definition is the device operated (controller vs. camera data)?

      • * Input management methods (= movement detection method)

      • * Input management device (= movement data generation)

    • How is the learning curve for operating the device defined?

  • 3. Software modifiability

    • Can own developments be installed on the system?

    • What are the existing development platforms?

    • Which engines can be used to develop the required apps?

    • How is monetization defined for in-house developments of the engine?

  • 4. Hardware modifiability

    • Can the playback system of the hardware be modified?

    • Can other components (e.g. for gesture control) be added?

    • How well can you react to changes of the application (adaptation to a changed area of application)?

Regarding the study population and the home-based nature of the therapeutic intervention, the device should additionally operate as autonomously as possible. Laymen should be able to operate the device, handling of the device should be limited to starting and calling up the application, the device operation should not be cognitively challenging, and the device should be easily portable. Importantly, the device should have advanced finger-hand tracking technology that detects the following movements: pinch grip, finger rotation, twisting motion of the fingers, movements of individual fingers, movements of several fingers, wrist rotation, and small-to-medium range of motion of the arms.

Using the described requirement matrix, the following devices for VR, augmented reality (AR), and mixed reality (MR), available in 2022, were examined:

  • HTC Vive Pro

  • Pimax 5k

  • Meta Quest 2 (former Oculus Quest 2)

  • HoloLens 2

As main differences, the Meta Quest 2 and the HoloLens 2 can be operated autonomously without sub-systems (they include all necessary sensors and computing power). Neither external cables nor a fixed stationary area are necessary, so the systems can be used anywhere. In contrast, the HTC Vive Pro and Pimax 5k need external sensors to determine the position in space, and sub-systems are required to calculate the digital content. The Meta Quest 2 and the HoloLens 2 are additionally designed for immersive VR, AR, or MR, whereas the HTC Vive Pro and Pimax 5k are designed for VR only.

Applying the above-mentioned requirement matrix, the Meta Quest 2 was determined as most suitable device as illustrated in the Supplementary Materials.
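The selection logic of such a requirement matrix can be sketched as a weighted scoring of each candidate device over the main criteria. The concrete weights and ratings below are hypothetical illustrations (the actual values are given in the Supplementary Materials), not the figures used in the study:

```python
# Illustrative requirement-matrix scoring. Criteria weights and the 1-5
# ratings per device are hypothetical examples, NOT the study's actual values.

CRITERIA = {                          # main criterion -> weight (sums to 1.0)
    "complexity_of_operation": 0.3,
    "usability": 0.3,
    "software_modifiability": 0.2,
    "hardware_modifiability": 0.2,
}

DEVICES = {                           # hypothetical ratings, 1 (poor) to 5 (excellent)
    "HTC Vive Pro": {"complexity_of_operation": 2, "usability": 3,
                     "software_modifiability": 4, "hardware_modifiability": 3},
    "Pimax 5k":     {"complexity_of_operation": 2, "usability": 3,
                     "software_modifiability": 3, "hardware_modifiability": 3},
    "Meta Quest 2": {"complexity_of_operation": 5, "usability": 5,
                     "software_modifiability": 4, "hardware_modifiability": 3},
    "HoloLens 2":   {"complexity_of_operation": 4, "usability": 4,
                     "software_modifiability": 4, "hardware_modifiability": 3},
}

def weighted_score(ratings: dict) -> float:
    """Weighted sum of a device's ratings over all main criteria."""
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

# Rank devices from best to worst under the chosen weights.
ranking = sorted(DEVICES, key=lambda d: weighted_score(DEVICES[d]), reverse=True)
```

Under standalone-friendly example weights like these, the Meta Quest 2 comes out on top, mirroring the outcome described in the text.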

Software determination


Several software engines such as Unity, Unreal Engine, CryEngine, or ARKit 3 are available. Due to longstanding experience and familiarity, the development team chose Unity. The limited computing power of VR headsets demands a simple framework that can still achieve good performance. Unity’s hardware requirements are significantly lower than those of, for example, Unreal Engine, which is predominantly designed for high-performance systems such as the gaming industry or animation in cinema productions. Unity is also a framework that is widely used among developers, making it easier to find a team that can implement requirements using Unity. In addition, Unity offers free licensing with revenue-based restrictions.

VR application

Commercially available headsets and frameworks are usually developed for healthy persons with normal mobility of the upper extremities. Furthermore, they are mostly designed for gaming purposes and therefore do not address specific requirements for clinical purposes, for example the recognition of single digit movements. After testing the currently available hand tracking framework of the Meta Quest 2 headset regarding such needs, it became clear that a custom approach would be needed to fulfill all requirements.

Holonautic therefore created a custom framework to allow maximum flexibility, to handle specific use cases and to adapt ideally to the feedback received from the patients.

The current state of VR has multiple frameworks for integrating hand tracking interactions into applications, but most are limited to specific hardware providers. This lack of consistency and vendor neutrality is challenging, as seen with the current lack of a solidified Application Programming Interface (API) for hand tracking data through the OpenXR standard and certain headset manufacturers’ decision not to follow the standard. The majority of these frameworks also assume full user mobility, making it difficult to adapt their parameters for specific clinical applications.

To overcome these limitations, Holonautic has developed a custom framework that only uses finger and palm data to create a vendor-neutral interaction framework that can easily support various headset hardware with minimal changes to the higher-level API calls. These data points were used to visualize the hand and also to validate “grabbing”, “letting go” and other movements. This allows maximum flexibility to adapt the framework to the limited mobility of patients and support a wide range of hand tracking technologies.
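The idea of validating “grabbing” and “letting go” from low-level finger and palm data can be sketched as follows. This is a minimal illustration, not Holonautic's actual implementation (which is in Unity/C#); the class names, thresholds, and hysteresis values are assumptions, and a clinical application would widen the tolerances for reduced finger mobility:

```python
import math
from dataclasses import dataclass

Vec3 = tuple  # (x, y, z) position in metres

def dist(a: Vec3, b: Vec3) -> float:
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

@dataclass
class HandFrame:
    """One tracking frame: fingertip and palm positions only.

    Because just these low-level points are consumed, any headset that can
    deliver them could feed the interaction layer (vendor neutrality).
    """
    thumb_tip: Vec3
    index_tip: Vec3
    palm: Vec3

class PinchDetector:
    """Validates 'grabbing' / 'letting go' with hysteresis.

    Hypothetical thresholds: the pinch closes below `grab_at` and only
    reopens above `release_at`, so tracking jitter near the boundary
    does not cause the held object to flicker between states.
    """
    def __init__(self, grab_at: float = 0.02, release_at: float = 0.04):
        self.grab_at = grab_at          # close below 2 cm thumb-index gap
        self.release_at = release_at    # reopen above 4 cm
        self.grabbing = False

    def update(self, frame: HandFrame) -> bool:
        gap = dist(frame.thumb_tip, frame.index_tip)
        if self.grabbing and gap > self.release_at:
            self.grabbing = False       # "letting go"
        elif not self.grabbing and gap < self.grab_at:
            self.grabbing = True        # "grabbing"
        return self.grabbing
```

Feeding one `HandFrame` per tracking update yields a stable grab state; loosening the two thresholds is the kind of parameter adaptation for limited patient mobility that the text describes.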

Furthermore, it minimized dependency on the existing hand tracking technology of the Meta Quest 2. This will allow supporting other headsets and manufacturers, as the higher-level API abstracts away the lower-level data structures and vendor-specific frameworks, with minimal changes to the application itself.

The incorporation of this function made it possible to record and measure finger/arm/hand movements correctly. The measurement of the movements is based on conventional forms of therapy such as the Nine Hole Peg Test (9HPT). The transfer of the correct hand and arm movement from classic therapy to this new form in virtual space could thus be implemented. In this way, it was possible to precisely measure the intended movement while allowing enough tolerance in the movement for the patient to successfully complete the training session. The subtleties of the different movements were fine-tuned with each release of the software and could be successfully applied at the prototype stage. The solution, which was specially programmed for this purpose, can be used on consumer devices such as the Meta Quest 2 and does not require any porting efforts. We achieved the goal of creating a product that could be used on standard devices. The software could be used without the patient having to learn complicated processes to start the application.

The data is collected via a cloud backend, which collects the data from the Meta Quest 2 and saves it anonymously. The only condition for this is that the device has WLAN or internet access. As a further alternative, the data can also be copied locally from the Meta Quest 2. Currently, we do not yet collect data on the movements themselves (hand/wrist/arm movements). This is, however, possible and interesting, because such information could be used to analyze disability as well as efficacy endpoints, and should be addressed in future developments.

To prevent motion sickness, we reduced the intersensory conflict and synchronized the visual system with the motion. This was supported by avoiding low-frequency movements (especially vertical ones), with an ideal latency in the range of 7–15 ms, and by focusing on the horizon or a distant point. In this regard, we provided an artificial horizon or horizon information with a simple structure to avoid visual and cognitive distraction as well as visual overburdening [24, 25]. So-called “camera dodging” (which simulates dynamic encounters in games) was avoided. Depth perception was not used, and objects in the room were kept real-sized for a naturally appearing perception.


Development of the training intervention

Taking the above-described clinical, feasibility, and usability as well as technical aspects with regard to hardware and software into account, six different training exercises were developed on the Meta Quest 2 using Unity.

Figure 1 gives an overview of the training exercises. The training intervention was performed in a sitting position to prevent falls and injury.

The detailed performance of the different exercises and the implementation of clinical aspects with regard to key movements is shown in Table 1. The implementation of feasibility and usability aspects describing the most important measures to improve feasibility of each training exercise is shown in Table 2.

Table 1

Description of exercises and medical implementations

Exercise | See Fig. 1 | Key movements | Description | Frequency
Catching apples | Picture 3 | Target-oriented hand/arm movements; span grasp | Apples flying towards the proband have to be caught. | 30x/session, 1 session/hand
Finger circling (index finger) | Picture 4 | Training for all key hand functions | A virtual circle is attached to the tip of the index finger, which has to be traced with the movement of the index finger. | 10x/session, 2 sessions/hand
Bending/stretching fingers | Picture 5 | Training for all key hand functions | Each finger has to be bent to touch the palm and then fully extended. | 3x/finger/session, 2 sessions/hand
Pinch grip | Picture 6 | Precision pinch; opposition pinch; target-oriented finger/hand/arm movements | Resting geometric shapes in front of the proband have to be grasped via precision pinch/opposition pinch and dropped into a basket in the lower part of the field of vision by opening the grip. | 10x/session, 2 sessions/hand
Tracing shapes | Picture 7 | Target-oriented finger/hand/arm movements | Different shapes (triangle, square, lying eight, etc.) have to be colored about 80% with the extended index finger. | 5x/session, 2 sessions/hand
Wrist rotation | Picture 8 | Power grasp; span grasp; target-oriented finger/hand/arm movements; pronation-supination movement of the elbow | A hollow cylinder with one bottom has to be grasped with a full hand closure. The cylinder has to be turned 180° to see the color of the bottom inside the cylinder. The cylinder has to be placed in the basket with the matching color by opening the grip. | 10x/session, 2 sessions/hand

Figure 1.

Virtual reality headset-based immersive training intervention comprising six training exercises. (1) Meta quest 2. (2) Menu/Program overview. Training interventions (blue buttons) can be selected, and an instructional text and video is shown. Training is started pressing the green (Start) button. (3–8) Training interventions. (3) Catching apples. (4) Finger circling (Index finger). (5) Bending/stretching fingers. (6) Pinch grip. (7) Tracing shapes. (8) Wrist rotation.


Table 2

Description of training exercises regarding key feasibility and usability measures

Exercise | See Fig. 1 | Description
Catching apples | Picture 3 | Catching an apple is signaled via an acoustic signal.
Finger circling (index finger) | Picture 4 | A virtual circle is attached to the tip of the index finger, which has to be traced with the movement of the index finger. The circle is colored throughout the movement, and an acoustic signal sounds when the circle is closed.
Bending/stretching fingers | Picture 5 | The finger to be used is colored. Green arrows at the fingertips point in the direction of the current movement (bending or stretching). The number of movements is displayed on the palm, counting down from 3 to 0. After three movements, the next finger lights up. In addition, an acoustic signal sounds if the movement was performed properly.
Pinch grip | Picture 6 | Objects (ring, cone, pyramid) appear one at a time in one of three white circles. An acoustic signal sounds if the object is put properly into the basket.
Tracing shapes | Picture 7 | The object floats in front of the patient. The index finger to paint with is colored. The direction and the starting point are freely selectable. A bar on the blackboard shows the progress. An acoustic signal sounds when the object is properly colored.
Wrist rotation | Picture 8 | To ensure wrist rotation, the cylinder has to be grasped with the full hand and turned 180° towards the proband to see the color of the bottom inside the cylinder. The cylinder has to be placed in the basket with the matching color by opening the grip. An acoustic signal sounds if the object is put into the right basket.

Hand and finger tracking technology was used exclusively within the training intervention to enable realistic, effective, and precise interaction in a natural fashion as well as increased immersion and presence, i.e. the subjective experience of being in a highly immersive virtual environment [28]. The Meta Quest controllers could be used to set up the Meta Quest 2 and open the training intervention. Alternatively, this can be done with the built-in hand tracking technology as well.

After opening the app, a home screen appears with a menu dashboard including a written introduction and the six training exercises outlined as blue buttons (Fig. 1, Picture 2). The dashboard can be maneuvered 3-dimensionally for easy use. After selecting a training exercise by pushing the respective button, a written explanation as well as a video showing the training exercise is displayed (Fig. 1, Picture 2). After reading the explanation and watching the video, participants can enter the respective training exercise by pushing the green START button (Fig. 1, Picture 2).

In summary, in every training exercise a blackboard is shown in the background on which the training task is described, such as “Catch the apples with the hand”. In addition, the hand to be used is stated as well as how many tasks have been accomplished and still have to be done, e.g. “5/30 apples caught with the right hand” (Fig. 1, Picture 3).

All training exercises were performed with both hands, using the right or left hand per session in an alternating fashion. For example, in “Catching apples”, 30 apples had to be caught with the right hand (1st session), followed by the left hand (2nd session), the right hand (3rd session), and the left hand (4th session).

The change of hands is signaled via an acoustic signal and a fading-in text. In addition, the hand to be currently used is displayed, whereas the other hand is transparent and cannot be used to perform the exercise (Fig. 1, Pictures 3–7).

If a training exercise was performed properly, the blue button on the main menu is crossed out so that patients know which training exercises are completed. A training exercise can be left prematurely by looking down (at the floor); an exit button then appears that can be pushed. The training exercises can be done in random order.

Datapoints on the exercises, such as exercise type, date/time the exercise started/ended, left/right hand, duration, iteration count, exercises accomplished/failed, etc., were collected and stored on the device inside an SQLite database (SQLite file). The date and time used are based on the OS of the headset. The data is stored continuously as the application is used. In the first iteration, the data can only be accessed directly on the headset and can be downloaded through adb (a developer tool for Android). This can be used for measuring adherence to the training intervention as well as outcome parameters.
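Once downloaded via adb, such a database can be analyzed offline. The sketch below shows one way adherence could be computed from the described datapoints; the table and column names are hypothetical, since the actual on-device schema is not published:

```python
import sqlite3

# Hypothetical schema mirroring the datapoints described in the text;
# the actual table/column names on the device may differ.
SCHEMA = """
CREATE TABLE IF NOT EXISTS exercise_log (
    exercise   TEXT,     -- e.g. 'catching_apples'
    started_at TEXT,     -- ISO date/time from the headset OS
    hand       TEXT,     -- 'left' or 'right'
    duration_s REAL,     -- session duration in seconds
    iterations INTEGER,  -- iteration count within the session
    completed  INTEGER   -- 1 = accomplished, 0 = failed/aborted
)
"""

def adherence(db_path: str) -> float:
    """Fraction of logged exercise sessions that were completed."""
    with sqlite3.connect(db_path) as con:
        con.execute(SCHEMA)  # no-op if the table already exists
        total, done = con.execute(
            "SELECT COUNT(*), COALESCE(SUM(completed), 0) FROM exercise_log"
        ).fetchone()
    return done / total if total else 0.0
```

An `adherence` value near 1.0 would indicate that nearly all started sessions were finished; per-hand or per-exercise variants are straightforward extensions of the same query.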


VR is a computer-generated environment with scenes and objects that appear to be real, making users feel they are immersed in their surroundings. Immersive VR devices will probably be highly suitable for use as clinical devices in the future, including virtual rehabilitation [11, 29].

Advantages of VR are the possibility to create realistic, risk-free environments that are not realizable and/or financeable in the real world. These VR environments are easily adaptable to the needs of the user, and the possibility of gamification increases patient motivation [30]. As a further advantage, VR can be performed in the patient’s home and monitored at a distance (telerehabilitation). The home-based setting allows frequent training independently of available hospital- or community-based rehabilitation programs, which makes training interventions available to a larger group of patients [11]. VR devices can potentially be used to measure outcome parameters such as adherence or the efficacy of interventions [31, 32]. This applies to the already collected data regarding exercise type, date/time the exercise started/ended, left/right hand, duration, iteration count, and exercises accomplished/failed. In addition, datapoints on the finger/hand/arm movements could be collected and used to analyze disability as well as efficacy endpoints of manual dexterity. These very interesting and useful possibilities would require an evaluation of the psychometric properties of the techniques.

To provide effective, accurate, safe, and equal service to patients, medical devices should undergo legal metrology for regulation and standardization [33]. In addition, it is possible to monitor and guide patients virtually, for example via avatars, allowing intensive remote training [34]. Artificial intelligence (AI) has great potential in healthcare, including medical devices, and will further accelerate the development of therapeutic devices. However, international standards concerning AI in medical devices are needed [35]. Modern hand and finger tracking technologies enable more effective and precise interaction and increased immersion and presence, enabling training of manual dexterity in a natural fashion without the use of devices such as controllers [28].

Because of the described possibilities and advantages, we developed an immersive VR headset-based dexterity training with the aim of improving manual dexterity in pwMS while being feasible and usable especially in a home-based setting.

To accomplish this, the training program had to be tailored to the specific group of pwMS having impaired manual dexterity and possibly additional deficits such as, amongst others, the inability to walk or stand and cognitive deficits [1, 23]. For this reason, a simple application of hardware and software was a mandatory requirement for the training intervention to be feasible and usable. With regard to hardware, the Meta Quest 2 was chosen as the most suitable device, being a standalone “all in one” immersive VR headset with finger tracking technology. The training app could be easily started from the home screen after turning on the device and connecting it to the Wi-Fi. In order to facilitate the usability of the hardware and software, several different aids in text and video were created. These included written instructions on how to start and use the Meta Quest 2. In addition, the individual training programs were explained and visualized with text and video before they were carried out, as described above (Fig. 1).

Within each training session there were also numerous aids. The training and its progress were explained/shown in the background, and several visual and acoustic aids were installed (Table 2, Fig. 1). For example, the change of hands was signaled via an acoustic signal and a text, the hand to be currently used was displayed, and the completion of tasks was underpinned with acoustic and visual signals (Finger circling; Bending/stretching fingers), as illustrated in Fig. 2. These efforts were successful, as we already performed a home-based feasibility and usability study using this training intervention that showed high feasibility, usability, patient engagement, and patient satisfaction [18].

To achieve effectiveness, the development of the six training programs focused on integrating the main functions of the hands and arm relevant for use in everyday life, ranging from fine finger to full arm movements (Table 1). Within the mentioned feasibility and usability study, efficacy variables were evaluated, including performance-based tests as well as patient-reported outcome measurements. The preliminary results hint at an efficacious training intervention regarding improvements in manual dexterity [18]. However, randomized controlled trials are necessary to finally address the effectiveness of the training intervention. In addition, medical devices should undergo standardized post-market surveillance; however, this process is currently not harmonized [36].


We report the development of a new immersive VR headset-based dexterity training for pwMS with the goal that the training intervention is feasible, usable, and effective in improving manual dexterity. To accomplish this, the training intervention was tailored to the specific group of pwMS by implementing a simple and intuitive application regarding hardware and software. To be efficacious, the training intervention covers the main functions of the hands and arm relevant for use in everyday life.

Good feasibility, usability, and patient satisfaction of the training intervention were already shown in a feasibility study. In addition, preliminary efficacy variables pointed to an efficacious training intervention with regard to improvements in manual dexterity [18]. However, randomized controlled trials are necessary to establish effectiveness.


The study was financed by an unrestricted grant from Roche.

Supplementary data

The supplementary files are available to download from


The authors have no acknowledgments.

Conflict of interest

The authors declare that they have no conflict of interest.



Kamm CP, Uitdehaag BM, Polman CH. Multiple sclerosis: Current knowledge and future outlook. Eur Neurol. (2014); 72: 132-41. doi: 10.1159/000360528.

Johansson S, Ytterberg C, Claesson IM, Lindberg J, Hillert J, Andersson M, et al. High concurrent presence of disability in multiple sclerosis. Associations with perceived health. J Neurol. (2007); 254: 767-773.

Chruzander C, Johansson S, Gottberg K, Einarsson U, Fredrikson S, Holmqvist LW, et al. A 10-year follow-up of a population-based study of people with multiple sclerosis in Stockholm, Sweden: Changes in disability and the value of different factors in predicting disability and mortality. J Neurol Sci. (2013); 332: 121-127.

Yozbatiran N, Baskurt F, Baskurt Z, Ozakbas S, Idiman E. Motor assessment of upper extremity function and its relation with fatigue, cognitive function and quality of life in multiple sclerosis patients. J Neurol Sci. (2006); 246: 117-22.

Giovannoni G, Cutter G, Sormani MP, Belachew S, Hyde R, Koendgen H, et al. Is multiple sclerosis a length-dependent central axonopathy? The case for therapeutic lag and the asynchronous progressive MS hypotheses. Mult Scler Relat Disord. (2017); 12: 70-78.

Thomson A, Giovannoni G, Marta M, Gnanpavan S, Turner B, Baker D, et al. Importance of upper limb function in advanced multiple sclerosis. Mult Scler. (2016); 22: 676. P1277.

Řasová K, Freeman J, Cattaneo D, Jonsdottir J, Baert I, Smedal T, et al. Content and delivery of physical therapy in multiple sclerosis across Europe: A survey. Int J Environ Res Public Health. (2020); 17: 886.

Kobelt G, Thompson A, Berg J, Gannedahl M, Eriksson J; MSCOI Study Group; European Multiple Sclerosis Platform. New insights into the burden and costs of multiple sclerosis in Europe. Mult Scler. (2017); 23: 1123-1136.

Cikajlo I, Pogačnik M. Movement analysis of pick-and-place virtual reality exergaming in patients with Parkinson’s disease. Technol Health Care. (2020); 28(4): 391-402.

Webster A, Poyade M, Rooney S, Paul L. Upper limb rehabilitation interventions using virtual reality for people with multiple sclerosis: A systematic review. Mult Scler Relat Disord. (2021); 47: 102610.

Yeroushalmi S, Maloni H, Costello K, Wallin MT. Telemedicine and multiple sclerosis: A comprehensive literature review. J Telemed Telecare. (2022); 26(7-8): 400-413.

Kamm CP, Mattle HP, Müri RM, Heldner MR, Blatter V, Bartlome S, et al. Home-based training to improve manual dexterity in patients with multiple sclerosis: A randomized controlled trial. Mult Scler. (2015); 21(12): 1546-56.

van Beek JJW, Van Wegen EEH, Bohlhalter S, Vanbellingen T. Exergaming-based dexterity training in persons with Parkinson disease: A pilot feasibility study. J Neurol Phys Ther. (2019); 43(3): 168-174.

van Beek JJW, van Wegen EEH, Rietberg MB, Nyffeler T, Bohlhalter S, Kamm CP, et al. Feasibility of a home-based tablet app for dexterity training in multiple sclerosis: Usability study. JMIR Mhealth Uhealth. (2020); 8(6): e18204.

van Beek JJW, Lehnick D, Pastore-Wapp M, Wapp S, Kamm CP, Nef T, et al. Tablet app-based dexterity training in multiple sclerosis (TAD-MS): A randomised controlled trial. Disabil Rehabil Assist Technol. (2022); 29: 1-11.

Vanbellingen T, Filius SJ, Nyffeler T, van Wegen EEH. Usability of videogame-based dexterity training in the early rehabilitation phase of stroke patients: A pilot study. Front Neurol. (2017); 8: 654.

Bertoni R, Mestanza Mattos FG, Porta M, Arippa F, Cocco E, Pau M, et al. Effects of immersive virtual reality on upper limb function in subjects with multiple sclerosis: A cross-over study. Mult Scler Relat Disord. (2022); 65: 104004.

Kamm CP, Blättler R, Kueng R, Vanbellingen T. Feasibility and usability of a new home-based immersive virtual reality headset-based dexterity training in multiple sclerosis. Mult Scler Relat Disord. (2023); 71: 104525.

Schieber MH, Santello M. Hand function: Peripheral and central constraints on performance. J Appl Physiol. (1985); 96(6): 2293-300.

Duncan SF, Saracevic CE, Kakinoki R. Biomechanics of the hand. Hand Clin. (2013); 29(4): 483-92.

Soubeyrand M, Assabah B, Bégin M, Laemmel E, Dos Santos A, Crézé M. Pronation and supination of the hand: Anatomy and biomechanics. Hand Surg Rehabil. (2017); 36(1): 2-11.

Giannì C, Prosperini L, Jonsdottir J, Cattaneo D. A systematic review of factors associated with accidental falls in people with multiple sclerosis: A meta-analytic approach. Clin Rehabil. (2014); 28(7): 704-16.

Benedict RHB, Amato MP, DeLuca J, Geurts JJG. Cognitive impairment in multiple sclerosis: Clinical management, MRI, and therapeutic avenues. Lancet Neurol. (2020); 19(10): 860-871.

Koch A, Cascorbi I, Westhofen M, Dafotakis M, Klapa S, Kuhtz-Buschbeck JP. The neurophysiology and treatment of motion sickness. Dtsch Arztebl Int. (2018); 115(41): 687-696.

Oh H, Lee G. Feasibility of full immersive virtual reality video game on balance and cybersickness of healthy adolescents. Neurosci Lett. (2021); 760: 136063.

Chander H, Shojaei A, Deb S, Kodithuwakku Arachchige SNK, Hudson C, Knight AC, Carruth DW. Impact of virtual reality-generated construction environments at different heights on postural stability and fall risk. Workplace Health Saf. (2021); 69(1): 32-40.

Leung AK, Hon KL. Motion sickness: An overview. Drugs Context. (2019); 8: 2019-9-4.

Buckingham G. Hand tracking for immersive virtual reality: Opportunities and challenges. Front Virtual Real. (2021); 2.

Garrett B, Taverner T, Gromala D, Tao G, Cordingley E, Sun C. Virtual reality clinical research: Promises and challenges. JMIR Serious Games. (2018); 6(4): e10839.

Doumas I, Everard G, Dehem S, Lejeune T. Serious games for upper limb rehabilitation after stroke: A meta-analysis. J Neuroeng Rehabil. (2021); 18(1): 100.

Craig CM, Stafford J, Egorova A, McCabe C, Matthews M. Can we use the Oculus Quest VR headset and controllers to reliably assess balance stability? Diagnostics (Basel). (2022); 12(6): 1409.

Jost TA, Nelson B, Rylander J. Quantitative analysis of the Oculus Rift S in controlled movement. Disabil Rehabil Assist Technol. (2021); 16(6): 632-636.

Badnjević A, Gurbeta L, Bošković D, Džemić Z. Measurement in medicine – Past, present, future. Folia Med Fac Med Univ Saraeviensis. (2015); 50(1): 43-46.

Brucker-Kley E, Kleinberger U, Keller T, Christen J, Keller-Senn A, Koppitz A. Identifying research gaps: A review of virtual patient education and self-management. Technol Health Care. (2021); 29(6): 1057-1069.

Badnjević A, Avdihodžić H, Gurbeta Pokvić L. Artificial intelligence in medical devices: Past, present and future. Psychiatr Danub. (2021); 33(Suppl 3): S336-S341.

Badnjević A, Pokvić LG, Deumić A, Bećirović LS. Post-market surveillance of medical devices: A review. Technol Health Care. (2022); 30(6): 1315-1329. doi: 10.3233/THC-220284.