Enhancing touch sensibility by sensory retraining in a sensory discrimination task via haptic rendering

Villar Ortega, Eduardo; Aksöz, Efe Anil; Buetler, Karin A; Marchal-Crespo, Laura (2022). Enhancing touch sensibility by sensory retraining in a sensory discrimination task via haptic rendering. Frontiers in Rehabilitation Sciences, 3. Frontiers Media. doi:10.3389/fresc.2022.929431

MakingSense-post-print-accepted.pdf - Accepted Version
Available under License Creative Commons: Attribution (CC-BY).

fresc-03-929431.pdf - Published Version
Available under License Creative Commons: Attribution (CC-BY).

Data_Sheet_1_Enhancing_20touch_20sensibility_20by_20sensory_20retraining_20in_20a_20sensory_20discrimination_20task_20via_20haptic_20rendering.PDF - Supplemental Material
Available under License Creative Commons: Attribution (CC-BY).


Stroke survivors are commonly affected by somatosensory impairment, hampering their ability to interpret somatosensory information. Somatosensory information has been shown to critically support movement execution in healthy individuals and stroke survivors. Despite the detrimental effect of somatosensory impairments on performing activities of daily living, somatosensory training—in stark contrast to motor training—does not represent standard care in neurorehabilitation. Reasons for this neglect include the lack of high-quality research demonstrating the benefits of somatosensory interventions for stroke recovery, the unavailability of reliable quantitative assessments of sensorimotor deficits, and the labor-intensive nature of somatosensory training, which relies on therapists guiding the hands of patients with motor impairments. To address this clinical need, we developed a virtual reality-based robotic texture discrimination task to assess and train touch sensibility. Our system can robotically guide the participants' hands during texture exploration (i.e., passive touch) or allow non-guided free texture exploration (i.e., active touch). We ran a three-day experiment in which thirty-six healthy participants were asked to discriminate the odd texture among three visually identical textures, haptically rendered with the robotic device, following the method of constant stimuli. All participants trained under both the passive and active conditions, in randomized order on different days. We investigated the reliability of our system using the intraclass correlation coefficient (ICC). We also evaluated whether participants' touch sensibility improved through somatosensory retraining, and whether this improvement differed between the active and passive conditions. Our results showed that participants significantly improved their task performance after training.
Moreover, training effects did not differ significantly between the active and passive conditions, yet passive exploration seemed to increase participants' perceived competence. The reliability of our system ranged from poor (active condition) to moderate and good (passive condition), probably because the ICC depends on between-subject variability, which is usually small in a healthy population. Together, our virtual reality-based robotic haptic system may be a key asset for evaluating and retraining sensory loss with minimal supervision, especially for brain-injured patients who require guidance to move their hands.
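The reliability analysis in the abstract rests on the ICC. As a minimal illustrative sketch (the record does not state which ICC form the authors used), the following computes ICC(2,1) — two-way random effects, absolute agreement, single measurement, a common choice for test–retest reliability — from a hypothetical subjects-by-sessions score matrix:

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.

    scores: (n_subjects, k_sessions) array of task scores,
    one row per participant, one column per test session.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-session means

    # Two-way ANOVA sum-of-squares decomposition
    ss_rows = k * np.sum((row_means - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between sessions
    ss_total = np.sum((scores - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols            # residual

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

By the conventional thresholds the abstract's labels suggest, values below 0.5 are read as poor reliability, 0.5–0.75 as moderate, and 0.75–0.9 as good. Because the denominator contains the between-subject mean square, low between-subject variability — as expected in a healthy sample — drives the ICC down even when measurements agree well, consistent with the interpretation given above.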

Item Type:

Journal Article (Original Article)


10 Strategic Research Centers > ARTORG Center for Biomedical Engineering Research > ARTORG Center - Motor Learning and Neurorehabilitation

Graduate School:

Graduate School for Cellular and Biomedical Sciences (GCB)

UniBE Contributor:

Villar Ortega, Eduardo Enrique, Aksöz, Efe Anil, Bütler, Karin, Marchal Crespo, Laura


Subjects:

600 Technology > 610 Medicine & health




Publisher:

Frontiers Media


Funders:

Swiss National Science Foundation; SENACYT and IFARHU




Depositing User:

Eduardo Enrique Villar Ortega

Date Deposited:

03 Aug 2022 08:13

Last Modified:

05 Dec 2022 16:22

Publisher DOI:

10.3389/fresc.2022.929431
