Contact

Office: A440
Address: Bd de Pérolles 90, 1700 Fribourg, Switzerland
Email: yongjoon.thoo(at)unifr.ch
ORCID
GitHub
Google Scholar
LinkedIn

Background

  • MSc in Robotics, École Polytechnique Fédérale de Lausanne (EPFL)
  • BSc in Microengineering, École Polytechnique Fédérale de Lausanne (EPFL)

Yong-Joon Thoo

PhD Candidate in Computer Science, University of Fribourg

MSc in Robotics and BSc in Microengineering, École Polytechnique Fédérale de Lausanne (EPFL)

Research interests

My research focuses on advancing vision rehabilitation for blind and low-vision (BLV) individuals through co-located shared augmented reality (AR) systems. I design interactive training tasks that mirror real-world activities of daily living (ADLs), enabling clients to practice adaptive strategies in familiar environments. In this approach, both clients and low-vision therapists wear AR headsets, creating a shared environment where therapists can directly observe clients’ visual behaviors (e.g., gaze patterns, scanning strategies), provide tailored feedback in real time, and adapt training parameters to better support individual needs. Conducted in collaboration with vision rehabilitation centers, this work aims to foster engagement, improve accessibility, and strengthen the effectiveness of therapist-guided training.

Figure: real-world view of the co-located shared-AR task between a low-vision (LV) participant and an LV therapist, both seated at a table wearing AR headsets, with virtual striped mugs overlaid on the table in a 5×5 grid and a selection box marking the participant’s intended target. Seen from a therapist- or observer-role headset, the red and green lines show the participant’s projected eye gaze and head gaze, respectively; the semi-transparent cone on the table indicates the headset’s field of view, and the red trace shows the participant’s eye-gaze path over the past 0.3 seconds.
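
To make the gaze-trace cue concrete, here is a minimal, purely illustrative Python sketch of how a rolling 0.3-second buffer of projected gaze points might be maintained; the GazeSample and GazeTrace names are hypothetical and are not taken from the system described above.

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        timestamp: float  # seconds, monotonic clock
        x: float          # projected gaze point on the table surface
        y: float

    class GazeTrace:
        """Rolling buffer of projected gaze points for a short on-surface trace."""

        def __init__(self, window_s: float = 0.3):
            self.window_s = window_s
            self.samples = deque()

        def add(self, sample: GazeSample) -> None:
            # Append the newest sample and drop anything older than the window.
            self.samples.append(sample)
            while self.samples and sample.timestamp - self.samples[0].timestamp > self.window_s:
                self.samples.popleft()

        def polyline(self):
            # Points to render as the trace on the table surface.
            return [(s.x, s.y) for s in self.samples]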

More broadly, I am interested in assistive and rehabilitative systems and have a background in robotics, where I have worked on both mobility support systems (e.g., robotic wheelchairs) and AR/VR-based functional assessment and training tools.

My research interests include:

  • Mixed Reality (MR)
  • Human-Computer Interaction
  • Human-Robot Interaction
  • Multimodal Interaction

Publications (* denotes equal contribution)

  • (Conditionally Accepted for SUI '25) Yong-Joon Thoo, Karim Aebischer, Nicolas Ruffieux, and Denis Lalanne. 2025. Enhancing Therapist-Guided Low-Vision Training with Projected Gaze Behaviors in Co-Located Shared AR.
  • (To appear) Yong-Joon Thoo, Karim Aebischer, Nicolas Ruffieux, and Denis Lalanne. 2025. Exploring Shared Augmented Reality for Low-Vision Training of Activities of Daily Living. In The 27th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’25), October 26–29, 2025, Denver, CO, USA. ACM, New York, NY, USA, 19 pages. https://doi.org/10.1145/3663547.3746371
  • Maximiliano Jeanneret Medina*, Yong-Joon Thoo*, Cédric Baudet, Jon E. Froehlich, Nicolas Ruffieux, and Denis Lalanne. 2025. Understanding Research Themes and Interactions at Scale within Blind and Low-vision Research in ACM and IEEE. ACM Trans. Access. Comput. 18, 2, Article 7 (June 2025), 46 pages. https://doi.org/10.1145/3726531
  • Maximiliano Jeanneret Medina, Yong-Joon Thoo, and Cédric Baudet. 2024. "3.1 Accessibilité numérique: évolution et état actuel" [Digital accessibility: evolution and current state]. In Innovation Booster Technologie et Handicap : Un dispositif, des personnes engagées et des projets pour une innovation sociale par la science (pp. 157–169). Éditions Sociographe. https://doi.org/10.3917/agraph.nenc.2024.01.0157
  • Yong-Joon Thoo*, Maximiliano Jeanneret Medina*, Jon E. Froehlich, Nicolas Ruffieux, and Denis Lalanne. 2023. A Large-Scale Mixed-Methods Analysis of Blind and Low-vision Research in ACM and IEEE. In The 25th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '23), October 22–25, 2023, New York, NY, USA. ACM, New York, NY, USA, 20 pages. https://doi.org/10.1145/3597638.3608412
  • Yong-Joon Thoo, Jérémy Maceiras, Philip Abbet, Mattia Racca, Hakan Girgin, and Sylvain Calinon. 2021. Online and Offline Robot Programming via Augmented Reality Workspaces. https://doi.org/10.48550/arXiv.2107.01884

Teaching

Current courses

Past courses

MSc / BSc Theses Supervision

Ongoing

Completed

Events

Conferences

Workshops