Archived

This content is available here for research, reference, and/or recordkeeping.

Author ORCID Identifier

https://orcid.org/0009-0000-6102-117X

Date Available

5-2-2026

Year of Publication

2026

Document Type

Doctoral Dissertation

Degree Name

Doctor of Philosophy (PhD)

College

Engineering

Department/School/Program

Electrical and Computer Engineering

Faculty

Jihye Bae

Faculty

Daniel Lau

Abstract

Upper limb motor impairments can severely limit a person’s ability to perform everyday reaching and grasping tasks. Electroencephalogram (EEG)-based brain-machine interfaces (BMIs) offer a non-invasive approach for translating neural activity into control signals for assistive devices such as robotic arms. However, traditional EEG-based BMI studies have generally focused on externally cued paradigms, in which both movement timing and target selection are specified by the experimenter rather than freely chosen by the user. Shared control offers a practical framework for assistive BMI operation by dividing responsibility between the user and an intelligent robotic system; yet in many EEG-based shared control systems, the BMI merely guides the robot incrementally toward the goal or relies on discrete directional commands.

This study examines the feasibility of goal-driven shared control for reaching and grasping using freewill EEG, in which the BMI provides an estimate of the intended reaching goal while the robotic system manages downstream motion planning and execution. Using a freewill EEG dataset in which the user freely determines both the initiation of movement and the selection of the reaching target, the study investigates two neural functions required for this framework: movement intention detection and decoding of the intended 3D goal position.

Given the scarcity of publicly available freewill EEG datasets, this work contributes a large, continuous, multi-session EEG resource for reaching and grasping. The dataset includes recordings from 23 healthy participants across 49 recording sessions, supporting subsequent intention detection, goal position decoding, and shared control evaluation.

Reliable intention detection for goal-driven shared control should move beyond prior studies that mainly compare movement intention against selected resting intervals. This study evaluates movement intention detection under continuous monitoring, with non-intention defined from continuous EEG segments outside the time periods labeled as movement intention rather than from resting segments alone. The results demonstrate that freewill EEG can support reliable intention detection under this more realistic continuous monitoring setting.

For goal-driven shared control, the ability to directly estimate the intended reaching goal position is an essential component. The results show that the intended reaching goal position can be estimated directly from freewill EEG, with an EEGNet-based model for 3D goal position regression providing the strongest overall decoding performance. This goal representation was then integrated with movement intention detection in a proof-of-concept goal-driven shared control BMI framework with a simulated robotic arm. The simulation results showed that decoded freewill EEG outputs could be translated into meaningful robotic task behavior under a shared control strategy.
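To illustrate the decoding approach described above, the sketch below shows an EEGNet-style convolutional network whose classification head is replaced by a 3-unit linear layer for 3D goal position regression. This is a minimal PyTorch sketch of the general EEGNet architecture; the channel count, window length, and filter sizes here are illustrative assumptions, not the configuration used in the dissertation.

```python
import torch
import torch.nn as nn

class EEGNetRegressor(nn.Module):
    """EEGNet-style CNN with a 3-unit linear head for 3D goal regression.
    Hyperparameters (n_channels, n_samples, f1, d, f2) are illustrative."""
    def __init__(self, n_channels=64, n_samples=256, f1=8, d=2, f2=16):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learns frequency-selective filters.
            nn.Conv2d(1, f1, (1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(f1),
            # Depthwise spatial convolution across all EEG channels.
            nn.Conv2d(f1, f1 * d, (n_channels, 1), groups=f1, bias=False),
            nn.BatchNorm2d(f1 * d),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
            nn.Dropout(0.25),
            # Separable convolution: depthwise temporal + pointwise mixing.
            nn.Conv2d(f1 * d, f1 * d, (1, 16), padding=(0, 8),
                      groups=f1 * d, bias=False),
            nn.Conv2d(f1 * d, f2, 1, bias=False),
            nn.BatchNorm2d(f2),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(0.25),
            nn.Flatten(),
        )
        # Infer the flattened feature size from a dummy forward pass.
        with torch.no_grad():
            n_feat = self.features(
                torch.zeros(1, 1, n_channels, n_samples)).shape[1]
        # Regression head: (x, y, z) goal coordinates instead of class logits.
        self.head = nn.Linear(n_feat, 3)

    def forward(self, x):  # x: (batch, 1, channels, samples)
        return self.head(self.features(x))

model = EEGNetRegressor()
xyz = model(torch.randn(4, 1, 64, 256))  # four EEG windows in, four (x, y, z) estimates out
```

Swapping the usual softmax classifier for a linear head trained with a mean-squared-error loss is the standard way to repurpose EEGNet for continuous-valued targets such as goal coordinates.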

Overall, this work advances EEG-based BMIs toward more natural assistive control by demonstrating the feasibility of goal-driven shared control for reaching and grasping, supported by reliable movement intention detection and direct decoding of the intended 3D reaching goal position from freewill EEG.

Digital Object Identifier (DOI)

https://doi.org/10.13023/etd.2026.250

Archival?

Archival
