Voluntary control of our body is necessary for interacting with other elements of our environment.
However, various circumstances such as diseases or injuries can limit normal mobility, either partially or totally. In these cases, robotic devices could be used to improve the quality of life of the affected person.
Traditionally, human-machine interaction (HMI) requires the manipulation of a physical element (touching a screen, pressing a button, etc.), and disabled people may have difficulty
learning to use such devices and technologies. For this reason, researchers in HMI
have developed user interfaces that rely on biomedical signals to
determine the intention or action that a user wants to perform.
In particular, brain-machine interfaces use readings of brain activity in
specific areas to determine the command to execute. These advances have been
made possible by our knowledge of brain regions and
the functions associated with their activation.