A framework to provide robots with an inner speech skill. This is a pioneering approach for evaluating how inner speech could influence human-robot interaction in a cooperative scenario
Updated Jan 18, 2021 · Python
This repository contains all the code needed to work with the ArEEG dataset
EEG-to-Text Model [BETA]
A consumer brain-computer interface for inner speech decoding. An 8-channel EEG headband ($800) outperforms 128-channel clinical systems ($50K): EEGNet reaches 35.5% accuracy (p = 0.0006) with cross-subject generalization (p = 0.003). A real-time demo is included.
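The p-values quoted above are typically obtained by comparing decoding accuracy against a chance-level null distribution. A minimal sketch of such a label-permutation test, using only NumPy (the function name and interface are hypothetical, not taken from the repository):

```python
import numpy as np

def permutation_test_accuracy(y_true, y_pred, n_permutations=1000, seed=0):
    """Estimate how likely the observed decoding accuracy is under chance.

    Shuffles the true labels to build a null distribution of accuracies,
    then returns the observed accuracy and a one-sided p-value.
    """
    rng = np.random.default_rng(seed)
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    observed = float(np.mean(y_true == y_pred))
    null = np.empty(n_permutations)
    for i in range(n_permutations):
        # Shuffling labels breaks any real label-prediction association.
        shuffled = rng.permutation(y_true)
        null[i] = np.mean(shuffled == y_pred)
    # +1 in numerator and denominator avoids reporting p = 0 exactly.
    p_value = (np.sum(null >= observed) + 1) / (n_permutations + 1)
    return observed, float(p_value)

# Toy example: a binary decoder that gets 80 of 100 trials right.
y_true = np.array([0] * 50 + [1] * 50)
y_pred = y_true.copy()
y_pred[:20] = 1 - y_pred[:20]  # introduce 20 errors
acc, p = permutation_test_accuracy(y_true, y_pred)
```

Permutation tests are standard for EEG decoding because chance level can deviate from 1/n_classes when classes are imbalanced or trial counts are small.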
A model of the tight link between a robot's inner speech and its emotions
Public repository for my undergrad cognitive psychology thesis on the "inner voice"