The paper was still under review when I read it. It can be found here.

Why this paper?
This is one of the most recent papers on transformers and sEMG finger control. It also adds spiking neurons to the transformer architecture to introduce sparsity, and the reported results are the highest for NinaPro DB8.

Main Findings
A minimum prediction latency of 3.5 ms: the lowest reported on this dataset (tokens are processed one at a time; see below for more information). This is achieved by exploiting...
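To make the spiking-for-sparsity idea above concrete, here is a minimal sketch (not the paper's code) of a transformer-style feed-forward block whose activation is replaced by a thresholded, leaky-integrate-and-fire-like step, so most activations are exactly zero. The class name `SpikingFFN`, the threshold value, and the layer sizes are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: a feed-forward block with a spiking-style (thresholded) activation.
# Assumed names and values for illustration only; training a hard threshold would
# additionally require a surrogate gradient, which is omitted here.
import torch
import torch.nn as nn


class SpikingFFN(nn.Module):
    """Feed-forward block whose hidden activation is a binary spike pattern."""

    def __init__(self, d_model: int = 64, d_ff: int = 256, threshold: float = 1.0):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_ff)
        self.fc2 = nn.Linear(d_ff, d_model)
        self.threshold = threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        u = self.fc1(x)                          # membrane-potential-like pre-activation
        spikes = (u >= self.threshold).float()   # 1 where the potential crosses the threshold, else 0
        return self.fc2(spikes)                  # most inputs to fc2 are zero -> sparse compute


if __name__ == "__main__":
    x = torch.randn(1, 10, 64)   # (batch, tokens, features)
    out = SpikingFFN()(x)
    print(out.shape)             # torch.Size([1, 10, 64])
```

The point of the sketch is only that binarising the hidden activations makes the second projection operate on a sparse input, which is the kind of sparsity a spiking transformer is after; the actual architecture in the paper differs in its details.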