Balandino

Forum Replies Created

  • in reply to: Myo input utility and example #375

    Balandino
    Participant

    Hi Jonathan,

    I suspect it’s hard to recognise micro-gestures using the Myo armband, and certainly not with just the raw data. It’s better to rescale the data so that it becomes more meaningful for training a model. If you don’t have issues with cables, or if you have to perform gestures in a specific space without moving your body much, I would go with the Leap Motion or something similar. It is far more precise for that kind of thing.
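
    To give a concrete idea of the rescaling, here is a rough Python sketch of one option (not Myo Mapper’s actual code; the Myo streams 8 EMG channels as signed 8-bit values at 200 Hz):

    import numpy as np

    def emg_envelope(raw_window):
        """Rectify and rescale a window of raw Myo EMG samples.

        raw_window: shape (n_samples, 8), int8 values in -128..127.
        Returns one 0..1 envelope value per channel, which is far more
        meaningful to a model than the raw oscillating signal.
        """
        rectified = np.abs(raw_window.astype(float)) / 128.0  # full-wave rectify, scale to 0..1
        return rectified.mean(axis=0)                         # smoothed envelope per channel

    # e.g. a 40-sample window is 200 ms at the Myo's 200 Hz EMG rate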

    Let me know if you get any better results 😉

    Cheers,
    Balandino.

  • in reply to: Myo input utility and example #372

    Balandino
    Participant

    Hi Jonathan,

    I just hacked Myo Mapper to hook it up with Wekinator, and gesture recognition works amazingly well when combining IMU data; with EMG data, a bit less so. I combined different machine learning techniques, and I guess Thalmic combines techniques as well. My intuition is that they use SVM and DTW, but I’m not an ML expert at all, so don’t trust my judgement.
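
    In case it’s useful, the Wekinator hookup is basically just OSC. Here is a minimal Python sketch of the idea, using python-osc (the yaw/pitch/roll feature frame is only an illustration, not what Myo Mapper actually sends):

    from pythonosc.udp_client import SimpleUDPClient

    # Wekinator listens for inputs on port 6448 by default and expects
    # a flat list of floats on the /wek/inputs address.
    client = SimpleUDPClient("127.0.0.1", 6448)

    def send_imu_frame(yaw, pitch, roll, acc_x, acc_y, acc_z):
        # Hypothetical feature frame: orientation plus accelerometer.
        inputs = [yaw, pitch, roll, acc_x, acc_y, acc_z]
        client.send_message("/wek/inputs", [float(v) for v in inputs])

    # e.g. send_imu_frame(0.12, -0.30, 0.05, 0.0, 0.98, 0.10)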

    Apart from the video posted above, I’m working on gesture recognition using EMG. I’ll be publishing something about it soon; meanwhile, if you want, have a look at this paper, where I report some techniques for recognising throwing gestures by direct mapping. This is also a great paper, which shows comparisons of EMG features. Of course, those are not the only features that can be extracted using the Myo.
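
    For reference, the classic time-domain features that those comparison papers use are simple to compute; a quick illustrative sketch for one channel window:

    import numpy as np

    def emg_features(x):
        """Classic Hudgins-style time-domain features for one EMG window."""
        x = np.asarray(x, dtype=float)
        mav = np.mean(np.abs(x))                              # mean absolute value
        rms = np.sqrt(np.mean(x ** 2))                        # root mean square
        wl = np.sum(np.abs(np.diff(x)))                       # waveform length
        zc = np.sum(np.signbit(x[1:]) != np.signbit(x[:-1]))  # zero crossings
        return mav, rms, wl, zc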

    Cheers,
    Balandino.

  • in reply to: Myo input utility and example #366

    Balandino
    Participant

    Hi Jonathan,

    I did some work using ml.lib; here is a quick video. I’m currently working on gesture recognition using the Myo, but I’m having some trouble. What data and features do you train the model with?

    Cheers,
    Balandino.
