Myo input utility and example


This topic contains 5 replies, has 3 voices, and was last updated by Rebecca 1 year, 4 months ago.

  • #326

    Jonathan
    Participant

    I created a little CLI utility to pipe data from the Myo's sensors to OSC for input to Wekinator. Instructions are in the README, and there's a brief tutorial.

    I’ve been a bit disappointed with the Myo — the default gestures are limiting and don’t work as reliably as I’d expected. I’m hoping that with a bit of machine learning I can create some interfaces for it that feel more natural… would be interested in others’ experiences, especially using the EMG data, since that’s what makes the Myo unique…
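
    The core of the utility is really just a send loop over OSC. Here's a minimal Python sketch of the idea, assuming the python-osc package and a hypothetical read_emg() standing in for the actual Myo SDK call (Wekinator listens on port 6448 and expects /wek/inputs by default):

        from pythonosc.udp_client import SimpleUDPClient
        import time

        client = SimpleUDPClient("127.0.0.1", 6448)  # Wekinator's default input port

        def read_emg():
            # Hypothetical stand-in for the Myo SDK call; returns the 8 EMG channels.
            return [0.0] * 8

        while True:
            # Wekinator expects a flat list of floats on /wek/inputs.
            client.send_message("/wek/inputs", [float(v) for v in read_emg()])
            time.sleep(1 / 200)  # the Myo streams EMG at 200 Hz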

  • #366

    Balandino
    Participant

    Hi Jonathan,

    I did some work using ml.lib; here's a quick video. I'm currently working on gesture recognition using the Myo, but I'm having some trouble. What data and features do you train the model with?

    Cheers,
    Balandino.

  • #372

    Balandino
    Participant

    Hi Jonathan,

    I just hacked Myo Mapper to hook it up with Wekinator, and gesture recognition works amazingly well with the IMU data, somewhat less so with the EMG data. I combined different machine learning techniques, and I suspect Thalmic also combines techniques. My intuition is that they use SVM and DTW, but I'm not an ML expert at all, so don't trust my judgement.

    Apart from the video posted above, I'm working on gesture recognition using EMG. I'll be publishing something about it soon; meanwhile, have a look at this paper, where I report some techniques for recognising throwing gestures by direct mapping. This other paper is also great: it compares EMG features. Of course, those aren't the only features that can be extracted from the Myo.
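
    For reference, the kind of time-domain features I mean are the standard ones from the EMG literature (mean absolute value, RMS, waveform length, zero crossings). A rough NumPy sketch, not necessarily the exact feature set compared in the paper or what Thalmic uses:

        import numpy as np

        def emg_features(window):
            # Standard time-domain EMG features over one window of one channel.
            x = np.asarray(window, dtype=float)
            mav = np.mean(np.abs(x))                        # mean absolute value
            rms = np.sqrt(np.mean(x ** 2))                  # root mean square
            wl = np.sum(np.abs(np.diff(x)))                 # waveform length
            zc = np.count_nonzero(np.diff(np.signbit(x)))   # zero crossings
            return [mav, rms, wl, zc]

        # e.g. one feature vector per window: 8 channels x 4 features = 32 inputs
        # features = [f for ch in emg_window for f in emg_features(ch)]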

    Cheers,
    Balandino.

    • #373

      Jonathan
      Participant

      Hi Balandino,

      Very interesting, thanks for the info and the links! Looking forward to reading through it. I haven’t worked on this in a while, but am still interested in learning and improving on what I was doing.

      Like you, I was able to get quite good results using the IMU data with Wekinator, less so with the EMG data. With the EMG data I was basically able to recreate the recognized gestures that the Myo ships with, but not much more. I'm also not an ML expert, so I was mostly just experimenting to see what worked best.

      Ultimately, I think I just had expectations that were too high for the EMG capabilities of the Myo. I was really hoping to be able to train on much smaller hand and finger movements, but I'm not sure that's possible with the available data. Maybe with the right filtering and training it could work… I'm not a DSP expert either. But I haven't given up yet, and I'm encouraged to find others using ML with the Myo as well!
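
      By "the right filtering" I mean something like rectifying the raw EMG and taking a low-pass envelope before training. A rough sketch, assuming SciPy and the Myo's 200 Hz EMG rate (the cutoff here is a guess, not something I've tuned):

          import numpy as np
          from scipy.signal import butter, filtfilt

          FS = 200.0  # the Myo streams EMG at 200 Hz

          def emg_envelope(raw, cutoff_hz=5.0):
              # Full-wave rectify, then low-pass filter for a smooth envelope.
              rectified = np.abs(np.asarray(raw, dtype=float))
              b, a = butter(2, cutoff_hz / (FS / 2), btype="low")
              return filtfilt(b, a, rectified)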

      Best,
      Jonathan

  • #375

    Balandino
    Participant

    Hi Jonathan,

    I think it's hard to recognise micro-gestures with the Myo armband, and surely not using just the raw data. It's probably better to rescale the data so it's more meaningful for training a model. If you don't have issues with cables and you can perform gestures in a fixed space without moving your body a lot, I would go with the Leap Motion or something similar; it's way more precise for those things.
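
    By rescaling I mean something like per-channel normalisation before training, so channels with very different baseline activity contribute comparably to the model. One rough way to do it, just a sketch:

        import numpy as np

        def rescale(window):
            # Per-channel z-score; just one possible rescaling strategy.
            x = np.asarray(window, dtype=float)
            mean = x.mean(axis=-1, keepdims=True)
            std = x.std(axis=-1, keepdims=True)
            return (x - mean) / np.maximum(std, 1e-8)  # avoid divide-by-zero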

    Let me know if you get any better results 😉

    Cheers,
    Balandino.

  • #469

    Rebecca
    Keymaster

    Thanks both! I’ve linked to both your projects on the Wekinator examples page.
