
MotioSuit is an open-source, active motion capture suit. It reads the angles at which each of the user's limbs is oriented and sends them over Bluetooth to the computer, where a model in Blender is updated to follow the movement.

The original intent of this project was to develop a full-body game controller, but it can also be used to animate 3D models with natural movements in a fraction of the time, or even to control a humanoid robot!

The microcontroller reads every sensor's orientation as quaternions and sends them bundled in a string over Bluetooth to the computer. There, the Python script running in Blender receives the data, breaks it into individual sensor orientations and passes them to each bone of the armature, which is updated in the 3D view.
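As an illustration, the receiving side can be sketched in Python; the comma-separated, newline-terminated wire format below is an assumption for the example, not the exact MotioSuit protocol:

```python
def parse_packet(line, num_sensors=9):
    """Split one serial line into per-sensor [w, x, y, z] quaternions.

    Assumes a hypothetical format: the quaternion components of all
    sensors sent as comma-separated floats on a single line.
    """
    values = [float(v) for v in line.strip().split(",")]
    if len(values) != num_sensors * 4:
        raise ValueError("unexpected packet length")
    # One four-element list per sensor, in transmission order
    return [values[i * 4:(i + 1) * 4] for i in range(num_sensors)]
```

Splitting into a list of four-element lists mirrors the angles[sensor][component] indexing used by the Blender script.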

The IMUs used here are Bosch's BNO055, which in addition to an accelerometer, gyroscope and magnetometer includes a 32-bit Cortex-M0+ microcontroller running a sensor fusion algorithm able to output orientation as both Euler angles and quaternions.

These sensors work over I2C but have only two selectable addresses. To overcome this, an I2C multiplexer board based on the PCA9548 provides separate I2C buses with one or two sensors each. This expands the maximum number of sensors from 2 (on the microcontroller's bus) to 18 (eight new buses plus the microcontroller's own). Even more sensors can be added simply by adding another multiplexer; up to eight boards can be connected to the same bus (if you need more... multiplex the multiplexers! 🙂).
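For illustration, the sensor-to-bus bookkeeping for the multiplexed sensors (not the two on the microcontroller's own bus) can be sketched in Python; the 0x70 multiplexer address and the two-sensors-per-channel layout are assumptions about a typical PCA9548 setup:

```python
MUX_ADDR = 0x70           # PCA9548 default address with A0..A2 tied low (assumed)
BNO_ADDRS = (0x28, 0x29)  # the two selectable BNO055 I2C addresses

def sensor_bus(index):
    """Map a multiplexed sensor index to (channel bitmask, BNO055 address).

    The PCA9548 selects downstream buses via a control byte with one bit
    per channel; two sensors fit on each of its eight channels.
    """
    channel, slot = divmod(index, 2)
    if channel > 7:
        raise ValueError("one PCA9548 handles at most 16 sensors")
    return 1 << channel, BNO_ADDRS[slot]
```

Writing the returned bitmask to MUX_ADDR before each read would switch the active bus; a second multiplexer at another address extends the scheme further.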

To make the suit plug-and-play, a Bluetooth-to-USB bridge module was added. The suit's Bluetooth module is set as master and instructed to connect to the bridge's MAC address, while the computer searches for any device of type '/dev/ttyACMx'.
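That port search can be sketched with Python's glob module (the exact pattern and selection logic in the actual script may differ):

```python
import glob

def find_bridge_ports(pattern="/dev/ttyACM*"):
    """Return candidate serial devices for the Bluetooth-USB bridge,
    sorted so the first match is deterministic (e.g. /dev/ttyACM0)."""
    return sorted(glob.glob(pattern))
```

The first entry of the returned list could then be handed to pySerial's serial.Serial() to open the connection.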

On the computer side, two Blender files are available on GitHub. 'MotioSuit.blend' is ready to be used as in the video, whereas 'Armature.blend' contains the script, logic and armature needed, but no model. Use this file to control your own 3D model with the suit by simply parenting the model to the bones.

To start the program, press “Start Game Engine”. Optionally, press “Record Animation” to record the session.

Both Python 3 and pySerial need to be installed for the communication to work.

The benefit of a suit based on IMUs over optical systems such as visual markers or Kinect cameras is the lower cost, no stage setup, and the ability to be used outdoors.

36 thoughts on “MotioSuit”

  1. Andy Hawkins

    Heya,

    This is great. I’ve been looking for a low-cost mocap solution for our college. I think this would make an interesting build project for the game developers I teach as well. I’m going to try to build this.

  2. Alvaro (Post author)

    Thanks!
    Please do, it’d be awesome if you shared pics/video of your build 🙂

  3. Andy Hawkins

    Yep, will definitely make a blog and share it. I need to get my head around the tech first, having only built RPi arcade machines and done a few Arduino projects.

  4. Scott

    This is awesome. I’m thinking of having some of my high school students build a couple of these. What did the parts cost you?

    • Alvaro (Post author)

      Hi, great idea!
      I don’t recall exactly what the total price was, but the main cost is in the sensors. While I designed and hand-soldered the board to keep the price at around 10€/sensor, assembled modules have recently started appearing online for that price, so that would be the way to go 🙂
      Apart from that, you’d need a Bluetooth-enabled controller, a battery and the I2C multiplexer.

  5. Raul Lapeira

    Hi Alvaro,

    I’m building one for the robot league. Javi Isabel told me about you. Please give me a call, it’s important.

  6. boredj

    Hi Alvaro, when I tried importing the .blend into Unity, the inspector stated that there were not enough bones. Perhaps this needs to be updated.

    • Alvaro (Post author)

      Hi, that’s a Blender file, try opening it there. I’ve never used Unity so I’m afraid I can’t help you with that.

  7. Xavier

    Hi Alvaro,

    Congratulations, excellent project! I would like your help: I am trying to do something similar, but instead of sensors I need to use a webcam. First I am trying to move the bones in BGE mode using a Python script, but I have not managed to find a solution to the problem.

    • Alvaro (Post author)

      Thanks!
      Take a look at the Python file MotioSuit.py; the Blender part should be taken care of just by changing the serial communication to whatever method you choose to send the angles’ data.
      You would only need to write the angles to a 2D matrix, with one line per angle and one column per quaternion; the actual movement of the bones in BGE is from line 72 to the end.
      Hope this helps!

  8. Xavier

    Thanks Alvaro,
    I am trying to modify the MotioSuit.py file. For now, to test, I need to move a bone inside a mesh in BGE mode, so I have manually assigned the angles to the variable angles, but when pressing P, no movement is generated. I have sent a message to your mail with the test Blender file; I hope to receive your help and acceptance.

  9. OAXP

    Hey, please, I want to make a great mocap with MPU6050 sensors, an Arduino Uno and Bluetooth. I don’t understand it well; can you please explain it to me by e-mail? I found a similar project: http://herrzig.ch/work/bewegungsfelder/ .
    I don’t know if I can combine them or not. Please help me.

    • Alvaro (Post author)

      Basically, I read the absolute orientation of the sensors in quaternions to avoid gimbal lock and sent the matrix of values over Bluetooth to the computer running Blender, where a Python script updated the model. Since the sensor’s address selection was a single bit, I used an I2C multiplexer to be able to read all the sensors one bus at a time.
      I don’t know if the MPU6050 calculates the orientation or if you’d have to take care of the sensor fusion algorithm as well.
      As for the other project, it looks cool but I’m not sure I can help you with it since I don’t know how it works.

  10. Fareeha

    Your project is wonderful and your work is remarkably helpful. Kindly let me know the CPU specifications you used to run this animation in Blender.

    • Alvaro (Post author)

      Thanks! Unfortunately, I can’t tell. This was done on a company laptop almost two years ago, I really don’t remember.
      Sorry!

  11. Ahmet NARMAN

    Hello,

    We have been working on a project similar to yours using Blender, and your work really helped us understand what we are supposed to do. Thank you for that. We had a problem with the serial connection though. We could establish a serial connection using a Python IDE (PyCharm), but while working in the Blender game engine, our access to the serial port was denied. We have been looking for solutions but could not find a conclusive one. Did you face such a problem? Can you recommend any way to solve it?

    Thank you.

    • Alvaro (Post author)

      Hi! I’m glad you’ve found the project helpful!

      Actually yeah, I had the same problem at the beginning. I think you can solve it by either copying pyserial into Blender’s internal Python folder or by adding it to the system path with sys.path.append("/usr/lib/python3/dist-packages") in the script.

      Good luck!

  12. AMC

    Hi Alvaro,

    I have the same problem as Xavier: when I try to move a mesh, I can move it manually in the viewport, but when I press P nothing moves. It would be great if you could give me a lead on this.

    Thanks!

    • Alvaro (Post author)

      Hi!
      In your mail you mention that you’re unable to move the bones programmatically, but I understand that you can move them by hand and the mesh follows along, right?
      If that’s the case, could it be that the bones’ names don’t correspond to those in the script?

  13. AMC

    Hi Alvaro,

    From the script I’m calling the bones by their right names; otherwise it would throw an error. Also, I’m printing out the rotation values I’m changing, and they change, but visually the mesh is not deformed. It seems like it’s not rendering properly…

    • Alvaro (Post author)

      I’d have to take a look to see if I can find the problem. Can you send it to me by mail or upload it somewhere?

  14. Dale

    Hi Alvaro,

    This is an amazing project, well done.

    I saw on Hackaday back in 2016 that you were thinking of going wireless; can I ask how that is going and what components you are using to achieve it?

    • Alvaro (Post author)

      Thanks!

      Well, I’ve only recently started working on the new version, so I’m still at the board development phase, but here’s what I’ve got at the moment: https://hackaday.io/project/88133-motio

      I’m working with the same BNO055 IMU, but this time the processor is the nRF51822 (Cortex-M0) with integrated BLE, and each module has its own battery with an onboard charger.

  15. andre

    Great-looking project. I am building a headset-glove combo and was wondering if the sensors work together, feeding data corresponding to each other, or if they each give separate motion data, or if I could create the movement from a single IMU. I have the glove rotating at the moment, but am trying to figure out how to make it move proportionally around in 3D space, not just in orientation. Any tips would be most appreciated.

    • Alvaro (Post author)

      Thanks! Each sensor just outputs its own orientation data; what I did was create an armature in Blender that could move in a restricted way (links relative to each other) and then plug the sensor data in.

      You could do this yourself by knowing your model data (lengths of limbs, positions relative to each other…) and applying coordinate-system transformations to map the local data (sensor) into the global system (body).
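As a sketch of the coordinate-system transformation described above, chaining a link's local orientation onto its parent's global orientation is a quaternion (Hamilton) product; a dependency-free Python version for illustration:

```python
def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

# Chaining a child link onto its parent: global = parent_global * child_local
```

Note the product is non-commutative, so the parent-child order matters when walking down the armature.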

      • Andre

        Thank you for the reply. So the sensors work off each other. My problem is that I have 2 sensors: 1 on the back of my hand and 1 on my headset. I’m still wrapping my head around how quaternions work. So they give not only rotation adjustments, but the coordinates are 3D spatial as well? I’m using Unity myself, but I will play around in Blender with my 1 sensor and see what I can learn. Keep up the good work. This stuff is pretty tricky to pick up.

        • Alvaro (Post author)

          Well no, they work independently; each sensor only gives you its absolute orientation, in either Euler angles or quaternions depending on what you ask for, although you’d want quaternions for robustness’ sake. Unity should be able to handle them directly. Good luck! 🙂

  16. Ganesh Kumar

    So, how can I connect with 2.8 EEVEE? There is no game engine support in 2.8, right? The code also seems to need updating for 2.8. Any help?

    • Alvaro (Post author)

      Well, I’m not sure. Honestly I didn’t know there was a new 2.8 version until I read this, I haven’t used Blender in a while.
      Normally I’d say that it should be directly compatible, but I see that this version has a lot of improvements and workflow changes, so if you’re not able to get it to work something will need to be modified.
      Sorry for not being of much help and best of luck!

  17. Saad

    Hi, this is an extremely noob-ish question, but do you have a circuit diagram or anything that can help me connect the Arduino to the multiplexers and then the multiplexers to the BNOs?

    I’m really new at this stuff and want to work on a project like that.

    I actually did try a version of this by connecting an MPU6050 to an ESP8266 and sending data to the PC via OSC. It worked (somewhat) but didn’t work out too well: I am having issues with drift, but it is also getting some rotations completely wrong (especially child limbs, such as forearms).

    I want to give your project a try but not really sure where to even begin.

    • Alvaro (Post author)

      Hi, I don’t have a diagram for it because it’s really pretty simple.
      Basically, the BNO sensors only have two possible addresses, so you can only attach two to a single I2C bus.
      To solve this, you just connect the Arduino’s I2C bus to the multiplexer chip’s input pins and then one or two sensors to each of the eight outputs.

  18. Nguyen

    Hi,

    I am going through the source code of your amazing project in order to learn something, as I am an absolute newbie to Python and Blender.
    In the MotioSuit.py file, I tried to read through every document online about quaternions, but I still can’t make sense of your code in the function def updateAngles().

    I don’t get why you used different methods to set the rotation_quaternion: for ‘armR’, ‘forearmR’, ‘armL’, ‘forearmL’ it is the following:
    mathutils.Vector([angles[0][0],angles[0][1],angles[0][2],angles[0][3]])

    But ‘trunk’, ‘upperLegR’, ‘lowerLegR’, ‘upperLegL’, ‘lowerLegL’ need to go through this procedure, for example for ‘trunk’:
    trunk = mathutils.Quaternion((angles[4][0],angles[4][1],angles[4][2],angles[4][3]))
    correction = mathutils.Quaternion((1.0, 0.0, 0.0), math.radians(90.0))
    trunk_out = correction*trunk

    As far as I understand, with MotioSuit.ino the code is getting the quaternion values q0, q1, q2, q3, but how does it manage to get so many angles for the angle update function?
    I do apologise if my question is too basic, or if I’ve missed something or lack knowledge.

    • Alvaro (Post author)

      Hi, thanks!
      You’re right, the code isn’t very clear and definitely not efficient, but I had much less experience back then.

      First, the angles are read from the serial port. The Arduino reads the quaternion values from each sensor and sends the data from all the sensors through the serial port. Blender then reads this data and separates it, first into sensors and then into the quaternion components of each sensor. That’s how all the sensor values are updated every time the function is called.
      For example, you can see that armR takes its values from sensor 0, and armL from sensor 2:
      ob.channels['armR'].rotation_quaternion = mathutils.Vector([angles[0][0],angles[0][1],angles[0][2],angles[0][3]])
      ob.channels['armL'].rotation_quaternion = mathutils.Vector([angles[2][0],angles[2][1],angles[2][2],angles[2][3]])

      Secondly, the reason for the whole block of code [lines 51-70] is to apply a 90-degree correction to those specific angles, only because the model has its arms perpendicular to the rest of the body (don’t ask me why, it doesn’t make any sense, but that’s how I did it). Because of that angle offset in the armature, the incoming data had to be “corrected”.
      You can use the code as it is, it works, but it will perform unnecessary calculations at every update. If you’re planning to use this as a basis for a new project, I would highly recommend making sure the armature alignment makes sense and updating the values directly, rather than correcting them later in software.

      Hope this helps!
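The 90-degree correction quaternion above can also be built without mathutils; a small illustrative sketch (the function name here is made up for the example):

```python
import math

def axis_angle_quat(axis, angle_deg):
    """Quaternion (w, x, y, z) for a rotation of angle_deg about a unit axis;
    equivalent to mathutils.Quaternion(axis, math.radians(angle_deg))."""
    half = math.radians(angle_deg) / 2.0
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

# The correction used in MotioSuit.py: 90 degrees about the X axis
correction = axis_angle_quat((1.0, 0.0, 0.0), 90.0)
```

Multiplying this correction onto the incoming sensor quaternion is what the trunk/leg branch of updateAngles() does.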

      • Nguyen

        Thank you so much, Alvaro, for your response; it cleared the fuzzy cloud in my head.

        May I ask your advice for my project regarding the wireless communication? I am thinking of building 9 different wearable units, each consisting of 1 Arduino (Nano or Micro) + 1 IMU sensor (6DOF) + 1 Bluetooth 5.0 module and a battery, all sending data to the computer (an extra Arduino Uno board wired up with a Bluetooth 5.0 receiver module that sends the data collected from the 9 wearable units to Blender through the serial port). Is this approach possible?

        Also, regarding the alignment of the sensor’s coordinate system against the model’s local segment: did you encounter that sort of issue? I have read some research documents that mention this, but I can’t fully understand the principle behind it.

        Best Regards.
