Open Source Motion Capture Suit - I am looking for supporters.


    #16
    Update 2!

    Possible positions of sensors on the glove:
[Image: UE4_Editor_2016_10_07_17_55_01_12.jpeg]
And a simple animation, for future testing of the glove:


    P. S. Still working on electronic circuits!

    Post updated!



      #17
      Update 3, 2016.10.09.
I bought one ESP-12 and five breakout boards with the CD74HC4067 (datasheet: http://www.ti.com/lit/ds/symlink/cd74hc4067.pdf); board photo:
[Image: 5PCS-CD74HC4067-16-Channel-Analog-Digital-Multiplexer-Breakout-Board-Module-For-Arduino.jpg_640x.jpg]
Some basic schematic experiments with the 4067:
[Image: screenshot_370.jpeg]
And a short video.
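The CD74HC4067 routes one of its 16 channels to a single common pin, with the channel number encoded in binary on the four select lines S0–S3. A minimal Python sketch of that select logic, assuming an ESP-12 would drive four GPIOs with these levels and then read the shared ADC pin (the `read_adc` callback here is a stand-in, not real hardware access):

```python
def mux_select_bits(channel):
    """Return the S0..S3 logic levels that route `channel` (0-15)
    of a CD74HC4067 to the common pin. S0 is the least-significant bit."""
    if not 0 <= channel <= 15:
        raise ValueError("CD74HC4067 has 16 channels (0-15)")
    return [(channel >> bit) & 1 for bit in range(4)]

def poll_all_channels(read_adc):
    """Sweep all 16 channels, calling read_adc(ch) after each select.
    In firmware, the select bits would be written to GPIOs before reading."""
    readings = []
    for ch in range(16):
        s0, s1, s2, s3 = mux_select_bits(ch)
        # Firmware would set the S0..S3 pins to (s0, s1, s2, s3) here.
        readings.append(read_adc(ch))
    return readings
```

With one multiplexer per finger board, this lets a single ADC pin poll up to 16 flex or pressure sensors in sequence.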
      Last edited by Arthur Khusnutdinov; 10-09-2016, 08:43 AM.



        #18
So if it's open source, does that mean it's free? As in, I give you my email address and you ship it to me for free?
        ///DevDreReid///



          #19
          Hello.

I mean that the software will (possibly) be free, and the electronic circuits of the MOCAP suit will be open source.
Yes, that means the technology will be free: everybody will be able to buy the components from a shop (such as AliExpress or eBay), or from us, and assemble the suit from our schematics.

For example, Noitom does not share the documents, schematics, or electronic circuits of their suit. You can only buy the suit from them; you can't assemble it from scratch yourself.



            #20
I just found this because I was researching the idea of using the MPU-9250s for markerless mocap. Or rather, a markerless suit that would help clean up markerless camera capture. There is a reason the Neuron costs so much: they did years of R&D.

Here is a really in-depth write-up comparing DoF sensors. It's not as easy as just plugging sensors into a brain. Also, calibration is a pain in the *** with any inertial sensors.
https://github.com/kriswiner/MPU-605...-Sensor-Fusion

Why did you choose to use the Orange Pi +2?

You should take a look at this site; they have an open-ish DIY wearable IMU suit. Maybe get in contact with them?
https://inmagicwetrust.wordpress.com...acking-sensor/

Have you seen the CyberGlove series of products? My friend has an older CyberGlove 2 and it's pretty sweet.

Idk, I just bought a motion capture array for markerless capture. I luckily have a completely empty room for this; for someone without an entire room to dedicate, a Kinect 2 seems like a way cheaper and much more feasible approach than making a suit.
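On the calibration pain: one common first step for IMU magnetometers is hard-iron calibration, i.e. rotating the sensor through all orientations, recording the per-axis min/max, and subtracting the midpoint. A minimal sketch of that idea (the sample values below are made up for illustration):

```python
def hard_iron_offsets(samples):
    """Estimate per-axis hard-iron offsets from raw magnetometer samples
    collected while rotating the sensor through all orientations.
    Each sample is an (x, y, z) tuple; offset = (max + min) / 2 per axis."""
    axes = list(zip(*samples))  # transpose into per-axis value lists
    return tuple((max(a) + min(a)) / 2 for a in axes)

def apply_offsets(sample, offsets):
    """Subtract the estimated offsets so readings are centred on zero."""
    return tuple(v - o for v, o in zip(sample, offsets))
```

Soft-iron (scale/skew) correction and gyro bias estimation are separate steps on top of this, which is part of why fusion write-ups like Kris Winer's run so long.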
            Last edited by SaxonRah; 02-07-2017, 06:12 AM.
            Youtube
            Machine Learning C++ Plugin
            Lindenmayer System C++ Plugin



              #21
              Originally posted by SaxonRah View Post
Hi. I know Kris Winer well; he is a genius! But I don't agree with you about calibration: it is not difficult to calibrate.

I need the Orange Pi for easy experimenting, but the hub will be based on the ESP8266 chip (with embedded WiFi).

You should take a look at this site; they have an open-ish DIY wearable IMU suit. Maybe get in contact with them?
https://inmagicwetrust.wordpress.com...acking-sensor/
Whoa! Such big boxes! ))) I plan to make mine very small, with only one box on the fist. That box will process the data for the whole hand: fingers, fist, arm, etc. That is, a main hub on the torso and two boxes, one on each hand.

The main problem is that I am not from the EU/US; I am from the CIS. So even if I finish such a MOCAP device, I will not be able to sell it from Russia to the EU/US...

I am looking for a company that could hire me; I am ready to relocate to the US or EU to work as an electronics engineer.
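One way a wrist box could stream fused orientation data to the torso hub over the ESP8266's WiFi is as small fixed-size binary packets, e.g. a sensor id plus a quaternion. A sketch of such framing in Python (this layout is an assumption for illustration, not the project's actual protocol):

```python
import struct

# Hypothetical packet: 1-byte sensor id + four little-endian float32 (w, x, y, z).
PACKET_FMT = "<B4f"
PACKET_SIZE = struct.calcsize(PACKET_FMT)  # 17 bytes

def pack_sample(sensor_id, quat):
    """Serialise one orientation sample for transmission to the hub."""
    w, x, y, z = quat
    return struct.pack(PACKET_FMT, sensor_id, w, x, y, z)

def unpack_sample(payload):
    """Inverse of pack_sample, run on the hub side."""
    sensor_id, w, x, y, z = struct.unpack(PACKET_FMT, payload)
    return sensor_id, (w, x, y, z)
```

At 17 bytes per sensor per frame, even a full glove's worth of sensors fits easily into one UDP datagram per frame.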
              Last edited by Arthur Khusnutdinov; 02-07-2017, 06:40 AM.



                #22
                Or as a game developer ).
                Last edited by Arthur Khusnutdinov; 02-07-2017, 06:41 AM.



                  #23
Hey Arthur, talk to IKinema; they are a mocap company based in the UK.

I am unsure if they are hiring, but it is worth a shot.



                    #24
                    Save money by using other sensors.

I was also thinking of starting the same project just for fun, and while searching for info on Google I found this. I guess you can reduce costs by using gyroscope-only sensors instead of gyroscope-plus-altitude-and-more sensors. For example, if you rotate the arm 45 degrees, the hand will also rotate; by knowing the angle of the arm and having a sensor on the hand, you can calculate the position of each one. You would only need an altitude sensor to register whether the person jumps, and an accelerometer to register movement forward, backward, left, and right.
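Deriving the hand position from joint angles like this is basic forward kinematics: each segment's angle adds to its parent's, and positions accumulate along the chain. A minimal planar sketch (segment lengths and angles here are arbitrary examples):

```python
import math

def chain_positions(lengths, angles_deg):
    """Forward kinematics for a planar chain (e.g. upper arm, then forearm).
    Each joint angle is relative to its parent segment; returns the (x, y)
    position of each segment's end, with the shoulder at the origin."""
    x = y = 0.0
    heading = 0.0
    points = []
    for length, angle in zip(lengths, angles_deg):
        heading += math.radians(angle)  # angles accumulate down the chain
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        points.append((x, y))
    return points
```

The real suit works in 3-D with quaternions per joint, but the accumulation principle is the same.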

The software doesn't have to be too difficult. For example, you could use Blender and just rotate each bone of the model according to the angle registered by the sensor. You would only need to create a module for Blender (which is pretty easy) and communicate with the Orange Pi using a socket. If you want to use other software instead of Blender, you can also write a module for it. I estimate a day or two to create a functional Blender module, and only a little more for a graphical interface.
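As a rough illustration of the socket idea: the Orange Pi could send newline-delimited text updates, and the Blender-side module would parse them and apply the angles to pose bones. The `bone_name:angle` line protocol below is a hypothetical example, and the parsing is shown standalone (inside Blender, each pair would drive a bone via the `bpy` API):

```python
def parse_bone_updates(data):
    """Parse newline-delimited 'bone_name:angle_degrees' messages, as they
    might arrive on a socket from the Orange Pi. Returns {bone: angle};
    malformed lines without a ':' separator are skipped."""
    updates = {}
    for line in data.strip().splitlines():
        name, _, angle = line.partition(":")
        if angle:
            updates[name.strip()] = float(angle)
    return updates
```

A Blender add-on would read the socket on a timer and assign each parsed angle to the matching bone's rotation each frame.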

You can also save more money by using an Arduino, but without experience with it (as in my case) the process would be more complicated.
I want to start working on it in two or three months, after I solve a few things.



                      #25
                      Originally posted by ADRIANMRIT View Post
It is not a good idea to use an Arduino for such projects, because of performance issues...
I have some results, but I have frozen my work because I haven't found producers/investors for my project.
No money, no honey...
                      Last edited by Arthur Khusnutdinov; 05-11-2017, 12:09 PM.



                        #26
                        Originally posted by HeadClot View Post
                        Hi!
                        Thank you.
Here is the answer from them (IKinema):

                        Difficult to pay your dinner from this, not impossible but difficult. Market is small, only geeks, then you have to compare to IPI, Noitom, Orion and other low cost solutions …



                          #27
Hello, I just read this topic for the first time and I think I spotted some areas you have missed.

I specialize in body language, and the system needs to capture the lower arm twisting a full 180 degrees; the thumb needs a full 180 degrees of left-to-right travel across the palm, and it can rotate nearly 360. The palm itself can bend: try touching your pinky finger with your thumb.

And what is the frames-per-second output of those motion capture suits/gloves? Because if it fails to capture the body language, the result will look too stiff or robotic.



                            #28
                            Originally posted by Dan3DIM View Post
Hi. It will not be a very high-speed capture suit. In my experiments I polled the sensors at 80 and 120 Hz. The first prototype will run at up to 160 Hz; in production it is planned to run at 200 Hz.
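Those rates set a hard time budget per sensor: at 200 Hz one frame is 5 ms, so with, say, 17 sensors on a suit (a made-up count for illustration), each read must finish in well under 300 µs. The arithmetic is simple to sketch:

```python
def per_sensor_budget_us(rate_hz, sensor_count):
    """Microseconds available per sensor read per frame, ignoring
    transmission overhead (so this is an optimistic upper bound)."""
    frame_us = 1_000_000 / rate_hz
    return frame_us / sensor_count
```

This is one reason multiplexed analog reads and fast I2C/SPI polling matter more as the frame rate and sensor count go up.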

