FrontPage

What's this?

  • This product is the "Auto Chasing Turtle".
  • Under autonomous control, the robot recognizes a person's face and approaches the detected human. What it is doing can be watched in real time on an iPad. It is built with a Kinect as the sensor and a Linaro kernel + Android + openFrameworks as the application framework.
  • YouTube Video

What to prepare

Reference data

Download

Detailed explanation of the source code

Hardware

  • Connect all the hardware.
  1. Battery
  2. BeagleBoard-xM
  3. Kinect
  4. WiFi router

Software

What does it do?

  1. Capture the RGB camera's image
    1. Save the image as a JPEG
  2. Try to detect a face
    1. If that fails, turn in a random direction to search
  3. Calculate the course the robot should follow
    1. Adjust the Kinect's tilt angle
  4. Calculate the distance to the target
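  • As a rough guide to how these steps fit together, here is a minimal Java sketch of one pass of the loop. The method name processFrame() and the helpers toBitmap(), steer() and walk() are illustrative placeholders; OFAndroid, FaceDetector and DroidBot are the classes used in the code excerpts below.
        // Illustrative sketch only -- not the project's exact code.
        void processFrame(int w, int h) {
            // 1. Grab the current RGB frame and convert it to a Bitmap ("Save the image as a JPEG").
            byte[] pixels = OFAndroid.getImgPixels();
            Bitmap bitmap = toBitmap(pixels, w, h);   // placeholder; must yield an RGB_565 bitmap for FaceDetector

            // 2. Try to detect a face ("Detect Face").
            FaceDetector.Face[] faces = new FaceDetector.Face[1];
            FaceDetector detector = new FaceDetector(w, h, faces.length);
            if (bitmap == null || detector.findFaces(bitmap, faces) == 0) {
                // no face: turn in a random direction and try again on the next frame
                return;
            }

            // 3. Steer toward the face and tilt the Kinect toward it ("Calculate Course & Distance").
            PointF mid = new PointF();
            faces[0].getMidPoint(mid);
            int pointX = (int) mid.x, pointY = (int) mid.y;
            steer(pointX, w);                         // placeholder: quarter-of-the-image turn logic
            OFAndroid.setAngle(30 - pointY * 30 / h);

            // 4. Walk toward (or back from) the face depending on the depth reading.
            int dist = OFAndroid.getDistance(pointX, pointY);
            walk(dist);                               // placeholder: threshold-based walk commands
        }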

Detect Face

  • Take the RGB camera's image
    • Grab it from the Kinect's RGB camera via ofxDroidKinect.
      • testApp.cpp : line 36
        void testApp::draw() {
            kinect.draw(0, 0, 480, 320);
            // 'flame' counts frames; copy the RGB pixels into colorImg every 5 frames
            flame++;
            if (flame > 5) {
                flame = 0;
                colorImg->setFromPixels(kinect.getPixels(), kinect.width, kinect.height);
            }
        }
  • Save the image as a JPEG
    • Android's FaceDetector class can't detect faces in the Kinect's raw pixel format, so the pixels are copied into a Bitmap and saved as a JPEG.
      • OFActivity.java : lines 53-72
        Bitmap bitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
        byte[] pixels = OFAndroid.getImgPixels();

        if (pixels != null) {
            // 3 bytes (R, G, B) per pixel; mask with 0xff because Java bytes are signed
            for (int i = 0; i < w; i++) {
                for (int j = 0; j < h; j++) {
                    int index = (j * w + i) * 3;
                    bitmap.setPixel(i, j, Color.rgb(pixels[index] & 0xff,
                                                    pixels[index + 1] & 0xff,
                                                    pixels[index + 2] & 0xff));
                }
            }

            try {
                // save to sdcard
                FileOutputStream fos = new FileOutputStream("/sdcard/screenshot.jpg");
                bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fos);
                fos.flush();
                fos.close();
            } catch (Exception e) {
                // TODO: handle the exception
            }
        }
  • Try to detect a face
    • Use Android's FaceDetector class. (A sketch of how the detected face's position pointX/pointY can be read out follows at the end of this section.)
      • OFActivity.java : lines 77-80
            FaceDetector.Face[] faces = new FaceDetector.Face[1];
            if (bitmap != null) {
                FaceDetector detector = new FaceDetector(w, h, faces.length);
                int numFaces = detector.findFaces(bitmap, faces);
  • If detection fails, turn in a random direction to search
      • OFActivity.java : lines 138-144
        } else {
            /*
             * no face found: turn in a random direction to keep searching
             */
            Random random = new Random();
            int res = random.nextInt(4);    // 0..3, one value per turn command
            if (res == 0)           DroidBot.getInstance().turnRight2();
            else if (res == 1)      DroidBot.getInstance().turnRight4();
            else if (res == 2)      DroidBot.getInstance().turnLeft2();
            else if (res == 3)      DroidBot.getInstance().turnLeft4();
        }
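  • The excerpts above stop before the detected face's position is read out; pointX and pointY, used in the next section, are presumably the midpoint of the first detected face. Below is a minimal sketch of one way to obtain them with the FaceDetector API (variable names are illustrative). Note that findFaces() only accepts RGB_565 bitmaps, so the ARGB_8888 bitmap created earlier has to be converted first.
        // Illustrative sketch -- not the project's exact code.
        Bitmap bitmap565 = bitmap.copy(Bitmap.Config.RGB_565, false);   // FaceDetector requires RGB_565

        FaceDetector.Face[] faces = new FaceDetector.Face[1];
        FaceDetector detector = new FaceDetector(w, h, faces.length);
        int numFaces = detector.findFaces(bitmap565, faces);

        int pointX = -1, pointY = -1;
        if (numFaces > 0 && faces[0] != null) {
            PointF mid = new PointF();
            faces[0].getMidPoint(mid);   // midpoint between the eyes
            pointX = (int) mid.x;        // decides the course (next section)
            pointY = (int) mid.y;        // decides the Kinect angle (next section)
        }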

Calculate Course & Distance

  • Calculate the course the robot should follow
    • The RGB image is divided into four pieces along its width, and the course is decided by the face's horizontal position.
      • OFActivity.java : lines 103-109
        if (pointX > 0 && pointX < w/4) {
            DroidBot.getInstance().turnRight2();    // right position
        } else if (pointX >= w/4 && pointX <= 3*w/4) {
            ;                                       // center position
        } else if (pointX > 3*w/4 && pointX <= w) {
            DroidBot.getInstance().turnLeft2();     // left position
        }
  • Move the Kinect's tilt angle
    • The Kinect's maximum tilt angle is 30 degrees, and the angle is decided from the face's vertical position: a face at the top of the image tilts the Kinect fully up (angle 30), a face at the bottom brings it down to 0.
      • OFActivity.java : lines 117-119
        int angle = 30 - pointY*30/h;
        if (angle > 0 && angle <= 30)
            OFAndroid.setAngle(angle);
  • Calculate the distance to the target
    • Get the depth reading from the Kinect's depth camera via ofxDroidKinect.
    • The walking command is decided on the basis of the distance to the face's position. (A worked example follows at the end of this section.)
      • OFActivity.java : lines 127-132
        int dist = OFAndroid.getDistance(pointX, pointY);
        if (dist < 100)                     DroidBot.getInstance().walkBack4();
        else if (dist >= 100 && dist < 150) DroidBot.getInstance().walkToward4();
        else if (dist >= 150 && dist < 200) DroidBot.getInstance().walkToward8();
        else if (dist >= 200 && dist < 300) DroidBot.getInstance().walkToward16();
        else if (dist >= 300)               DroidBot.getInstance().walkToward32();
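  • As a worked example (assuming the Kinect's 640x480 RGB image): a face midpoint at pointX = 100 falls in the first quarter of the width (100 < 640/4 = 160), so turnRight2() is sent; pointY = 120 gives angle = 30 - 120*30/480 = 23 (integer division), so the Kinect tilts to 23 degrees; and a depth reading of dist = 180 falls in the 150-200 band, so walkToward8() is sent.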

Control Robot

  • The KONDO Animal is controlled over a serial connection.
  • The KONDO Animal uses the RCB-3 control unit, and control commands are sent to it over serial (a minimal sketch follows).
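  • A minimal sketch of what one of the DroidBot move methods might do, assuming the RCB-3 is reached through a serial device node. The device path and the command bytes are placeholders (not the real RCB-3 protocol), and the port is assumed to be already configured (baud rate, etc.).
        // Illustrative sketch only -- device path and command bytes are placeholders.
        import java.io.FileOutputStream;

        public class SerialCommandSender {
            private static final String SERIAL_DEV = "/dev/ttyUSB0";   // assumed serial device node

            // Write one raw command frame to the RCB-3 over serial.
            public void sendCommand(byte[] command) {
                FileOutputStream serial = null;
                try {
                    serial = new FileOutputStream(SERIAL_DEV);
                    serial.write(command);
                    serial.flush();
                } catch (Exception e) {
                    // serial port not available or write failed
                } finally {
                    try { if (serial != null) serial.close(); } catch (Exception ignored) {}
                }
            }
        }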

Connect iPad

  • The iPad viewer is a VNC client.
    • The robot runs Android VNC Server (androidvncserver), launched as a service from init.rc:
      • init.rc
        service vncserver /data/androidvncserver -k /dev/input/event0 -t /dev/input/event0

Contact us

  • info @ siprop.org

In closing

  • If this is helpful to you, please donate to relief efforts for the 2011 Japanese earthquake and tsunami.
  • We are a Japanese community team. We pray for Japan's recovery.
