CSTA Day 2 AM: Out of Your Seat Comp Sci - Coding Using the Kinect

Out of Your Seat Comp Sci: Coding Using the Kinect (Doug Bergman)

www.innovativeteacher.org
@dougbergman
Doug Bergman, Porter-Gaud School, Charleston SC – K-12
In this session we looked at code and coding for kinetic movement in the Computer Science classroom using the Microsoft Kinect, with a demonstration of student projects and a brief review of some of the code behind them. This included interpreting skeletal data, facial recognition and voice recognition, as well as a look at some other kinetic movement instruments such as the dance pad and the LEAP Motion. Doug has had some incredible projects over the years, and he shared how they are done in his high school class. The concept of using your entire body as user input allows for a different type of thinking from students. The session included members of the audience interacting with his projects, and generated discussion about this type of out-of-your-seat experience.
Microsoft Kinect bar – ~$50 on eBay, easy to find from a broken Xbox
Need to experience it

Using the data is easy enough – getting the data is the tricky part.
Projects shown were from Juniors (~16 years old, who have done 2 years of programming).
The Kinect is really just a webcam with 3D depth sensing and some internal processing.
3 modes –

  • Webcam
  • Skeleton streaming – looks at various joints (25 or so)
  • Pixel depth mode

Use Visual Studio 2010 Express (free). Uses C# and XNA – the Kinect SDK sits on top of this.

New camera out, so old ones are cheap.

Can also use Python, Processing, Java, Scratch for coding the Kinect.

Skeleton mode – tracks very accurately.
Kids need to think in 3D.

Basic logic of reading the Kinect:
The x and y values of each joint are sent, and are scaled to the screen.
Actions are based on the position of joints and their movement in an x/y sense.
Positions are updated ~50 times per second.

Joint comparisons are simple if statements based on variables provided by the Kinect software.

e.g. is the right hand close to the hip?

e.g. are the hands crossed?

x and y are in pixels, z is in metres (very high accuracy), so the program knows where you are in 3-space.
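The basic pattern really is just a few lines of C#. A minimal sketch, assuming the Kinect for Windows SDK v1 (Microsoft.Kinect namespace) and a tracked Skeleton already pulled from a skeleton frame; the 640x480 scaling and the 15 cm threshold are illustrative choices, not Doug's actual values:

    using System;
    using Microsoft.Kinect;

    class JointChecks
    {
        // Skeleton-space x/y are in metres around the sensor centre; map them
        // onto screen pixels (a 640x480 display area is assumed here) for drawing.
        static float ScaleX(float x) { return (x + 1f) / 2f * 640f; }
        static float ScaleY(float y) { return (1f - y) / 2f * 480f; }

        // "Is the right hand close to the right hip?" - a simple joint comparison.
        static bool RightHandNearHip(Skeleton s)
        {
            SkeletonPoint hand = s.Joints[JointType.HandRight].Position;
            SkeletonPoint hip = s.Joints[JointType.HipRight].Position;
            // Joint positions are in metres, so 0.15 means "within about 15 cm".
            return Math.Abs(hand.X - hip.X) < 0.15f && Math.Abs(hand.Y - hip.Y) < 0.15f;
        }

        // "Are the hands crossed?" - right hand further left than the left hand.
        static bool HandsCrossed(Skeleton s)
        {
            return s.Joints[JointType.HandRight].Position.X <
                   s.Joints[JointType.HandLeft].Position.X;
        }
    }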
Question – do you want to use the data first, or develop the code that gets it (which is the hard part)?
Doug strongly suggests using the data and doing the cool stuff first.

Sample projects:

  • Concussion Checker
  • Discus Throw
  • Somersault with photo midway
  • Flexibility Checker
  • Flight sim
  • Cheerleading drills
  • Graphics
  • Music exploration
  • Tennis stroke analysis
  • Tennis – can we analyse a tennis serve? Store points in an array
  • Dance off
  • Grammar teacher?
  • Self Defence practice
  • Cheerleading – gave an index of cheer types; students then follow the different moves. Took a snapshot of the initial position, then identified changes in body position
  • Trivia Game – using movements like: raise hand to start, move shoulders to spin

e.g. if (Math.Abs(rShoulderZ - shoulderZ) > 0.2)
// detects a "throw" of the shoulder (one shoulder pushed ~0.2 m further forward)

  • Anatomy lesson – pick up a scalpel to point to body organs
  • Concussion in football players: took initial baseline data, e.g. arms out, stand on one leg for 10 secs, etc. Was then able to re-do this after a suspected concussion, compare, and identify if something was wrong.
  • Run through time – used a kindergarten graphic; running action to advance, jump to select. The kid did more exercise in class than in PE.

Detecting running – look at the left and right ankle y values and check whether the difference between them is greater than, say, 25 (one foot lifted above the other).
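As a rough sketch of that check (one reading of the note above, with the ankle y values assumed to be already scaled into screen pixels and the 25-pixel threshold taken from the note):

    using System;

    class RunDetector
    {
        // Treat it as a running step whenever the two ankles are at noticeably
        // different heights on screen, i.e. one foot is lifted above the other.
        static bool IsRunningStep(float leftAnkleScreenY, float rightAnkleScreenY)
        {
            return Math.Abs(leftAnkleScreenY - rightAnkleScreenY) > 25f;
        }
    }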

  • Skiing – looked at shoulder motion compared to the body – left and right sway as you move down the slalom
  • Basketball – imported video to portray the nature of the shot detected by the Kinect
  • Soccer penalties – again, video based on 9 goalie saves
  • Chemical molecules – drag atoms etc
  • 3D plane flying over the ocean, controlled with arms and a DancePad. Very significant mathematics!
  • Lots of 15 minute challenges

Doug emphasised the value of failure and the ability to overcome it. He ran this as a 4-month project, giving students time to fail.

“I need you to fail in order to learn”

Kinect features

Can use multiple cameras

Interaction is based on rectangles around the hands and also hidden rectangles on screen, so the program can detect intersections, catching, etc.
Maths involved – absolute value, comparisons, trig, etc.
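For example, using XNA's Rectangle type, a small box around the hand can be tested against a hidden target box (the box sizes and names here are illustrative, not from Doug's projects):

    using Microsoft.Xna.Framework;

    class CatchCheck
    {
        // Put a small box around the (screen-space) hand position and test it
        // against an invisible target box, e.g. an object to be caught.
        static bool HandTouchesTarget(int handX, int handY, Rectangle target)
        {
            Rectangle handBox = new Rectangle(handX - 20, handY - 20, 40, 40);
            return handBox.Intersects(target);
        }
    }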

The Kinect camera also does voice recognition, e.g. it can recognise the spoken word “NEXT”.

Make a parabola with your body and see it on the screen. Can freeze/unfreeze the image by voice.

Add words to the speech engine, then teach it the grammar for how to connect the words.

The event handler returns a string; you can then apply confidence levels and define actions based on the words.
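A rough sketch of that pattern with the Microsoft.Speech recognition classes commonly used alongside the Kinect SDK (the word list, 0.7 confidence threshold and actions are illustrative; feeding the Kinect's microphone-array audio into the engine is omitted and the default microphone is used instead):

    using System;
    using Microsoft.Speech.Recognition;

    class VoiceCommands
    {
        static void Main()
        {
            var engine = new SpeechRecognitionEngine();

            // Add words to the speech engine, then build a simple grammar from them.
            var words = new Choices("next", "freeze", "unfreeze");
            engine.LoadGrammar(new Grammar(new GrammarBuilder(words)));

            // The event handler hands back the recognised string plus a confidence
            // value; only act when the engine is reasonably sure.
            engine.SpeechRecognized += (sender, e) =>
            {
                if (e.Result.Confidence > 0.7)
                {
                    switch (e.Result.Text)
                    {
                        case "next": Console.WriteLine("advance"); break;
                        case "freeze": Console.WriteLine("freeze the image"); break;
                        case "unfreeze": Console.WriteLine("unfreeze the image"); break;
                    }
                }
            };

            engine.SetInputToDefaultAudioDevice();
            engine.RecognizeAsync(RecognizeMode.Multiple);
            Console.ReadLine();   // keep listening until Enter is pressed
        }
    }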

Specific considerations for DoE Tasmania

 

The Kinect device is affordable, intelligent and common. The programming tasks demonstrated were not beginner tasks, but they were within the reach of most Year 9 and 10 students who might elect to study computing. The usefulness of the outcomes suggests that this is well worth considering as part of the 9/10 course and at the Year 11/12 senior secondary level. The main cautions are the time needed for each student to implement a significant personal project, and the physical space needed to accommodate 25 students all trying to test programs that require physical movement. Integrating with HPE could give rise to some interesting possibilities.

 
