LPC

Prototyping A Sports Wearable With Micro:bits

6/20/2018


What are we doing?

We're building a create-your-own wearable kit that brings machine learning and statistics concepts to students. We envision students asking questions about sports performance, such as "How can I improve my tennis serve?", "What percentage of my shots are accurate over time?", or "Why doesn't our basketball team ever win?" They would then use wearable sensors to collect data from the body, analyze it with the tools we provide, and draw conclusions.

The bulk of this work involves building an app that lets the user customize data collection and analysis from on-body sensors, and designing curricula to support its use in secondary math and gym classes. This post focuses on our work on the application. The app needs four main capabilities:
  1. Capture video and sensor data
  2. Let the user segment gestures and assign labels and classifications
  3. Train machine learning algorithms on the segments provided by the user
  4. Test the trained algorithm on new user input
We're currently prototyping with Micro:bits and iPhones. We just completed work on capturing video and sensor data, and wanted to share some of our demos and next steps!

Demo: Capturing video and sensor data

Our first goal was to make the Micro:bit's accelerometer data available in the app. We connected to the Micro:bit over Bluetooth, calculated the magnitude of acceleration at each reading, and graphed the results over time.
In this video, a makeshift wearable is tied to Varun's wrist. Varun demonstrates the live graph by doing some jumping jacks; the live accelerometer data can be seen on the screen behind him.
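
The magnitude calculation mentioned above is simple. Here's a minimal Python sketch (the app itself runs on the iPhone, so this is purely illustrative):

```python
import math

def magnitude(x: float, y: float, z: float) -> float:
    """Euclidean magnitude of a 3-axis accelerometer reading."""
    return math.sqrt(x * x + y * y + z * z)

# A Micro:bit lying flat at rest reads roughly (0, 0, -1024) milli-g,
# so the magnitude should come out near 1024 (i.e., 1 g).
print(magnitude(0, 0, -1024))  # → 1024.0
```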

At this stage, the accelerometer output was not actually synced to the video. To address this, the app now starts collecting data from the accelerometer only when the user presses the video record button, and stops when the user stops recording. The graph presents this data continuously over the time interval of the recording.

The graph also acts as a timeline: the user can tap on a point in the graph and the video jumps to that timestamp. We are also now graphing x, y, and z acceleration instead of magnitude, because the separate axes are easier for the user to interpret.

Check out what you can do with our app now:

Bridget uses the app to record two different hockey swings. Then she watches the playback alongside the x, y, and z acceleration graph, and scrubs through the video with the graph's timeline feature.


Next Steps

An app like this generates a lot of data. We currently capture ten (x, y, z) readings per second; for a five-minute video, we'd capture 3,000 points. We'll also need to store segmented gestures, their labels, and classifications. This could get out of hand pretty quickly, so it's important for us to develop a data storage system for the application. We'll be using Apple's Core Data framework for this project, because we are mainly concerned with local data storage (which Core Data makes pretty easy).
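
The arithmetic behind that estimate, plus a hypothetical shape for one stored sample (the real app will use Core Data entities rather than Python objects; the `Reading` name here is ours):

```python
from dataclasses import dataclass

# Back-of-the-envelope check on the data volume quoted above.
SAMPLE_RATE_HZ = 10
VIDEO_SECONDS = 5 * 60
readings = SAMPLE_RATE_HZ * VIDEO_SECONDS  # (x, y, z) readings per video
values = readings * 3                      # individual floats to store
print(readings, values)  # → 3000 9000

@dataclass
class Reading:
    """Hypothetical shape of one stored sample."""
    timestamp: float  # seconds into the recording
    x: float
    y: float
    z: float
```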

We're also building out step 2: letting users segment gestures and assign labels and classifications. Users will identify the beginning and end of each gesture and assign it a label (e.g., "slapshot" or "pass"). They'll also be able to classify each gesture as "good" or "bad," although we'll later extend this to allow multiple classifications. This is an important next step in creating a foundation for users to explore machine learning.
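
One way to picture a labeled segment, and how it would slice the raw sample stream (names and structure here are our own illustration, not the app's actual data model):

```python
from dataclasses import dataclass

@dataclass
class GestureSegment:
    """Hypothetical record for one user-labeled gesture."""
    start_s: float       # beginning of the gesture, seconds into the video
    end_s: float         # end of the gesture
    label: str           # e.g. "slapshot", "pass"
    classification: str  # "good" or "bad" for now; multi-class later

def extract_samples(samples, segment, sample_rate_hz=10.0):
    """Slice the raw sample list down to one labeled segment."""
    start = round(segment.start_s * sample_rate_hz)
    end = round(segment.end_s * sample_rate_hz)
    return samples[start:end]

seg = GestureSegment(1.0, 2.5, "slapshot", "good")
print(len(extract_samples(list(range(100)), seg)))  # → 15
```

These labeled slices are exactly what step 3 would feed to a classifier as training examples.
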
Signing off,
Abigail Zimmermann-Niefield, Bridget Murphy and Varun Narayanswamy
