
Dev Log

Behind the scenes of how MOVE AR was made

April 10 - April 17

Video: Learn Proper Exercise Form with Move AR

User Testing Feedback

We got a lot of new user feedback last week, which helped us find ways to make our app much more useful. There was too much feedback to incorporate all of it, but we prioritized the most important changes, such as:

  • telling the user useful information about each specific exercise (implemented by Patrick)

  • making it easier to move the model through drag-and-drop (implemented by Kavya)

  • making it easier to move the model through pinch (implemented by Kavya)

  • starting the model in front-facing camera for some exercises (implemented by Patrick)

  • making the front-facing camera joystick control rotation rather than position (implemented by Patrick)

Motion Capture

This is a play-through of the Android build, which doesn't have a recording feature.

We added some new animations for exercises through motion capture. Our project stakeholder was the person we recorded for the animations, and we went to the Visualization Studio to capture the data. After the motion capture data is cleaned and saved as FBX files, I import it into Unity and adjust the animation configurations. Since we didn't get enough animations from motion capture, I added more from Mixamo. Currently, the motion capture animations are not perfectly matched to the model we are using; in the future, we can adjust the model's skeleton to improve them. (Implemented by Chengyu)

Backend

Because we had so many new exercises and animations, we built an interface for adding new exercises easily: exercise data is entered in the editor, and the entire UI and the connections between screens are generated at runtime. This made adding exercises, and creating exercise-specific info screens, much easier. (Implemented by Patrick)
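
As a rough illustration of what such a data-driven setup can look like in Unity (the class and field names below are hypothetical, not our exact code), each exercise could be described by a ScriptableObject asset, with the selection UI generated from those assets at runtime:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of a data-driven exercise entry (names are illustrative).
[CreateAssetMenu(fileName = "NewExercise", menuName = "MoveAR/Exercise")]
public class ExerciseData : ScriptableObject
{
    public string exerciseName;            // label on the selection button
    [TextArea] public string infoText;     // shown on the exercise-specific info screen
    public AnimationClip clip;             // motion-capture or Mixamo animation
    public Sprite thumbnail;               // image on the selection screen
    public bool startInFrontCamera;        // some exercises begin in the selfie camera
}

// At runtime, a menu script can spawn one selection button per exercise asset.
public class ExerciseMenuBuilder : MonoBehaviour
{
    public ExerciseData[] exercises;       // assigned in the editor
    public Button buttonPrefab;            // a UI Button prefab with a child Text
    public Transform contentParent;        // e.g. the content object of a ScrollView

    void Start()
    {
        foreach (var exercise in exercises)
        {
            Button button = Instantiate(buttonPrefab, contentParent);
            button.GetComponentInChildren<Text>().text = exercise.exerciseName;
            // Clicking a button would hand its ExerciseData to the AR scene.
        }
    }
}
```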

Marketing

We recorded a marketing video. We all met together to make the script and recorded most of the voice lines together. Then, Kavya edited it, and we reviewed it and gave feedback for more changes. (Implemented by Kavya)

CV and Body Tracking

Although we were able to get CV to map out the user's body location last week, we decided not to work on CV and body tracking this week, mainly because we didn't want to introduce a huge feature for the exhibition with no time for testing.

Video Recording

Samuel had technical difficulties building the app with Xcode, so Hussain and Samuel worked together this week. We started the week by getting recorded videos to save to an in-app gallery. Through this gallery, the user can view, rename, and/or delete a recording they have made. This only worked on iOS devices because we did not have the paid version of the Unity asset for video recording. We debated whether it would be worthwhile to finish creating our own Android plugin for video recording and decided to attempt it, since it is a key feature. Although we got close, we were unable to get the plugin fully functional on Android in time for the showcase. (Implemented by Hussain and Samuel)
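
For context, a minimal sketch of the gallery's file bookkeeping could look like the code below, assuming the recordings end up as .mp4 files under Application.persistentDataPath (the folder layout and class name are illustrative; the recording itself is handled by the video asset / plugin):

```csharp
using System.IO;
using UnityEngine;

// Hypothetical sketch of the gallery's file bookkeeping (names are illustrative).
// Recording itself is handled by the video asset / plugin; this only assumes the
// resulting .mp4 files are saved under Application.persistentDataPath.
public static class RecordingGallery
{
    static string Folder => Path.Combine(Application.persistentDataPath, "Recordings");

    // List every saved recording so the gallery UI can show one entry per file.
    public static string[] List()
    {
        Directory.CreateDirectory(Folder);           // no-op if it already exists
        return Directory.GetFiles(Folder, "*.mp4");
    }

    // Rename keeps the file in the same folder and just swaps the display name.
    public static void Rename(string path, string newName) =>
        File.Move(path, Path.Combine(Folder, newName + ".mp4"));

    public static void Delete(string path) => File.Delete(path);
}
```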

April 3 - April 10

UI Improvements

We wanted to improve our UI to make it easier to navigate all the features of the app. We noticed in our playtests that there were some features (the info screen and the AR settings menu) that users usually didn't try, so we made the buttons for these more visible. We also improved our instructions, since many users had previously been confused at certain points. (Implemented by Patrick)

Favorites

We implemented the backend for selecting favorite exercises and made a separate tab in the UI to view favorite exercises. This took a little more time than expected though, since I might have over-engineered the implementation. In retrospect, this was probably a task I should not have spent so much time doing. But at least now, our app is very scalable! (Implemented by Patrick)
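
For anyone curious, a bare-bones version of a favorites backend could look something like the sketch below, which simply persists a set of exercise names through PlayerPrefs (our actual implementation is more elaborate, and these names are illustrative):

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Hypothetical favorites store (illustrative, not our exact code).
// Favorite exercise names are persisted as one comma-separated PlayerPrefs entry.
public static class Favorites
{
    const string Key = "favoriteExercises";

    public static HashSet<string> Load()
    {
        var raw = PlayerPrefs.GetString(Key, "");
        return new HashSet<string>(raw.Split(',').Where(s => s.Length > 0));
    }

    // Called when the user taps the favorite star on an exercise.
    public static void Toggle(string exerciseName)
    {
        var favorites = Load();
        if (!favorites.Add(exerciseName))
            favorites.Remove(exerciseName);     // already a favorite, so unfavorite it
        PlayerPrefs.SetString(Key, string.Join(",", favorites));
        PlayerPrefs.Save();
    }
}
```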

Human Model Customizations

We improved our settings menu for customizing the model. There is now clearer feedback about which model is currently selected, along with accurate images for each model. We created several new models for different body types (male, female, child) and added options for different opacities (solid or transparent). We also imported more animations from Mixamo along with images of different exercises, and implemented the backend for exercise selection to show the correct model, animation, and muscle highlights. (Implemented by Kavya, Chengyu, and Patrick)

CV and Body Tracking

We found during the playtest that users would like to be compared to the model in some way, to gauge whether they were achieving the correct form it demonstrates. This aligned with our original goal as well. Body tracking has been incorporated into the application via Unity's ARFoundation package, which can effectively place a model on top of the user. We still need to figure out exactly how to use this to give the user feedback, and what type of feedback the user would want. More testing will be needed before this can be fully incorporated. We aim to have this done by next week! (Implemented by Samuel)
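
A rough sketch of the ARFoundation side is below, assuming an ARHumanBodyManager exists in the scene (note that ARFoundation human body tracking is currently an ARKit/iOS feature; the class and field names here are illustrative rather than our exact code):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Rough sketch of following the tracked body with ARFoundation (illustrative names).
public class BodyOverlay : MonoBehaviour
{
    public ARHumanBodyManager bodyManager;   // assigned in the inspector
    public Transform overlayModel;           // the exercise model to place on the user

    void OnEnable()  => bodyManager.humanBodiesChanged += OnBodiesChanged;
    void OnDisable() => bodyManager.humanBodiesChanged -= OnBodiesChanged;

    void OnBodiesChanged(ARHumanBodiesChangedEventArgs args)
    {
        foreach (var body in args.updated)
        {
            // Keep the model on the tracked body anchor. Comparing individual
            // joints against the reference animation would be the next step
            // toward actual form feedback (not implemented yet).
            overlayModel.SetPositionAndRotation(body.transform.position,
                                                body.transform.rotation);
        }
    }
}
```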

Video Recording

The asset that we were using for video recording was unfortunately not cross-platform in the free version. We worked on making a plugin ourselves that would work on Android. We are having some difficulties with loading our plugin, but the feature is nearly complete. Additionally, we worked on fixing up the gallery scene such that users can save videos straight to the in-app gallery.
(Implemented by Hussain)

In Progress / Next Sprint

Next sprint, our focus will be

  • Add more instructions if anything is still unintuitive after the user test

  • Add as many exercise animations as we can find

  • Incorporate CV in a useful way (we might not have time)

  • Marketing

March 27 - April 3

Planning and UI

We implemented several new screens / functionalities for this milestone, including model placement with buttons, video recording and playback, menu screens and settings, and muscle highlighting. We added to our Figma to plan out these new changes. For the UI screen, we added a menu with a link to purchase a premium plan, a screen to change AR settings, a screen about account settings, and a link to the website. (Implemented by Patrick)

Interact Mode / Character Model Placement

We wanted to add ways for the user to edit the model’s position, rotation, and size after it was placed. We added several different buttons to allow the user to do this, as well as a visibility button that hides all other buttons in case the screen looks too cluttered. We tried to make it clear when the user is in cursor mode (to place the model initially) versus view mode (to view and adjust the model’s position, rotation, and size). These adjustments also work with the front-facing camera in order to make up for the front camera's lack of AR support. However, the front-facing camera only switches correctly on Android so far, so we are still figuring out how to make it work on iOS. Finally, we also added a way for the user to change playback speed. (Implemented by Patrick and Samuel)
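
The adjustment buttons boil down to small transform and animator tweaks. A simplified, hypothetical version (method and field names are illustrative, not our exact code) might look like this, with each method wired to a UI Button or joystick handler:

```csharp
using UnityEngine;

// Simplified sketch of the button-driven adjustments in interact mode.
public class ModelAdjuster : MonoBehaviour
{
    public Transform model;       // the placed exercise model
    public Animator animator;     // plays the exercise animation

    // Position nudges along the world axes.
    public void MoveX(float step) => model.position += new Vector3(step, 0f, 0f);
    public void MoveZ(float step) => model.position += new Vector3(0f, 0f, step);

    // Rotation around the vertical axis.
    public void Rotate(float degrees) => model.Rotate(0f, degrees, 0f);

    // Uniform scaling up or down.
    public void Scale(float factor) => model.localScale *= factor;

    // Playback-speed buttons simply change the Animator's speed multiplier.
    public void SetPlaybackSpeed(float speed) => animator.speed = speed;
}
```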

Human Model Customizations

We added a new screen to customize the model used by the user. This model will change to different body types depending on the user’s preferences. We will improve this screen’s UI in the next sprint hopefully. (Implemented by Kavya)

Human Model Highlights

Based on user feedback, we decided to change our model to be solid-colored, and we are using muscle highlighting to show the user which muscles are targeted by each exercise. In Blender, we divided the meshes of the human body model into 11 parts and assigned a different material to each part. In the Unity scene, a script assigns materials at runtime so the highlighting can be controlled. (Implemented by Chengyu)
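
In simplified form (the class and field names below are illustrative), the runtime side amounts to swapping materials on the per-part renderers:

```csharp
using UnityEngine;

// Simplified sketch of runtime muscle highlighting (illustrative names).
// The body model is split into separate meshes (one per muscle group) in Blender;
// each renderer gets either the highlight material or the default material.
public class MuscleHighlighter : MonoBehaviour
{
    public Renderer[] muscleGroups;     // one renderer per body part (11 in our split)
    public Material defaultMaterial;
    public Material highlightMaterial;

    // Called when an exercise is selected, with the indices of its target muscles.
    public void Highlight(params int[] targetIndices)
    {
        for (int i = 0; i < muscleGroups.Length; i++)
            muscleGroups[i].material = defaultMaterial;
        foreach (int i in targetIndices)
            muscleGroups[i].material = highlightMaterial;
    }
}
```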


The different buttons currently don’t correspond to the correct animations / muscle groups; we plan to build this functionality next sprint, when we will hopefully be able to make our own animations. We contacted VizStudio to request a session to learn about motion capture so that we could record our own animations, but they have not responded yet. In the meantime, we will use Mixamo as a proof of concept.

Video Recording and Playback

I found a suitable plug-in for our development needs that allows us to capture a screen recording and play it back to the user. Additionally, I built a UI panel that allows the user to view, save, or share their video. I created a gallery scene where the user can view all recorded videos that they have saved. I did this by creating a video file manager that keeps track of all of our videos. (Implemented by Hussain)

In Progress / Next Sprint

There are two main things we didn’t finish this sprint. The first is using motion capture to record our own animations for Unity; the second is finding a project stakeholder. Next sprint, we will spend more time trying to accomplish both! We also want to investigate using some computer vision to judge whether a user is in the correct form. We also noticed several bugs, which we added to the backlog for next sprint.

March 20 - March 27

Idea

We began by brainstorming possible ideas. We wanted to create something with strong AR emphasis, yet we also wanted to make sure our product played into the strengths of present-day AR technology rather than the weaknesses. Visualization is one of AR's biggest strong suits and it is not too technically challenging for mobile AR, so we looked for ideas related to visualization. 


As a team, we like the idea of using AR to visualize proper form for exercises, physical therapy motions, or sports motions. This has a strong AR emphasis and relies on AR's strong suits. This is our initial idea, but we are unsure which route to take (exercise, physical therapy, sports, etc.).

Design

We created a basic Figma design as well as a tech doc to outline our ideas and the functionality of different features. Currently, we are focusing on the theme of exercise for the UI, but it could change later on. After selecting an exercise in the selection screen, the user will go to AR mode, in which they can place an AR model performing the correct exercise motion and examine it in AR. (Implemented by Patrick)

Selection UI

We implemented the selection screen to select different animations to be shown in AR. We made sure to use anchors and layout groups so that the UI would scale well with different screen sizes. (Implemented by Patrick)

Info Screen

We added an information button on the selection screen of the app that allows users to see instructions on how the app works and how to use it properly. Clicking on the button brings up a panel that has text instructions. The panel can be closed with a close button or by clicking on the information button again. (Implemented by Kavya)
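
Mechanically, this is just toggling a panel GameObject on and off; a minimal illustrative version (names are hypothetical) is:

```csharp
using UnityEngine;

// Minimal sketch of the info-panel behaviour (names are hypothetical).
public class InfoPanelToggle : MonoBehaviour
{
    public GameObject infoPanel;   // panel containing the text instructions

    // Wired to the info button's OnClick; pressing it again (or the close button,
    // while the panel is open) hides the panel.
    public void Toggle() => infoPanel.SetActive(!infoPanel.activeSelf);
}
```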

Scene Transition

We added simple scene transitions: the “Bicep Curl” and “Lunge” buttons now load their respective AR scenes. (Implemented by Chengyu)
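
Under the hood this is a single SceneManager call; an illustrative helper (the scene names are placeholders and must match scenes added to Build Settings) looks like:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Illustrative helper for the button-driven scene transitions.
public class SceneLoader : MonoBehaviour
{
    // Hooked up to a Button's OnClick with the target scene name, e.g. "BicepCurlAR".
    public void Load(string sceneName) => SceneManager.LoadScene(sceneName);
}
```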

Front-Facing Camera

We experimented with camera switching in the AR scene. Currently, tapping a button switches between the cameras; however, the user-facing camera cannot work with AR functions, and objects need to be remapped after switching cameras. (Implemented by Chengyu)

Video Recording and Playback

We experimented with adding a video recording and playback feature. This feature was left out of this milestone because we were unable to find an efficient way to capture a recording without purchasing an API. We experimented with using screen captures to save individual frames and play them back, but were unable to get it to work on our devices. (Implemented by Hussain)

Model Movement

We added the ability to move the 3D model anywhere in the room, with a cursor to help the user see where it will be placed. Sadly, this feature was left out of the current milestone deliverable due to merging complications. It will be introduced in the following week! (Implemented by Samuel)
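
A bare-bones sketch of this kind of cursor placement using ARFoundation plane raycasts is below, assuming an ARRaycastManager in the scene (names and details are illustrative and differ from the actual implementation):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Bare-bones sketch of cursor-based placement via ARFoundation plane raycasts.
public class ModelPlacer : MonoBehaviour
{
    public ARRaycastManager raycastManager;
    public Transform cursor;     // visual indicator of where the model will land
    public Transform model;      // the exercise model

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        // Keep the cursor on the detected plane under the centre of the screen.
        var screenCentre = new Vector2(Screen.width / 2f, Screen.height / 2f);
        if (raycastManager.Raycast(screenCentre, hits, TrackableType.PlaneWithinPolygon))
            cursor.position = hits[0].pose.position;

        // A tap drops the model at the cursor position.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
            model.position = cursor.position;
    }
}
```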
