Capture.U Practice - Learn More
Learn More helps you to process and analyze the exercises that you performed in the Practice mode within Learn in the Capture.U app. This space will be updated regularly, so please check back for more content including more exercise examples!
All scripts posted and discussed within Learn More are for reference only and have not been independently reviewed or validated.
Accelerometer
The following topics walk you through how to process and analyze the data collected from the accelerometers. They are followed by a short discussion.
Review
In this exercise, the IMU was affixed to the upper back (or chest) of the participant. For the first two seconds of data collection, the participant was instructed to stand still. The participant was then asked to perform a squat at a smooth (or relaxed) pace, pause for two seconds, and then perform another squat at a faster pace.
Processing data
To process your data after collecting in Real-Time Insight, complete the following steps. Make sure you have downloaded Capture.U Practice Scripts from https://www.vicon.com/software/models-and-scripts/.
- In the Capture.U app, tap the Export button at the top right of Real-Time Insight.
- From the available export options, choose one that allows you to (eventually) save the trial locally onto your computer. For example, e-mail it to yourself and then download the file from your e-mail on your PC.
- Save the file to a location that you can access easily on your PC.
- On your PC, extract (ie, unzip) the contents of the zipped folder.
- Open the Google Sheet (Practice_Accelerometer) and follow the instructions to process your data.
To find a link to the Google Sheet on your PC, see the PDF inside the downloaded Capture.U Practice Scripts folder. If you need assistance with accessing the Google Sheet, see Accessing Google Sheets.
Discussion
Why use resultant acceleration?
For the squat, the instructions set up Real-Time Insight to visualize the acceleration in a single axis (ie, the x-axis). This mimicked the instructions within the Visualize section of the Accelerometer module in the Education mode; that is, we suggested the squat be analyzed as a linear movement in line with one of the accelerometer's axes. When the participant is in their initial position (ie, standing), this makes sense, as the direction of the x-axis is well-aligned with the vertical axis, which is how the center of mass would typically move during the squat (Figure 1). However, as the participant squats, the sensor x-axis starts to diverge from this vertical axis while the z-axis becomes more aligned with the vertical axis (Figure 2). As such, examining the squat with just one axis may no longer be practical. Is there a better approach, using the data captured from Real-Time Insight?
Let's first consider what data is exported in Real-Time Insight. In addition to the individual axes, Real-Time Insight enables you to visualize and export the resultant acceleration, which is the overall magnitude of acceleration. As the participant was unlikely to have too much forward/backward and side-to-side movement during the squat, using the resultant acceleration is an approach that is likely to yield better insights than looking at just one individual sensor axis. In an ideal scenario, knowing the orientation of the sensor at each moment in the squat would yield the most precise results. This would enable us to calculate the components of each sensor axis that was in the direction of the vertical axis. This can be an important consideration if you're conducting a similar study on your own.
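As a minimal sketch of the calculation (in Python, with made-up axis values rather than a real Capture.U export), the resultant is simply the Euclidean magnitude of the three axis signals at each sample:

```python
import numpy as np

# Hypothetical per-axis accelerometer samples in g; these values are
# illustrative only, not taken from a real Capture.U export.
ax = np.array([0.02, -0.15, -0.60, -0.30, 0.25])
ay = np.array([0.01, 0.03, 0.05, 0.02, -0.01])
az = np.array([1.00, 0.95, 0.80, 0.90, 1.05])

# Resultant acceleration: the overall magnitude at each sample
resultant = np.sqrt(ax**2 + ay**2 + az**2)
```

Because the magnitude is independent of sensor orientation, it stays meaningful even as the sensor's x-axis rotates away from vertical during the squat.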
Analyzing the graph
Within the Accelerometer Practice module, the Analyze page helped to interpret the shape of the curve calculated during the A-B-A movement. We can use a similar strategy here when analyzing the squat. The following analysis assumes that the sensor was oriented 'logo-up'. Here is a sample graph of the movement with some key events marked and explained below:
Key stages in a squat
A: Rest – Participant is standing still
B: Initial (downward) acceleration – Participant begins to descend in their squat
C: Max (downward) acceleration – Participant is still descending in their squat
D: Peak (downward) velocity – When the acceleration crosses zero, it signifies a peak in velocity; when the acceleration transitions from negative to positive, this indicates a peak negative velocity. At this instant, the participant is descending at their fastest rate; beyond this point, the descent slows (ie, the velocity becomes less negative)
E: Maximum squat depth – Participant transitions from going down in their squat to standing back up.
A trend to note is that a peak in acceleration (C) always precedes a peak in velocity (D), which occurs when the acceleration crosses from negative to positive; this, in turn, precedes the peak in position (E), which occurs when the velocity crosses from negative to positive (not shown).
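This ordering can also be located programmatically. The sketch below (Python, using an illustrative acceleration trace rather than real squat data) finds the negative-to-positive zero crossing that marks event D:

```python
import numpy as np

# Illustrative acceleration trace (m/s^2): negative while descending,
# crossing to positive as the descent slows. Not real squat data.
acc = np.array([0.0, -0.5, -1.2, -0.8, -0.2, 0.3, 0.9, 0.4])

# A negative-to-positive zero crossing in acceleration marks a peak
# negative velocity (event D above).
sign = np.sign(acc)
crossings = np.where((sign[:-1] < 0) & (sign[1:] > 0))[0] + 1  # index of first positive sample
```

The same idea, applied to the velocity curve once you have integrated it, would locate event E.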
Things to think about...
When analyzing the data, you may want to think about these additional points. We will include discussions of each of these questions (and others) in a future update.
- Using the knowledge about the first half of the squat (down), can you identify the events in the second half (up)?
- In the Accelerometer module within Education, we discussed the trade-off between using the low-g and high-g accelerometers. How would the analysis be different if the high-g accelerometer data was analyzed instead?
- Can we integrate the data to get velocity? What considerations may be required? (Hint: Look at the Practice_Gyroscope sheet for an idea on how you can integrate the data within the worksheet itself.)
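As a starting point for the integration question, the sketch below uses cumulative trapezoidal integration in Python (the sampling rate and acceleration values are assumed for illustration). Note that any offset or noise in the acceleration accumulates in the velocity, which is one of the key considerations:

```python
import numpy as np

fs = 100.0     # assumed sampling rate (Hz)
dt = 1.0 / fs

# Illustrative acceleration trace (m/s^2), not real data
acc = np.array([0.0, -1.0, -2.0, -1.0, 0.0, 1.0, 2.0, 1.0, 0.0])

# Cumulative trapezoidal integration, assuming the trial starts at rest:
# v[i] = v[i-1] + (a[i-1] + a[i]) / 2 * dt
vel = np.concatenate(([0.0], np.cumsum((acc[:-1] + acc[1:]) / 2.0 * dt)))
```

The same running-sum formula can be implemented directly in a worksheet column, which is the approach hinted at in the Practice_Gyroscope sheet.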
Back to Topics
Gyroscope
The following topics walk you through how to analyze the data collected from the gyroscope. They are followed by a short discussion.
Review
In this exercise, the IMU was affixed to the (left) wrist of the participant. For the first two seconds of data collection, the participant was instructed to stand still. The participant was then asked to perform a bicep hammer curl at a smooth (or relaxed) pace, pause for two seconds, and then perform another bicep hammer curl at a faster pace.
Processing data
To process your data after collecting in Real-Time Insight, complete the following steps. Make sure you have downloaded Capture.U Practice Scripts from https://www.vicon.com/software/models-and-scripts/.
- In the Capture.U app, tap the Export button at the top right of Real-Time Insight.
- From the available export options, choose one that allows you to (eventually) save the trial locally onto your computer. For example, e-mail it to yourself and then download the file from your e-mail on your PC.
- Save the file to a location that you can access easily on your PC.
- On your PC, extract (ie, unzip) the contents of the zipped folder.
- Open the Google Sheet (Practice_Gyroscope) and follow the instructions displayed to process your data.
To find a link to the Google Sheet on your PC, see the PDF inside the downloaded Capture.U Practice Scripts folder. If you need assistance with accessing the Google Sheet, see Accessing Google Sheets.
Discussion
Why the z-axis?
Before we begin interpreting, is it clear why the z-axis was chosen for analysis? To help explain, let's first identify three stages of the movement: 0 degrees, 45 degrees, and 90 degrees.
Next, let's look at the sensor in isolation (ie, independently of the movement). What is the orientation of just the sensor? It should look something like this:
Of all the exercises within the Gyroscope Education module, which rotation did this match most? Hopefully, you also chose the rotation about the z-axis.
What makes this difficult to interpret is the fact that the sensor's point of rotation is not the middle of the sensor, but rather the elbow. As such, there is not only rotation of the sensor, but displacement too. However, when we are looking at the gyroscope data, we really only need to look at how its orientation changes to help us determine its direction of rotation and angular velocity.
Analyzing the graph
Within the Gyroscope Education module, the Analyze page helped to identify the difference between a positive and negative angular rotation and interpret the shape of the curve as the sensor was rotated 90 degrees in both directions. We can use a similar strategy here when analyzing the bicep hammer curl. The following analysis assumes that the sensor was placed as advised in the module (ie, top of logo on the thumb side). Here is a sample graph of the movement with some key stages identified:
Key stages in a bicep hammer curl
A: Rest – Participant is standing still with their (left) arm by their side.
B: Initial (negative) angular velocity – Participant begins to flex their elbow; sensor is rotating clockwise (ie, negative).
C: Max (negative) angular velocity – Participant is still flexing their elbow; the angular velocity has reached its largest (most negative) value and its magnitude now begins to decrease.
D: Angular velocity approaches zero – When the angular velocity transitions from negative to positive, the elbow has reached maximum flexion.
Integration
Within the Practice_Gyroscope example, the angular velocity data was integrated to provide an 'angle'. Angle is in quotation marks because it can be unclear what this angle represents: we can see that the angle is changing, but what does it actually tell us about the orientation of the arm at that time? What we need is some information about the sensor's original orientation. This is the exact reason why we instructed the participant to stand with their arm at their side: so that the initial condition would be known. Even so, we don't know if the elbow was straight or bent to 15 degrees, but at least we have some initial frame of reference.
As mentioned, one thing that is difficult about integrating the angular velocity is the noise and drift within a gyroscope. We can see its effects particularly in the third and fourth flexion-extension cycles. It's unlikely that the participant hyper-extended the elbow more with each cycle. Instead, this shows how those sources of error accumulate over time and affect the angle estimation. This is a very important consideration when trying to use gyroscopes alone to measure orientation directly.
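To make the drift concrete, here is a small simulation (Python; the bias value and sampling rate are invented for illustration). Even with the sensor perfectly still, a constant bias integrates into several degrees of apparent rotation within seconds:

```python
import numpy as np

fs = 100.0                     # assumed sampling rate (Hz)
dt = 1.0 / fs
t = np.arange(0.0, 10.0, dt)   # 10 s of data

true_omega = np.zeros_like(t)  # the sensor is actually stationary (deg/s)
bias = 0.5                     # small constant gyro bias (deg/s), assumed
measured = true_omega + bias

# Integrating the biased signal: the estimated angle drifts steadily
# even though nothing moved.
angle = np.cumsum(measured) * dt  # degrees
```

After 10 seconds the integrated 'angle' has drifted to about 5 degrees, which mirrors the growing hyper-extension artifact seen in the later flexion-extension cycles.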
Things to think about...
When analyzing the data, you may want to think about these additional points. We will include discussions of each of these questions (and others) in a future update.
- Using the events from the first half of the movement, can you identify the events in the second half?
- How could we reduce the effects of the noise and drift in the gyroscope using simple calculations in the sheet?
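As one starting point for the drift question, a common simple correction is to estimate the gyroscope bias from the initial two-second rest period and subtract it before integrating. A Python sketch with invented numbers:

```python
import numpy as np

fs = 100.0
dt = 1.0 / fs
t = np.arange(0.0, 6.0, dt)

# Illustrative gyro trace (deg/s): a constant 0.4 deg/s bias everywhere,
# plus a burst of real rotation between seconds 3 and 4. Not real data.
omega = np.full_like(t, 0.4)
omega[(t >= 3.0) & (t < 4.0)] += 90.0

# Estimate the bias from the first two seconds of rest, then subtract it
bias = omega[t < 2.0].mean()
corrected = omega - bias
```

The same subtraction can be done in the worksheet itself by averaging the rest-period rows and subtracting that value from the whole angular velocity column.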
Back to Topics
Global angles
The following topics walk you through one way that you can analyze the global angle data. They are followed by a short discussion.
Review
In this exercise, the IMU was strapped to the (left) wrist of the participant. For the first two seconds of data collection, the participant was instructed to start with their arm by their side and thumb pointing forwards. The participant was then asked to perform three separate movements, with two seconds between each, beginning from this initial position: an elbow flexion-extension (ie, hammer curl), shoulder abduction, and forearm pronation. As data was collected in Real-Time Insight, you should have been able to view the angle outputs as the movements were performed.
What is a global angle?
If you recall, a global angle is the orientation of the sensor's Local Coordinate System (LCS) relative to a Global Coordinate System (GCS). Global angles are a commonly used measurement (eg, foot contact angle or pelvic tilt), though they may not often be referred to as such. In either case, some method must be used to establish the orientation of the GCS. In a traditional motion capture environment, a calibration wand is used to establish the GCS, while in clinical settings, the physician is likely to use the ground or a table to establish their reference system. With an IMU, this is inherently more difficult because the IMU itself establishes the GCS, which cannot simply be seen by the naked eye.
Local angles
What makes IMUs somewhat difficult to interpret is that their individual sensors (accelerometer, gyroscope, and magnetometer) all output their data relative to their own LCS, while global angles are expressed relative to the GCS. So is there a way that we can convert the global angles back into the same coordinate system as the individual sensors?
The following topics walk you through how to convert your global angles into local angles using sample scripts.
Processing data
To process your data after collecting in Real-Time Insight, complete these steps. Make sure you have downloaded Capture.U Practice Scripts from https://www.vicon.com/software/models-and-scripts/. A sample trial (Practice_GlobalAngles.csv) has also been included in the folder.
- In the Capture.U app, tap the Export button at the top right of Real-Time Insight.
- From the available export options, choose one that allows you to (eventually) save the trial locally onto your computer. For example, e-mail it to yourself and then download the file from your e-mail on your PC.
- Save the file to a location that you can access easily on your PC.
- On your PC, extract (ie, unzip) the contents of the zipped folder.
- To process in MATLAB, see Processing in MATLAB. To process in Python, see Processing in Python.
While MATLAB requires a license, Python is completely open source and freely available for use. Python requires the environment to be set up correctly to run the examples presented here.
Processing in MATLAB
The following describes the steps necessary to process your data in MATLAB using the sample scripts. You need the folder Practice_Matlab and the two files Practice_GlobalAngles.m and ViconUtils.m.
For reference, a video for processing is available here.
- Make sure that MATLAB can access the folder where both files are presently located. If you need to move the files, make sure you take the contents of the entire folder (ie, both files) and place them in the same folder in another directory. If necessary, set the path in MATLAB for the new folder.
- On the Editor tab, click Run in the toolbar.
- Navigate to the trial you want to analyze and select it to open. If you cannot collect a trial, use the sample trial (Practice_GlobalAngles.csv) that is included in the folder.
- When the script finishes running, two graphical outputs (Figure 1) are produced; the first is the helical angles as shown within Real-Time Insight (converted to degrees) and the second is the local angle, also expressed as a helical angle (in degrees).
The MATLAB sample script was written in MATLAB 2019a. While it should work seamlessly in other versions of MATLAB, you may want to ensure your version is not older than this.
Processing in Python
Complete the following steps to process your data in Python using the sample scripts. You need the folder Practice_Python and the two files Practice_GlobalAngles.py and ViconUtils.py.
For reference, a video for processing is available here.
- To open the Command Prompt, press the Windows key on your keyboard and in the Search field, enter cmd.
- In the Command Prompt window, at the prompt, enter python. This step verifies that you have Python 3 installed. If you do not have Python 3 installed, see Installing Python 3. If you have Python 3 installed (the Command Prompt window displays the version number of your Python 3 installation), type exit() to leave the Python interpreter and return to the prompt (C:\Windows\System32 in this example).
- Make sure you have the following modules installed: pandas, matplotlib, numpy, openpyxl. For instructions on how to install Python 3 modules, see Installing Python 3 modules.
- In the Command Prompt window, make sure your prompt is at the original directory (eg, C:\Windows\System32). If it is not, type exit() and press Enter.
- At the prompt, type python <your directory>\Practice_GlobalAngles.py and press Enter. Please note, <your directory> is the folder in which you have saved the Practice_GlobalAngles.py and ViconUtils.py files. If your script fails to run, ensure that you have a standard installation of Python 3 and not a distribution of Python 3 (eg, Anaconda).
- Navigate to the trial that you want to analyze and select it to open. Make sure that the file you import is a .csv file. If you cannot collect a trial, use the sample trial (Practice_GlobalAngles.csv) that has been included in the folder.
- When the script finishes running, two graphical outputs (Figure 1) are produced; the first is the helical angles as shown within Real-Time Insight and the second is the local angle, also expressed as a helical angle.
This Python sample script was written in Python 3.9. While it should work seamlessly in other versions of Python 3, you may want to ensure your version is not older than this.
Interpreting and understanding processed data
Now that we have processed the data, is it clear how the data was re-aligned and what its purpose is? Let's first look at the graph for the global angles (ie, Original Orientation). If you recall, the participant was asked to remain still with the arm to the side for the first two seconds. You should be able to see this flat line at the beginning. To examine the first movement (elbow flexion-extension), we recommended looking at the z-axis. Looking at the data post-collection, we can see that the angles in the other axes (x and y) changed with the movement as well. What is going on? Remember, this is a global angle, so it is the orientation of the IMU relative to the GCS. Unless the LCS of the sensor was aligned perfectly with the GCS, we would expect the cross-talk observed here. The exact amount of cross-talk will vary as it ultimately depends on how the IMU was oriented relative to the GCS during the trial.
We should see this cross-talk in the other two movements as well, which makes it difficult to tell whether the angle in an axis is changing because of the movement or due to the cross-talk. So what we really want to do is to isolate the outputs to the specific axis in which we expected the movement to occur. In other words, we want to convert the global angles to an angle relative to its LCS – a local angle!
Let's take a closer look at how this was done in the sample script.
What is cross-talk?
In the context of angles and human movement, cross-talk refers to an unwanted transfer of angles from the expected primary axis of movement to a secondary axis. This can occur when the measurement axis is not aligned with the movement axis.
Going back to the initial step in the set of instructions, the participant was told to stand still with their arm at their side for the first 2 seconds. The purpose of this step was to calibrate a new, known reference system using the global angle of this static pose and use it to re-calibrate all future global angles during the set of movements. In other words, after this calibration step, all angles are relative to this static position and not the GCS. However, this is not a simple 'zeroing' process; that is, we cannot simply subtract the global angle during the static pose from the entire data set. Instead, we need to convert the global angles into a rotation matrix (also known as a direction cosine matrix) or a quaternion. In this script, we decided to use rotation matrices. As such, the global angles of each frame (currently expressed in helical angles) are converted to a rotation matrix and compared to the rotation matrix of the static pose to obtain a relative rotation matrix. This relative rotation matrix represents the change in orientation of the IMU relative to that initial static pose. We can then convert this back into a helical angle so that we can interpret it visually (ie, Re-aligned Orientation).
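The conversion described above can be sketched in a few lines of Python using Rodrigues' formula. This is a simplified illustration, not the actual sample script; the helical (axis-angle) vectors below are invented values in radians:

```python
import numpy as np

def helical_to_matrix(rotvec):
    """Rodrigues' formula: rotation vector (radians) -> 3x3 rotation matrix."""
    theta = np.linalg.norm(rotvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rotvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def matrix_to_helical(R):
    """Inverse mapping: 3x3 rotation matrix -> rotation vector (radians)."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return axis * theta

# Global helical angles for the static pose and for one later frame (invented)
R_static = helical_to_matrix(np.array([0.1, 0.0, 0.2]))
R_frame = helical_to_matrix(np.array([0.1, 0.0, 1.2]))

# Relative rotation: orientation of this frame expressed against the static pose
R_rel = R_static.T @ R_frame

# Back to a helical angle for visual interpretation (the re-aligned orientation)
realigned = matrix_to_helical(R_rel)
```

Applying R_static.T @ R_frame to every frame in the trial produces the re-aligned curves; note that this is a matrix product, not a simple subtraction of angles.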
Looking at the Re-aligned Orientation graph, the angles now seem to be more isolated to the axis that we might have initially expected. The predominant angle change during the elbow flexion-extension is in the z-axis, the shoulder ab/adduction is in the x-axis, and the forearm pronation is about the y-axis. However, there still seems to be some cross-talk. What is going on here? First, we likely didn't take the re-aligning far enough – we would actually want to align it to the coordinate system of the joint. This is difficult without additional information, and in this particular set of exercises, two joints were moving. This is a first glimpse into some of the challenges of calculating a joint angle from IMUs. Second, human movements are not always 'pure' in that they don't always occur in one axis, even though they may appear to do so to the naked eye. As you were performing the elbow flexion-extension, regardless of how hard you tried to isolate the movement in the local z-axis of the IMU, there would always be some angular change in the x- and y-axes.
Things to think about...
When analyzing the data, you may want to think about these additional points. We will include discussions of each of these questions (and others) in a future update.
- In the gyroscope practice example, we noted the effects of noise and drift on the angle. Can you compare the integrated gyroscope data with the re-aligned global angles? How do they compare?
- Using what you have learned from converting global angles into local angles, can you devise a strategy that would enable you to find the relative angle between IMUs? That is, can you calculate what the orientation of one IMU is relative to another? As a hint, what is the first thing calculated when the data has been extracted from the sensor?
Back to Topics
Help with scripts
The following topics provide additional information that can help you set up your trials for processing.
Accessing Google Sheets
The following steps help you to access and edit the Google Sheet for analyzing your collected trials.
- Go to the link for the appropriate Google Sheet. You need a Google account to access it.
- In the toolbar, select File > Make a Copy.
- In the dialog box, save the sheet to My Drive and click OK. The copied version opens automatically.
Back to Accelerometer
Back to Gyroscope
Installing Python 3
These instructions help you install Python 3 in Windows 10. The example uses Python 3.9, but other versions follow the same methodology.
- Go to https://www.python.org/downloads/ and download Python 3.
- After the download completes, double-click the executable.
- On the install screen, make sure you add Python 3 to the PATH and use the default install option. When setup is successful, click Close.
- To open the Command Prompt, click the Windows menu button or press the Windows key on your keyboard.
- In the Search field, enter cmd.
- In the Command Prompt window, enter python. This enables you to verify that you have Python 3 installed.
- If, in the previous step, you notice that your Python version is Python 2, you must add Python 3 to the PATH manually. For step-by-step instructions, see this Vicon video.
Back to Processing in Python
Installing Python 3 modules
To install modules in Python 3, complete the following steps. Remember, we will need to install numpy, matplotlib, pandas, and openpyxl.
- Click the Windows button at the bottom left of your screen or press the Windows key on your keyboard. If you have a Command Prompt already open, close it, then complete the remaining steps.
- In the Search field, enter cmd.
- If you want to check the modules that are already installed, in the Command Prompt window, at the prompt, enter pip3 list. Make sure the modules numpy, matplotlib, pandas, and openpyxl are listed.
- To install numpy, enter pip3 install numpy. After installation is complete, a confirmation message is displayed; the specific version of numpy that is installed may differ.
- In the same Command Prompt window, repeat this step for matplotlib, pandas, and openpyxl (ie, pip3 install matplotlib, etc).
Back to Processing in Python