Fitness trackers come and go. One day you are enjoying your Jawbone Up or Facebook's Moves app, and the next you are looking for a new way to collect your fitness data. While turnover is expected in the still-maturing health wearables market, it can be annoying to switch devices after finding one you like. Since I spend an inordinate amount of my life working with trackers from multiple manufacturers, and have my own preconceived notions about which I prefer, I thought it might be useful to run these devices through a comparison test to see how well they perform against each other. For the sake of brevity (and to actually get this article published), I'm splitting the comparison into three parts: setup, testing, and findings.
For the purposes of this comparison, we tested the most popular device manufacturers that provide an interface for syncing user data with third-party developers like ChallengeRunner. In some cases we tested multiple devices from the same manufacturer, such as Fitbit, Garmin, and Withings, but this did not seem to make much difference in the results. While results vary from manufacturer to manufacturer, each device maker appears to use the same underlying technology across its various fitness tracker offerings. For example, a new Fitbit Charge 3 delivered results similar to those of an older Fitbit Flex.
You’ll notice that the trackers tested (and phones used) are, for the most part, not the highest-priced models on the market. However, these are the devices we used while testing and building our systems, and they more closely reflect how the average user tracks their data.
We also used two smartphones in the comparison: an Apple iPhone SE and a Motorola Moto G5. Since smartphones can be used as trackers themselves, we collected data for Apple Health using the iPhone, and for Samsung Health and Google Fit using the Motorola. Once again, these are not top-of-the-line phones, but they do represent typical user devices and, more importantly, actually fit in your pocket while you run!
In order to compare tracker results, I needed to perform the same activity for a prescribed duration multiple times. Since I would be testing the four features that most trackers have in common (steps, active minutes, distance, and calories burned), I decided on the most obvious method that would test all four at once: running or walking on a treadmill. With the activity selected, I then needed to determine how long I should run and at what intensity. Several trackers will not start recording active minutes until the user has logged at least 10 minutes of activity, and some also require a moderate or greater level of intensity before active minutes count. In the end, I chose running at 6.0 MPH for 20 minutes because it would trigger active minutes and result in a distance of exactly 2 miles. As a second scenario, I chose walking for 5 minutes at 2 MPH to test how the devices behave under the most common everyday conditions. A ProForm Thinline treadmill was used for all testing.
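The ground-truth distances implied by this protocol fall out of simple arithmetic: distance equals speed multiplied by time. A minimal sketch of the baseline numbers (the `expected_distance` helper is hypothetical, just for illustration; it is not part of any tracker API):

```python
def expected_distance(speed_mph: float, minutes: float) -> float:
    """Distance in miles covered at a constant treadmill speed."""
    return speed_mph * (minutes / 60.0)

# Run scenario: 6.0 MPH for 20 minutes -> exactly 2.0 miles
run_miles = expected_distance(6.0, 20)

# Walk scenario: 2.0 MPH for 5 minutes -> roughly 0.17 miles
walk_miles = expected_distance(2.0, 5)

print(run_miles, round(walk_miles, 2))
```

These computed values are what each tracker's reported distance can later be compared against.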
With that, I began testing trackers every other day over the course of three months in order to collect a decent sample size. In the next article, I will review the data collected during testing and how the devices compared.