Physiological and kinanthropometric measurements are an essential part of sport and exercise science, as they can be used to monitor, evaluate, and develop training programs. Testing conditions can be tightly controlled in laboratory settings, with a number of tests that can be reproduced with known degrees of accuracy, supported by documented reliability testing. A possible limitation of these tests is the absence of ecological validity. Practitioners often rely on field tests to measure and evaluate performance, either by choice, to enhance familiarity and ecological validity for the athlete, or because of time, space, or facility constraints. Maximizing the portability of equipment needed in the field would help the practitioner, and advances in technology mean that increasingly compact devices are capable of much more. A recent article from Cardinale and Varley (17) reviewed wearable technologies for monitoring training, such as global positioning system (GPS) units, heart rate monitors, and accelerometers. However, some technologies do not require wearables, only the mobile device itself, with data collected through downloadable applications (apps). Given the most recent advances, it is not unfathomable that coaches could collect most of their data using only a mobile device. However, the validity and reliability of these data are often unknown. The purpose of this review is to critically appraise the literature in this area and identify variables that can be measured using commercially available apps on a mobile device.
Capacity for Apps to Collect Physiological and Kinanthropometric Data
In terms of collecting physiological data, mobile devices can be used in 2 primary ways: (a) by acting as the data logger and interface for a peripheral attachment, and (b) by using the built-in sensors (e.g., microphone and camera) and internal processors of the device itself to collect and interpret signals. It is beyond the scope of this review to comment on the engineering of these methods in depth; instead, the focus of this section is to review the validity and practical use of the latter method, i.e., collection and interpretation using only the mobile device.
Heart Rate Measurement
Heart rate is a fundamental physiological measurement in the sport, health, and exercise sciences. The criterion, or “gold-standard,” measure remains the electrocardiogram (ECG), which can be impractical in the field. A number of telemetry devices have been validated against the ECG for use in more practical situations (81,108); however, these devices also carry cost implications when multiple units are needed, and the placement of a chest strap may be deemed intrusive by some clients. Furthermore, the requirement for extra hardware may limit widespread use (98). This may particularly be the case in more health-related environments such as fitness centers and rehabilitation units. Practitioners in these areas may only have manual palpation methods available to them, which have been demonstrated to be inaccurate (41,59). It is in such cases that the technology within ubiquitously available mobile devices may be of benefit. The simplest apps to facilitate heart rate measurement act in a similar way to a metronome, whereby the screen is tapped every time a pulse is palpated. This method is presumably designed to reduce error by separating the tasks of palpating and counting. However, Peart et al. (83) found that 1 such app on an iOS iPad mini 2 (“Tap the Pulse” by Orangesoft LLC) had greater discrepancy from telemetry measurements than manual methods did (r2 = 0.636, coefficient of variation [CV] = 7% and r2 = 0.851, CV = 3%, respectively).
More advanced measurements use technology known as photoplethysmography (PPG). Photoplethysmography is the technology currently used in fingertip pulse oximeters and works on the basis that when capillaries are filled with blood, light is obstructed, and more light can pass through as blood recedes. Pelegris et al. (85) explain that it is this change in average brightness that acts as the signal from which the device interprets and extracts heart rate readings. The same authors sought to validate their technology, which calculated heart rate from a stream of picture frames while the finger was held against the camera lens and flash of an HTC Tattoo (Android 1.6) mobile phone, against a pulse oximeter. Unfortunately, the main focus of that article seemed to be the description of the technology, and there is little information about how it was actually validated. The raw data are provided in the article, and the correlation between methods can be calculated as moderate (r = 0.6), with an average difference of 4 b·min–1 between methods. Popescu et al. (90) and Losa-Iglesias et al. (62) assessed the capabilities of 2 commercially available apps that worked on the same premise of applying the fingertip to the device's camera and flash. Popescu et al. (90) compared “Cardiowatch” by Radu Ionescu on an iPhone to an ECG machine, and Losa-Iglesias et al. (62) compared “Heart Rate Plus” by AVDApps on a Samsung Galaxy Note phone to a pulse oximeter, with both studies reporting a typical difference of ±3–4 b·min–1 between measurement methods.
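The principle Pelegris et al. (85) describe can be illustrated with a short sketch: treat the mean frame brightness as a pulsatile signal, detect its peaks, and convert the mean peak-to-peak interval into beats per minute. The function name, threshold, and synthetic signal below are illustrative assumptions rather than any published app's algorithm.

```python
import numpy as np

def estimate_hr_from_brightness(signal, fs):
    """Estimate heart rate (b/min) from a mean-brightness signal.

    Hypothetical minimal pipeline: remove the mean, find peaks above a
    threshold, and convert the mean peak-to-peak interval to b/min.
    """
    x = np.asarray(signal) - np.mean(signal)
    # A sample counts as a peak if it exceeds both neighbours and a threshold
    thresh = 0.3 * np.max(np.abs(x))
    peaks = [i for i in range(1, len(x) - 1)
             if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > thresh]
    intervals = np.diff(peaks) / fs  # seconds between successive beats
    return 60.0 / np.mean(intervals)

# Synthetic 10-s recording at 30 frames per second with a 72 b/min pulse
fs, hr_true = 30, 72
t = np.arange(0, 10, 1 / fs)
brightness = np.sin(2 * np.pi * (hr_true / 60) * t)
print(round(estimate_hr_from_brightness(brightness, fs)))  # -> 72
```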
Although contact PPG technology seems able to measure resting heart rate relatively accurately, data from Wackel et al. (109) suggest that error may increase as heart rate increases. These authors reported resting values measured with “Instant Heart Rate” by Azumio and “Heart Beat Rate” by Bio2imaging on an iPhone 5 to be within ±4 b·min–1 of an ECG measurement (r = 0.99) in pediatric patients, similar to the aforementioned work (62,85,90). However, when the apps were used during a period of tachycardia (156–272 b·min–1), the average difference compared with an ECG increased to 18 b·min–1 (up to 47 b·min–1), and the correlation reduced to r = 0.56. This has obvious implications for sport and exercise, as heart rate measurements are likely to take place after exercise. It should be considered, though, that such technology is perhaps most likely to be used after submaximal predictor tests, where the heart rate is unlikely to be as high as that observed by Wackel et al. (109). Although the tachycardic range witnessed by Wackel et al. (109) started at 156 b·min–1, the majority of values were greater than 200 b·min–1. Ho et al. (51) measured heart rates in 126 children admitted to hospital using 4 different apps on an iPhone 4S at the earlobe and fingertip alongside an ECG machine. The heart rates from the apps were more closely correlated with the ECG at the earlobe than at the finger, with correlations ranging from r2 = 0.215 to 0.857. App A considerably outperformed the other 3 apps, with anomalous results appearing to start at approximately 160 b·min–1. Unfortunately, the authors did not provide the names of the apps tested. The only known study to test contact PPG technology on mobile devices after exercise was conducted by Mitchell et al. (70). Participants had their heart rate measured at rest and after a 1-minute step test, thereby replicating the conditions under which the technology is perhaps most likely to be used.
Measurements were taken using the same “Instant Heart Rate” by Azumio app used by Wackel et al. (109) on an iOS and Android phone, and a Polar telemetry chest strap. Intraclass correlation coefficients (ICCs) with the telemetry method (with 95% confidence intervals) were 0.97 (0.95–0.98) and 0.95 (0.92–0.96) at rest, and 0.90 (0.86–0.93) and 0.94 (0.91–0.96) after exercise for the iOS and Android phones, respectively. The authors concluded that both platforms could be used with confidence; however, when viewing the Bland-Altman plots, the error again appears to increase as heart rate increases.
Kong et al. (56) have suggested that PPG may be made more accurate by using contactless methods, as the contact force on the sensor may affect the waveform of the signals. Contactless PPG using a webcam on a laptop has been described by Poh et al. (89). This technology works on a similar principle to the contact PPG but instead observes video recordings of the face. A number of freely available apps make use of this contactless PPG method and instruct users to hold the device's camera in front of their face until a reading has been taken. Peart et al. (83) investigated 2 contactless PPG apps at rest on an iPad mini 2, “What's my heart rate” by ViTrox Technologies and “Cardiio” by Cardiio Inc, reporting average differences compared with a Polar telemetry monitor of 1 and 2 b·min–1, and correlations of r2 = 0.918 and r2 = 0.646, respectively. In a subsequent study, “What's my heart rate” was used to collect heart rates after a 1-minute step test (84). Average heart rate after the test was measured as 129 b·min–1 using a Polar telemetry strap, but only 84 b·min–1 using the app. Furthermore, when the heart rates were used to estimate aerobic capacity, average values were 17% higher when using the app.
Heart rate alone may be of only limited interest to some practitioners, many of whom may instead be more interested in the regularity of the heartbeat. An abstract with limited information from Sardana et al. (98) reports high sensitivity and reliability for an iPhone app to identify atrial fibrillation (AF). McManus and colleagues describe apps that can identify AF as well as premature atrial contractions and premature ventricular contractions (65,66). Although such measurements may not be of widespread interest to sport and exercise scientists, the ability to determine regularity will be, particularly when considering measurements such as heart rate variability (HRV) for monitoring responses and adaptation to training (87). At present, there are limited means to measure HRV using the mobile device alone, although some studies have described valid measurement with chest strap or fingerpad peripherals by ithlete (HRV Fit Ltd, Ashurst, UK) that attach to a mobile phone (34,49), sensitive enough to track changes over a period of 3 weeks (35). However, some self-contained apps are currently being developed. Scully et al. (100) describe an app that can take 720 × 480 pixel resolution video recordings that can then be analyzed for HRV using Matlab, and Guede-Fernández et al. (45) have developed a noncommercially available app for HRV. Interestingly, the SD of the beat-to-beat error differed between devices (Motorola Moto X, Motorola Solutions, Chicago, IL, USA, and Samsung S5, Samsung Group, Seoul, South Korea), identifying potential transferability issues between research and practice. The only commercially available HRV app known to be present in the literature is “HRV4Training” by Marco Altini. This app uses the device's camera to obtain PPG data from the user's fingertip, from which peak-to-peak intervals are used to calculate the root-mean-square of the successive differences and thereby HRV (1).
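The root-mean-square of the successive differences (RMSSD) referred to above is straightforward to compute once peak-to-peak intervals have been extracted; a minimal sketch with invented interval values, not "HRV4Training" code:

```python
import numpy as np

def rmssd(pp_intervals_ms):
    """Root-mean-square of successive differences (ms), the HRV metric
    computed from peak-to-peak (inter-beat) intervals as described in the text."""
    diffs = np.diff(np.asarray(pp_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# Example: intervals around 850 ms with some beat-to-beat variation
intervals = [850, 870, 840, 860, 855]
print(round(rmssd(intervals), 1))  # -> 20.8
```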
A recent article in press has described the validation of the app against an ECG machine (88), and it has been demonstrated that measurements from the “HRV4Training” App are sensitive enough to detect changes in HRV after intense training (1). Plews et al. (88) did not provide the name of the device used to validate the app against an ECG but did specify a frame rate requirement of 30 Hz. Furthermore, 2 studies implementing the app have collected data from 532 (2) and 797 (1) participants, respectively, demonstrating that it offers real potential to collect large amounts of free-living data outside laboratory settings.
Folke et al. (36) suggest that tidal volume (VT) and respiratory rate (RR) are the 2 basic vital signs that breathing monitoring should provide. Methods of recording VT typically include the use of a spirometer, which can be either portable (e.g., hand-held) or much larger (e.g., simple float). Respiratory rate can be obtained by simple human observation or through more sophisticated procedures such as breath-by-breath gas analysis or transthoracic impedance. Although Reyes et al. (91) acknowledge the existence of clinical measures of VT and RR, they also highlight the limitations and disadvantages of existing equipment, in particular the limited access outside clinical and research settings. Further limitations of existing methods include high costs, the need for specialist personnel, and lack of portability (79,91). Respiratory function can be assessed in numerous ways using different smartphone hardware, including the camera, microphone, and accelerometer.
Reyes et al. (91) used the frontal camera of an HTC One M8 smartphone with the Android v4.4.2 (KitKat) operating system to acquire a chest movement signal that demonstrated a strong relationship (r2 > 0.9) with a spirometer when recording VT. Nam et al. (79) demonstrated similar findings, concluding that breathing rate could be estimated accurately on the same HTC device. However, although Reyes et al. (91) did not find statistically significant bias in recording VT, the authors questioned whether the error estimate was acceptable for home use. Although the investigation demonstrated reliability and validity in estimating VT and RR, limitations inherent to contactless optical procedures remained. Motion artifacts are present in any contactless/noncontact optical procedure of data acquisition, and previous research has demonstrated that artifact removal improves estimation of RR (101,105). Furthermore, Nam et al. (79) suggested that clothing affected the video signal; for example, plain designs produced smaller relative changes in recorded chest and abdominal movements than striped or nonuniform designs. Beyond the limitations of data acquisition and processing, noncontact optical procedures for estimating respiratory parameters lack practical applicability to more general settings. Reyes et al.'s (91) procedure requires calibration with a spirometer for each individual use, and a qualitative observation of changes in VT is recommended if calibration instrumentation is not available. Reyes et al. (92) did extend their work to demonstrate the efficacy of smartphone use when calibrated with a low-cost incentive spirometer, whereby individuals inspired to a target volume. However, at this stage, it could be argued that there is a redundancy in using a smartphone to record respiratory parameters when additional equipment is still needed for calibration. Furthermore, Reyes et al. (92) themselves suggest “the development of an inexpensive and portable breathing monitoring system for on-demand VT, and RR estimation capabilities is still pending for the general population.” Therefore, technically, Reyes et al. (91,92) have developed software for a smartphone to record respiratory data independently, but reliability is questionable without the use of additional hardware.
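As a rough illustration of how a chest-movement signal can be converted to RR in the frequency domain, the dominant frequency within a plausible breathing band can be taken as the breathing rate. This is a generic sketch under stated assumptions, not the algorithm of Reyes et al.; the band limits and synthetic signal are invented.

```python
import numpy as np

def rr_from_motion(signal, fs):
    """Respiratory rate (breaths/min) as the dominant frequency of a
    chest-movement signal (e.g., mean pixel intensity of a chest region),
    restricted to an assumed breathing band of 0.1-1.0 Hz (6-60 breaths/min)."""
    x = np.asarray(signal) - np.mean(signal)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= 0.1) & (freqs <= 1.0)
    return 60.0 * freqs[band][np.argmax(power[band])]

# Synthetic 60-s signal at 30 fps with a 0.25-Hz (15 breaths/min) component
fs = 30
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
motion = np.sin(2 * np.pi * 0.25 * t) + 0.1 * rng.standard_normal(len(t))
print(round(rr_from_motion(motion, fs)))  # -> 15
```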
Both Reyes et al. (91) and Nam et al. (79) have demonstrated valid and reliable use of smartphone hardware to record parameters of lung function. However, in keeping with the theme of this article, neither group has investigated the validity and reliability of a specific smartphone application that is commercially available for public use. There is currently a range of apps available that provide estimations of RR obtained from tapping on the screen of a smartphone or tablet device, similar to apps such as “Tap the Pulse” (Orangesoft LLC, Minsk and Brest, Belarus) for determining heart rate. Current apps that use this procedure include “RRate” (PART BC Children's), “Medtimer” (Tigerpixel, Rickmansworth, UK), and “Medirate” (MobileMed Sarl, Lausanne, Switzerland). Karlen et al. (55) assessed the accuracy of the “RRate” app by showing prerecorded videos to hospital staff and asking them to tap on the screen of an iPod touch (third generation) every time the child in the video breathed. The purpose was to enhance the efficiency and accuracy of RR estimations by replacing absolute counts with continuous time intervals. It was reported that use of the app reduced collection time from 60 seconds to 8.1 ± 1.2 seconds, with a typical error of only 2.2 breaths per minute.
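The interval-based logic described for “RRate” can be sketched as follows: the median inter-tap interval is converted to a rate, so a full 60-second count becomes unnecessary. The tap times, minimum-tap threshold, and function names are illustrative assumptions, not the app's actual implementation.

```python
def rr_from_taps(tap_times_s, min_taps=5):
    """Estimate respiratory rate (breaths/min) from screen-tap timestamps
    by converting the median inter-tap interval to a rate."""
    if len(tap_times_s) < min_taps:
        raise ValueError("not enough taps for a stable estimate")
    intervals = sorted(t2 - t1 for t1, t2 in zip(tap_times_s, tap_times_s[1:]))
    median = intervals[len(intervals) // 2]  # simple midpoint for odd counts
    return 60.0 / median

# Taps roughly every 2 s correspond to ~30 breaths per minute
taps = [0.0, 2.0, 4.1, 6.0, 8.1, 10.0]
print(round(rr_from_taps(taps)))  # -> 30
```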
Anthropometry and Range of Motion
Body composition has been assessed in a number of ways including hydrostatic weighing (21) and Dual Energy X-ray Absorptiometry (DXA/DEXA) (53) with some disagreement on the gold standard. There is, however, agreement that these methods present difficulties such as expense, time-consumption, access, and portability (54,63). Such equipment is typically restricted to University laboratories and research settings, and therefore difficult to access for some practitioners such as primary health care workers, nutritionists, fitness instructors, and personal trainers.
With developments in technology comes the potential for more cost-effective solutions for measuring and assessing body composition. Farina et al. (29) consider 2D imaging, using frontal and lateral images obtained from a standard digital camera, an alternative to costly 3D systems. Using 2D images to provide accurate anthropometric data is not a new development (52). More recent applications of digitizing 2-dimensional images to provide anthropometric data include hand measurements for the production of work gloves (46). However, these applications of 2-dimensional images only provide surface measurements and do not make inferences about tissue composition. Farina et al. (29) examined the use of a smartphone's built-in camera to obtain digital whole-body images to estimate human body composition, finding a negligible (p = 0.96) 0.02 and 0.07 kg difference in estimated fat mass between the app and DXA in females and males, respectively. Images were captured using Android version 4.2.2 on a Huawei G730 smartphone (resolution 540 × 960 pixels, or 0.52 megapixels) or iOS 9.2 on an iPhone 5s (resolution 1,136 × 640 pixels, or 0.73 megapixels). The study used bespoke, in-house software as a proof of concept, suggesting that the findings were “promising” for the use of a smartphone application to monitor body fat. LeanScreen (PostureCo, Trinity, FL, USA) is a software application that uses 2-dimensional (2D) photographs taken with a smartphone or tablet to estimate percentage body fat (BF) by digitizing a series of girths. Shaw et al. (102) assessed the reliability of this application on an iPad mini against skinfold measurements and bioelectrical impedance, which were considered other field measures comparable with the use of a tablet device (i.e., in cost and portability). There were no significant differences between the methods for estimated percentage BF (%BF) (p = 0.818), and ICCs demonstrated the reliability of each method to be good (≥0.974).
However, the measurement error, as indicated by the coefficient of variation and typical error of measurement, was much lower for skinfold measurements and bioelectrical impedance (≤1.07 and ≤0.37, respectively) than for LeanScreen (6.47 and 1.6%). The authors concluded that the LeanScreen smartphone/tablet application is not suitable for a single, one-off measurement of %BF and that individual variance should be measured to determine the minimal worthwhile change.
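For reference, the typical error of measurement and coefficient of variation used by Shaw et al. (102) can be computed from test-retest data as follows. This is a common formulation (typical error as the SD of between-trial differences divided by √2); the %BF values are invented for illustration.

```python
import math

def typical_error(trial1, trial2):
    """Typical error of measurement from test-retest data:
    SD of the between-trial differences divided by sqrt(2)."""
    diffs = [b - a for a, b in zip(trial1, trial2)]
    mean_d = sum(diffs) / len(diffs)
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1))
    return sd_d / math.sqrt(2)

def cv_percent(trial1, trial2):
    """Coefficient of variation (%): typical error relative to the grand mean."""
    grand_mean = sum(trial1 + trial2) / (len(trial1) + len(trial2))
    return 100 * typical_error(trial1, trial2) / grand_mean

# Illustrative %BF values from two repeated trials of the same method
t1 = [20.0, 25.0, 18.0, 30.0]
t2 = [21.0, 24.5, 18.5, 31.0]
print(round(typical_error(t1, t2), 2), round(cv_percent(t1, t2), 1))  # -> 0.5 2.1
```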
Previous studies have investigated the use of smartphones in more applied anthropometric contexts such as posture assessment. PostureScreen Mobile is a smartphone application, from the same company that produced LeanScreen (PostureCo, Inc., Trinity, FL, USA), that assesses posture using 2-dimensional photographs taken by smartphone or tablet. Boland et al. (10) examined intrarater and interrater agreement of PostureScreen Mobile in assessing standing static posture on an iPad. The authors concluded that they had found acceptable levels of agreement for 3 examiners of varying experience. However, the investigators consisted of a doctor of physical therapy (U.S.-licensed physiotherapist) and 2 undergraduate students, with the authors making no reference to the students' program of study. Of the 13 postural measures that PostureScreen Mobile provides (head shift lateral, head shift longitudinal, head tilt, shoulder shift lateral, shoulder shift longitudinal, shoulder tilt, ribcage shift, hip shift lateral, hip shift longitudinal, hip tilt, head weight, effective head weight, and knee shift), interrater agreement (ICC) ranged from 0.10 to 1.00 in the fully clothed condition and from 0.26 to 1.00 in the minimal clothing condition. Boland et al. (10) rationalized their investigation by suggesting that the measures from the app would only have value if they were reliable across multiple trials. However, they only assessed intrarater agreement for the doctor of physical therapy. Considering that PostureScreen Mobile is commercially available to the public, the reliability of this app can be questioned on the basis of the investigation by Boland et al. (10).
In relation to specific postural anomalies, Driscoll et al. (27) used an iPhone 4 to examine the reliability of Scolioscreen (Spinologics, Inc., Montreal, Canada) for assessing adolescent idiopathic scoliosis by measuring the maximum angle of trunk inclination. The “Scolioscreen” app accompanies the physical Scolioscreen, a scoliometer designed to house any smartphone containing inclinometer hardware. The manufacturers state that the Scolioscreen can be combined with any app that measures inclinations. However, Driscoll et al. (27) investigated the reliability of both the Scolioscreen-smartphone combination and the smartphone alone. For all 3 investigators (spine surgeon, nurse, and patient's parent), intraobserver and interobserver reliability was higher with the Scolioscreen-smartphone combination (0.89–0.94) than with the smartphone alone (0.75–0.89). Furthermore, the smartphone alone demonstrated lower consistency (ICC = 0.86) with the gold standard (spine surgeon using a standard scoliometer) than the Scolioscreen-smartphone combination (ICC = 0.95). At this stage, using a smartphone independent of additional equipment does not offer an effective alternative for examining scoliosis.
The validity and reliability of goniometric data obtained using smartphone photography have previously been examined. “DrGoniometer” (CDM Srl, Milan, Italy) has been shown to validly measure flexion at the elbow and knee (31,33), as well as external rotation of the shoulder (71). In addition to providing reliable and valid measures of joint range of motion, photographic-based apps are advantageous in that they inevitably provide a lasting record of the measurement, i.e., the actual photograph (69). Although Ferriero et al. (32) propose potential applications of photographic-based apps in telemedicine, Milani et al. (69) argue that apps of this type have the same limitations as standard digital photography, such as handling instability and imprecision. Therefore, photographic-based apps offer little beyond a standard digital camera. Furthermore, conventional long-arm goniometers can be purchased at a lower cost than “DrGoniometer.” Given that photographic-based goniometry apps, like conventional long-arm goniometers, cannot record range of motion in dynamic conditions, it is argued that this type of smartphone application does not offer a more practical or cost-effective solution than existing instruments.
Accelerometer-based apps may provide an effective alternative to a conventional long-arm goniometer. These apps use the triaxial accelerometer hardware built into smartphones, which traditionally served as a position sensor for use in video games, to measure the inclination of the device (82). Ockendon and Gilbert (82) demonstrated high reliability (r = 0.947) and validity of a smartphone accelerometer-based app (iPhone 3GS, Apple, Cupertino, CA, USA). Furthermore, the authors also found greater interrater reliability compared with a traditional goniometer. Given that most practitioners who typically assess range of motion (e.g., physiotherapists and strength and conditioning coaches) would do so independently, it can be argued that interrater reliability is not relevant in this context. However, the same study did demonstrate superior intrarater reliability compared with the traditional method, offering support for accelerometer-based apps as a viable alternative to traditional methods of goniometry. Milani et al. (69) argue that accelerometer-based, photographic-based, and magnetometer-based apps all possess the same limitation in that they can only measure range of motion in static conditions. Therefore, for smartphone applications to be considered an effective alternative, they must be able to validly and reliably measure angular movement in dynamic conditions, e.g., active rotations. More recently, Bittel et al. (9) used the accelerometer of an iPhone 4 to measure extension and flexion movements concurrently with an isokinetic dynamometer at a range of speeds (30, 60, 90, 120, and 150°·s−1). The authors demonstrated limits of agreement of 2° between the smartphone and the dynamometer.
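The underlying principle of such apps is simple: in static conditions the accelerometer reads only gravity, so the device's tilt follows from the ratio of the gravity components. A minimal sketch of this calculation (not any app's actual code):

```python
import math

def inclination_deg(ax, ay):
    """Tilt of the device in the measurement plane, from the gravity
    components reported by a triaxial accelerometer in static conditions."""
    return math.degrees(math.atan2(ax, ay))

# Device upright: gravity lies along y; tilted 30 deg: x picks up g*sin(30)
g = 9.81
print(round(inclination_deg(g * math.sin(math.radians(30)),
                            g * math.cos(math.radians(30)))))  # -> 30
```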
To summarize, previous investigations have demonstrated interrater and intrarater reliability as well as validity of photography-based, accelerometer-based, and magnetometer-based goniometer apps. Although the review by Milani et al. (69) provides a comprehensive discussion on the efficacy of currently available smartphone apps, a more up-to-date review is required now that more recent investigations such as Bittel et al. (9) have demonstrated validity and reliability of the iPhone accelerometer to measure angular changes in dynamic conditions. However, there is currently no app commercially available with this specific function.
Capacity for Apps to Analyze Physical Performance
One of the main problems that strength and conditioning coaches face is how to objectively quantify the physical capabilities of their athletes (37,57). Measuring physical performance is, indeed, a key part of any training program because it allows the practitioner to monitor and adjust workloads (44,76), analyze fatigue (47,106), detect talent (38,72), identify weaknesses (97), or prevent injuries (16,67,68). Thus, a common practice when designing strength and conditioning programs is to measure specific variables of interest that could help in the prescription of the training stimulus (42,44,57,76); however, the technology and expertise required to do so are often expensive and non–user-friendly, especially for coaches or teams outside big organizations or universities. For this reason, smartphones, which currently include several sensors capable of measuring physical performance (such as heart rate monitors, GPS, or accelerometers), are gaining popularity in the fitness and health community (4,11,107). For example, fitness and health apps are among the top fitness trends in the list elaborated by the American College of Sports Medicine (107). However, just a few of the thousands of fitness apps available have been scientifically validated (11). Thus, the purpose of this section is to provide an updated review of some of the most relevant studies that have analyzed the validity and reliability of smartphone apps for the measurement of several variables related to physical performance.
Resistance training prescription is based on the well-known 1 repetition maximum (1RM) paradigm, by which intensities are prescribed as a percentage of the maximal load the athlete can lift just once (57,99). However, measuring the 1RM requires the performance of a maximal lift, which may not be appropriate for all populations, especially those with little expertise in lifting heavy weights, because it could lead to inaccurate results and might increase the risk of injury (44).
Several alternatives, such as performing repetitions to failure or using the rating of perceived exertion, have been used to predict the 1RM with submaximal loads (26,77). However, it has been advocated that the most accurate methodologies consist of measuring the speed of the barbell, because it has been extensively demonstrated that there is a very strong (r2 > 0.97) relationship between the load in terms of %1RM and the velocity at which each load is lifted (18,76,86). Thus, a new resistance training paradigm, often described as velocity-based training, has emerged based on systematic measurements of barbell velocity to adjust and prescribe training intensities, because each %1RM has a specific velocity range (22,44,76). The gold standard for the measurement of barbell velocity is the high-frequency linear transducer (23,76); however, its cost, above $2,000 in most cases, prevents its use in small organizations or clubs with few resources.
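The load-velocity approach can be sketched as a simple linear extrapolation: fit load against mean velocity over several submaximal sets, then evaluate the fit at a minimal velocity threshold. The 0.17 m·s−1 threshold below is a commonly cited bench press value and, like the sample data, should be treated as an assumption for illustration.

```python
def predict_1rm(loads_kg, velocities, v1rm=0.17):
    """Estimate 1RM from an individual load-velocity profile:
    least-squares fit of load = a*velocity + b over submaximal sets,
    extrapolated to an assumed minimal velocity threshold v1rm."""
    n = len(loads_kg)
    mv = sum(velocities) / n
    ml = sum(loads_kg) / n
    a = (sum((v - mv) * (l - ml) for v, l in zip(velocities, loads_kg))
         / sum((v - mv) ** 2 for v in velocities))
    b = ml - a * mv
    return a * v1rm + b

loads = [40, 60, 80]       # kg lifted in successive warm-up sets
vels = [1.00, 0.70, 0.40]  # mean barbell velocity (m/s) of each set
print(round(predict_1rm(loads, vels), 1))  # -> 95.3
```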
To address this limitation, an iOS app named “PowerLift” has recently been validated for the measurement of barbell velocity in the bench press exercise in resistance-trained males (5). To do this, the authors simultaneously measured several repetitions in a group of powerlifters with a linear transducer (working at 1 kHz) and the “PowerLift” app on an iPhone 6 (iOS 9.3.2), and then compared the results. “PowerLift,” which consists of the recording and subsequent analysis of a slow-motion video of the lift, made possible by the high-speed camera on the most recent iOS devices, was significantly correlated with the linear transducer (r = 0.94) and showed a small standard error of estimate (SEE = 0.008 m·s−1) in the measurement of barbell velocity. Moreover, there were no significant differences between the 1RM predicted from the velocity measured with the linear transducer and that predicted with the app, meaning that “PowerLift” could be a less expensive yet accurate and valid alternative for the estimation of maximal strength.
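The underlying calculation is elementary: mean velocity is the barbell's range of motion divided by the time elapsed between the frames where the lift starts and ends. A sketch with illustrative numbers (not "PowerLift" itself):

```python
def mean_velocity(range_of_motion_m, frame_start, frame_end, fps=240):
    """Mean concentric barbell velocity (m/s) from a slow-motion video:
    displacement divided by the time between the start and end frames.
    ROM and frame numbers here are illustrative."""
    duration = (frame_end - frame_start) / fps
    return range_of_motion_m / duration

# 0.45-m bench press ROM lifted between frames 100 and 340 at 240 fps (1.0 s)
print(round(mean_velocity(0.45, 100, 340), 2))  # -> 0.45
```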
Muscular Power or Impulse: Vertical Jump Height
The measurement of vertical jump height has been used extensively in the literature to assess muscle power, detect talent, or analyze neuromuscular fatigue (6,23,58,95). Considering that vertical jumping is an essential ability in many sports (4,25,95), its measurement is often a key part of any performance analysis. Several approaches have been proposed to measure the height an athlete can reach during a vertical jump (7,30,40,95), although the most accurate typically consist of the measurement of either the take-off velocity or the flight time of the jump. This is because these parameters allow the vertical displacement of the center of mass to be calculated using well-known Newtonian equations (95). Although force platforms, which measure the take-off velocity of the athlete, are often considered the gold standard for the measurement of vertical jumps (23,95), several systems based on the detection of flight time (such as infrared platforms) have become popular in the strength and conditioning community because they are less expensive, more portable, and can still provide very accurate measures of jump height (4,7,43). One of those systems is an iPhone app (“My Jump”) that measures the flight time of the jump, thanks to the slow-motion recording capabilities of the iPhone 5s and later (4,103). With a simple video analysis in which the take-off and landing of the jump are visually identified by the user within the app, “My Jump” calculates the flight time of the jump in an accurate, valid, and reliable way. The performance of the app has been confirmed widely in the literature over recent years, showing levels of correlation above 0.96 and a systematic bias of less than 10 mm in comparison with reference systems (4,39,104).
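The flight-time calculation referred to above follows from projectile motion with equal take-off and landing heights: h = g·t²/8. As a sketch:

```python
def jump_height_m(flight_time_s, g=9.81):
    """Jump height from flight time, assuming take-off and landing occur
    at the same center-of-mass height: h = g * t^2 / 8."""
    return g * flight_time_s ** 2 / 8

# 0.5 s of flight corresponds to roughly a 30.7-cm jump
print(round(jump_height_m(0.5), 3))  # -> 0.307
```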
Human Locomotion: Running and Sprinting
The analysis of human locomotion is of great interest for both performance and injury prevention purposes (68,74,80,96). For example, several mechanical variables such as ground-contact time, leg stiffness, and the horizontal force applied to the ground have been shown to be related to running and maximal sprinting performance (73,93,96). Moreover, studies have suggested that between-leg asymmetries in some of these variables could serve as a relevant indicator of injury risk (12,50). As with the performance variables described above, the measurement of running and sprinting mechanics has usually required advanced measurement systems such as instrumented treadmills, force platforms, timing gates, or radar guns (15,75,94): expensive technology to which most coaches do not have access. Using the same approach as the jumping and resistance training apps mentioned above, 2 new apps also based on high-speed video analysis were recently validated for the measurement of running and sprinting mechanics on an iPhone 6 (iOS 9.2.1, 240 frames per second) (3,94). The first, “Runmatic,” was tested against an infrared platform for the detection of contact and flight times during running at speeds ranging from 10 to 20 km·h−1 in male runners (3). Moreover, the app makes use of validated spring-mass model equations that allow the calculation of different mechanical variables based on contact time, flight time, and simple anthropometrics (74). The app was shown to be valid and reliable for the measurement of leg stiffness, vertical oscillation of the center of mass, maximal force applied to the ground, and stride frequency (r = 0.94–0.99, bias = 2.2–6.5%). The second, “My Sprint,” was also shown to be highly valid and reliable for the measurement of 30-m sprint time and the production of horizontal force, velocity, and power in male sprinters in comparison with timing gates and a radar gun, with no significant differences between devices (94).
Thus, these apps allow practitioners to measure important variables related to running and maximal sprinting without the need for any advanced instruments.
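The spring-mass model calculations that Runmatic builds on (74) require only contact time, flight time, running speed, and simple anthropometrics. A minimal sketch in Python of the published equations from Morin et al. (74); the function and variable names are our own, and the input values in the usage example are purely illustrative:

```python
import math

def spring_mass_model(tc, tf, mass, leg_length, velocity, g=9.81):
    """Estimate running mechanics from contact and flight times (after ref. 74).

    tc: ground-contact time (s); tf: flight time (s); mass: body mass (kg);
    leg_length: standing leg length (m); velocity: running speed (m/s).
    """
    # Peak vertical ground reaction force, modeling force as a sine wave over contact
    f_max = mass * g * (math.pi / 2) * (tf / tc + 1)
    # Vertical displacement of the center of mass during contact
    delta_z = (f_max * tc ** 2) / (mass * math.pi ** 2) + g * tc ** 2 / 8
    # Vertical stiffness: peak force over vertical CoM displacement
    k_vert = f_max / delta_z
    # Leg compression: geometric shortening over the contact length plus CoM displacement
    delta_l = leg_length - math.sqrt(leg_length ** 2 - (velocity * tc / 2) ** 2) + delta_z
    # Leg stiffness: peak force over leg compression
    k_leg = f_max / delta_l
    step_freq = 1.0 / (tc + tf)
    return {"f_max": f_max, "k_vert": k_vert, "k_leg": k_leg, "step_freq": step_freq}

# Illustrative values for a 70-kg runner at 4 m/s (not from the cited studies)
mechanics = spring_mass_model(tc=0.25, tf=0.12, mass=70.0, leg_length=0.95, velocity=4.0)
```

The app automates the measurement of tc and tf from the 240 frames-per-second video; the arithmetic above is what turns those times into stiffness and force estimates.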
Distance Tracking Using Global Positioning System and Accelerometer Sensors
Regarding running, probably the most popular variable in the sports technology industry is the distance covered, derived from GPS signals (and, consequently, running pace) (14,28,48). Several wearable devices (mainly watches) have been used in both practice and research to measure running distances (13,78), although the inclusion of GPS sensors in most smartphones in recent years has catalyzed the creation of apps that take advantage of this technology to track distance and running pace (24). In fact, distance trackers are among the top 20 fitness trends for 2017 (107); however, there is a lack of evidence regarding their validity and reliability. One recent study analyzed the validity and reliability of an iOS app designed to measure distances during running using the GPS included in iPhone smartphones (8). To do this, researchers had subjects run on a 400-m track for a total of 2,400 m while wearing an iPhone in an armband and then compared the distance and speed values reported by the app with the actual values. The app underestimated both distance and speed by 3–4%, an absolute difference of approximately 100 m and 0.7 km·h−1, respectively. However, the good test-retest reliability observed (i.e., comparing values in 2 separate trials) and the relatively low bias between the app and the actual distance led the authors to conclude that the app may be appropriate for tracking running in the general population, although it may not be adequate for trained athletes.
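The 3–4% underestimation reported in that study (8) corresponds to a simple signed percent-bias calculation against the known track distance. A sketch with illustrative numbers; the app reading below is hypothetical, chosen only to match the reported magnitude:

```python
def percent_bias(measured: float, reference: float) -> float:
    """Signed percent difference of a measurement from a known reference value."""
    return 100.0 * (measured - reference) / reference

# Hypothetical app reading of 2,300 m over a known 2,400-m lap-counted distance
bias = percent_bias(2300.0, 2400.0)
print(round(bias, 1))  # -4.2, i.e., a roughly 4% underestimation
```

The same calculation applies to speed: an underestimation of 0.7 km·h−1 at typical running speeds falls in the same 3–4% range.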
Another widespread variable related to walking and running is step count (20,64). Specifically, it has been proposed that a minimum of 10,000 steps per day is associated with good levels of daily physical activity and health status (20,64). Accordingly, many of the most popular wearable devices on the market focus on step tracking, using acceleration data to provide users with information about their step count (11,19,60). Because smartphones also include accelerometers, thousands of step-tracking apps have been developed to count users' steps without the need for external devices. However, a recent study showed that these apps lack accuracy in comparison with a professional pedometer, probably because of the low quality of the accelerometers included in most smartphones (61). In this investigation, researchers compared a reference pedometer with 3 Android-based step-tracking apps ("Runtastic," "Pacer Works," and "Tayutau") on a Samsung Galaxy S4 GT-I9500 under laboratory conditions, and on each participant's own smartphone in a free-living setting.
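To illustrate why smartphone step counting is sensitive to sensor quality, a naive accelerometer-based counter simply detects rising crossings of a threshold in the acceleration magnitude; noisy or low-resolution signals easily produce spurious crossings or miss genuine peaks. A minimal sketch, assuming a fixed threshold that is illustrative and not taken from the cited studies:

```python
import math

def count_steps(samples, threshold=1.2, g=9.81):
    """Count upward crossings of an acceleration-magnitude threshold.

    samples: iterable of (ax, ay, az) accelerometer readings in m/s^2.
    threshold: magnitude, in units of g, above which a step peak is assumed.
    """
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az) / g  # magnitude in g
        if mag > threshold and not above:  # rising edge of a peak -> one step
            steps += 1
        above = mag > threshold
    return steps
```

Real pedometer apps add filtering, adaptive thresholds, and debouncing on top of this idea; the accuracy differences reported in (61) largely reflect how well such processing compensates for each handset's accelerometer.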
A summary of the currently available apps described in the scientific literature is provided in Tables 1 and 2 of this review. Mobile apps have the potential to transform data collection in the field, particularly for practitioners who face space, cost, and time constraints. A number of apps have been validated for collecting physiological and anatomical measurements such as heart rate and range of motion, and physical performance measurements such as vertical jump height, barbell velocity, and contact times. However, practitioners and athletes should exercise caution and remain critical when integrating apps into their training practices, as this review has identified areas where research support is lacking. Furthermore, although the accuracy of some apps has been validated, their low cost makes them widely available to a lay audience; it is therefore important that app developers consider implementing clear guidance on result interpretation for all potential users. A final consideration is the limited information on transfer between devices, because most articles tested the apps on a single platform and manufacturers release regular technological updates. Care has been taken in this review to provide as much information as possible about the devices used in the described studies, and readers should judge the appropriateness for their own device.
Carlos Balsalobre-Fernández is the developer of the PowerLift, My Jump, and Runmatic apps described in this review.
1. Altini M, Amft O. HRV4Training: Large-scale longitudinal training load analysis in unconstrained free-living settings using a smartphone application. Conf Proc IEEE Eng Med Biol Soc, Orlando, USA, 2016.
2. Altini M, Van Hoof C, Amft O. Relation between estimated cardiorespiratory fitness and running performance in free-living: An analysis of HRV4Training data. Presented at International Conference on Biomedical and Health Informatics, Orlando, USA, 2017.
3. Balsalobre-Fernández C, Agopyan H, Morin JB. The validity and reliability of an iPhone app for measuring running mechanics. J Appl Biomech 33: 222–226, 2017.
4. Balsalobre-Fernández C, Glaister M, Lockey RA. The validity and reliability of an iPhone app for measuring vertical jump performance. J Sports Sci 33: 1574–1579, 2015.
5. Balsalobre-Fernández C, Marchante D, Muñoz-López M, Jiménez SL. Validity and reliability of a novel iPhone app for the measurement of barbell velocity and 1RM on the bench-press exercise. J Sports Sci 36: 64–70, 2018.
6. Balsalobre-Fernández C, Tejero-González CM, Del Campo-Vecino J. Hormonal and neuromuscular responses to high level middle and long-distance competition. Int J Sport Physiol Perf 9: 839–844, 2014.
7. Balsalobre-Fernandez C, Tejero-Gonzalez CM, Del Campo-Vecino J, Bavaresco N. The concurrent validity and reliability of a low-cost, high-speed camera-based method for measuring the flight time of vertical jumps. J Strength Cond Res 28: 528–533, 2014.
8. Benson AC, Bruce L, Gordon BA. Reliability and validity of a GPS-enabled iPhone™ “app” to measure physical activity. J Sports Sci 33: 1421–1428, 2015.
9. Bittel AJ, Elazzazi A, Bittel DC. Accuracy and precision of an accelerometer-based smartphone app designed to monitor and record angular movement over time. Telemed J E Health 22: 302–309, 2016.
10. Boland DM, Neufeld EV, Ruddell J, Dolezal BA, Cooper CB. Inter- and intra-rater agreement of static posture analysis using a mobile application. J Phys Ther Sci 28: 3398–3402, 2016.
11. Bort-Roig J, Gilson ND, Puig-Ribera A, Contreras RS, Trost SG. Measuring and influencing physical activity with smartphone technology: A systematic review. Sports Med 44: 671–686, 2014.
12. Brown SR, Feldman ER, Cross MR, Helms ER, Marrier B, Samozino P, Morin JB. The potential for a targeted strength training programme to decrease asymmetry and increase performance: A proof-of-concept in sprinting. Int J Sport Physiol Perf 24: 1–13, 2017.
13. Buchheit M, Al Haddad H, Simpson BM, Palazzi D, Bourdon PC, Di Salvo V, Mendez-Villanueva A. Monitoring accelerations with GPS in football: Time to slow down? Int J Sport Physiol Perf 9: 442–445, 2014.
14. Buchheit M, Gray A, Morin JB. Assessing stride variables and vertical stiffness with GPS-embedded accelerometers: Preliminary insights for the monitoring of neuromuscular fatigue on the field. J Sport Sci Med 14: 698–701, 2015.
15. Bundle MW, Powell MO, Ryan LJ. Design and testing of a high-speed treadmill to measure ground reaction forces at the limit of human gait. Med Eng Phys 37: 892–897, 2015.
16. Butler RJ, Crowell HP, Davis IM. Lower extremity stiffness: Implications for performance and injury. Clin Biomech 18: 511–517, 2003.
17. Cardinale M, Varley MC. Wearable training monitoring technology: Applications, challenges and opportunities. Int J Sport Physiol Perf 12: 1–23, 2017.
18. Chapman M, Larumbe-Zabala E, Goss-Sampson M, Colpus M, Triplett NT, Naclerio F. Perceptual, mechanical and electromyographic responses to different relative loads in the parallel squat. J Strength Cond Res 33: 8–16, 2019.
19. Chowdhury EA, Western MJ, Nightingale TE, Peacock OJ, Thompson D. Assessment of laboratory and daily energy expenditure estimates from consumer multi-sensor physical activity monitors. PLoS One 12: e0171720, 2017.
20. Chu AHY, Ng SHX, Paknezhad M, Gauterin A, Koh D, Brown MS, Müller-Riemenschneider F. Comparison of wrist-worn Fitbit Flex and waist-worn ActiGraph for measuring steps in free-living adults. PLoS One 12: e0172535, 2017.
21. Colantonio E, Dâmaso AR, Caranti DA, Pinheiro MM, Tufik S, de Mello MT. Clinical performance of 3 body fat measurements in obese adolescents 15 to 18 years old. Rev Bras Med 72: 77–82, 2015.
22. Conceição F, Fernandes J, Lewis M, Gonzaléz-Badillo JJ, Jimenéz-Reyes P. Movement velocity as a measure of exercise intensity in three lower limb exercises. J Sports Sci 34: 1099–1106, 2016.
23. Cormie P, Deane R, McBride JM. Methodological concerns for determining power output in the jump squat. J Strength Cond Res 21: 424–430, 2007.
24. Del Rosario MB, Redmond SJ, Lovell NH. Tracking the evolution of smartphone sensing for monitoring human movement. Sensors (Basel) 15: 18901–18933, 2015.
25. Delextrat A, Cohen D. Physiological testing of basketball players: Toward a standard evaluation of anaerobic fitness. J Strength Cond Res 22: 1066–1072, 2008.
26. Dohoney P, Chromiak JA, Lemire D, Abadie BR, Kovacs C. Prediction of one repetition maximum (1-RM) strength from a 4–6 RM and a 7–10 RM submaximal strength test in healthy young adult males. J Exer Physiol Online 5: 54–59, 2002.
27. Driscoll M, Fortier-Tougas C, Labelle H, Parent S, Mac-Thiong JM. Evaluation of an apparatus to be combined with a smartphone for the early detection of spinal deformities. Scoliosis 9: 10, 2014.
28. Ehrmann FE, Duncan CS, Sindhusake D, Franzsen WN, Greene DA. GPS and injury prevention in professional soccer. J Strength Cond Res 30: 360–367, 2015.
29. Farina GL, Spataro F, De Lorenzo A, Lukaski H. A smartphone application for personal assessments of body composition and phenotyping. Sensors (Basel) 16: 2163, 2016.
30. Ferreira LC, Schilling BK, Weiss LW, Fry AC, Chiu LZF. Reach height and jump displacement: Implications for standardization of reach determination. J Strength Cond Res 24: 1596–1601, 2010.
31. Ferriero G, Sartorio F, Foti C, Primavera D, Brigatti E, Vercelli S. Reliability of a new application for smartphones (DrGoniometer) for elbow angle measurement. PM R 3: 1153–1154, 2011.
32. Ferriero G, Vercelli S, Sartorio F, Foti C. Accelerometer- and photographic-based smartphone applications for measuring joint angle: Are they reliable? J Arthroplasty 29: 448–449, 2014.
33. Ferriero G, Vercelli S, Sartorio F, Lasa SM, Ilieva E, Brigatti E, Ruella C, Foti C. Reliability of a smartphone-based goniometer for knee joint goniometry. Int J Rehab Res 36: 146–151, 2013.
34. Flatt AA, Esco MR. Validity of the ithlete™ smart phone application for determining ultra-short-term heart rate variability. J Hum Kinet 39: 85–92, 2013.
35. Flatt AA, Esco MR. Evaluating individual training adaptation with smartphone-derived heart rate variability in a collegiate female soccer team. J Strength Cond Res 30: 378–385, 2016.
36. Folke M, Cernerud L, Ekström M, Hök B. Critical review of non-invasive respiratory monitoring in medical care. Med Biol Eng Comput 41: 377–383, 2003.
37. Folland JP, Williams AG. The adaptations to strength training. Sports Med 37: 145–168, 2007.
38. Gabbett T, Georgieff B, Domrow N. The use of physiological, anthropometric, and skill data to predict selection in a talent-identified junior volleyball squad. J Sports Sci 25: 1337–1344, 2007.
39. Gallardo-Fuentes F, Gallardo-Fuentes J, Ramírez-Campillo R, Balsalobre-Fernández C, Martínez C, Caniuqueo A, Cañas R, Banzer W, Loturco I, Nakamura FY, Izquierdo M. Intersession and intrasession reliability and validity of the my jump app for measuring different jump actions in trained male and female athletes. J Strength Cond Res 30: 2049–2056, 2016.
40. García-Ramos A, Štirn I, Padial P, Argüelles-Cienfuegos J, De B, Strojnik V, Feriche B. Predicting vertical jump height from bar velocity. J Sports Sci Med 14: 256–262, 2015.
41. Garner RT, Wagner DR. Validity of certified trainer-palpated and exercise-palpated post-exercise heart rate. J Ex Phys Online 16, 2013.
42. Giroux C, Rabita G, Chollet D, Guilhem G. What is the best method for assessing lower limb force-velocity relationship? Int J Sports Med 36: 143–149, 2015.
43. Glatthorn JF, Gouge S, Nussbaumer S, Stauffacher S, Impellizzeri FM, Maffiuletti NA. Validity and reliability of optojump photoelectric cells for estimating vertical jump height. J Strength Cond Res 25: 556–560, 2011.
44. Gonzalez-Badillo JJ, Sánchez-Medina L. Movement velocity as a measure of loading intensity in resistance training. Int J Sports Med 31: 347–352, 2010.
45. Guede-Fernández F, Ferrer-Mileo V, Ramos-Castro J, Fernández-Chimeno M, García-González MA. Real time heart rate variability assessment from android smartphone camera photoplethysmography: Postural and device influences. Conf Proc IEEE Eng Med Biol Soc 2015: 7332–7335, 2015.
46. Habibi E, Soury S, Zadeh AH. Precise evaluation of anthropometric 2D software processing of hand in comparison with direct method. J Med Signals Sens 3: 256–261, 2013.
47. Halson SL. Monitoring training load to understand fatigue in athletes. Sports Med 44: 139–147, 2014.
48. Haugen T, Buchheit M. Sprint running performance monitoring: Methodological and practical considerations. Sports Med 46: 641–656, 2016.
49. Heathers JA. Smartphone-enabled pulse rate variability: An alternative methodology for the collection of heart rate variability in psychophysiological research. Int J Psychophysiol 89: 297–304, 2013.
50. Hewit J, Cronin J, Hume P. Multidirectional leg asymmetry assessment in sport. Strength Cond J 34: 82–86, 2012.
51. Ho CL, Fu YC, Lin MC, Chan SC, Hwang B, Jan SL. Smartphone applications (apps) for heart rate measurement in children: Comparison with electrocardiography monitor. Pediatr Cardiol 35: 726–731, 2014.
52. Hung PCY, Witana CP, Goonetilleke RS. Anthropometric measurements from photographic images. Comput Syst 29: 764–769, 2004.
53. Hussain Z, Jafar T, uz Zaman M, Parveen R, Saeed F. Correlations of skin fold thickness and validation of prediction equations using DEXA as the gold standard for estimation of body fat composition in Pakistani children. BMJ Open 4: e004194, 2014.
54. Kälvesten J, Lui LY, Brismar T, Cummings S. Digital X-ray radiogrammetry in the study of osteoporotic fractures: Comparison to dual energy X-ray absorptiometry and FRAX. Bone 86: 30–35, 2016.
55. Karlen W, Gan H, Chiu M, Dunsmuir D, Zhou G, Dumont GA, Ansermino JM. Improving the accuracy and efficiency of respiratory rate measurements in children using mobile devices. PLoS One 9: e99266, 2014.
56. Kong L, Zhao Y, Dong L, Jian Y, Jin X, Li B, Feng Y, Liu M, Liu X, Wu H. Non-contact detection of oxygen saturation based on visible light imaging device using ambient light. Opt Express 21: 17464–17471, 2013.
57. Kraemer WJ, Ratamess NA. Fundamentals of resistance training: Progression and exercise prescription. Med Sci Sport Exer 36: 674–688, 2004.
58. Laffaye G, Wagner PP, Tombleson TIL. Countermovement jump height: Gender and sport-specific differences in the force-time variables. J Strength Cond Res 28: 1096–1105, 2014.
59. Laukkanen RM, Virtanen PK. Heart rate monitors: State of the art. J Sports Sci 16: 3–7, 1998.
60. Lee JM, Kim Y, Welk GJ. Validity of consumer-based physical activity monitors. Med Sci Sports Exerc 46: 1840–1848, 2014.
61. Leong JY, Wong JE. Accuracy of three Android-based pedometer applications in laboratory and free-living settings. J Sports Sci 35: 14–21, 2016.
62. Losa-Iglesias ME, Becerro-de-Bengoa-Vallejo R, Becerro-de-Bengoa-Losa KR. Reliability and concurrent validity of a peripheral pulse oximeter and health–app system for the quantification of heart rate in healthy adults. Health Inform J 22: 151–159, 2016.
63. Lowry DW, Tomiyama AJ. Air displacement plethysmography versus dual-energy x-ray absorptiometry in underweight, normal-weight, and overweight/obese individuals. PLoS One 10: e0115086, 2015.
64. Mantovani AM, Duncan S, Codogno JS, Lima MCS, Fernandes RA. Different amounts of physical activity measured by pedometer and the associations with health outcomes in adults. J Phys Act Health 13: 1183–1191, 2016.
65. McManus DD, Chong JW, Soni A, Saczynski JS, Esa N, Napolitano C, Darling CE, Boyer E, Rosen RK, Floyd KC. Pulse-smart: Pulse-based arrhythmia discrimination using a novel smartphone application. J Cardiovasc Electrophysiol 27: 51–57, 2016.
66. McManus DD, Lee J, Maitas O, Esa N, Pidikiti R, Carlucci A, Harrington J, Mick E, Chon KH. A novel application for the detection of an irregular pulse using an iPhone 4S in patients with atrial fibrillation. Heart Rhythm 10: 315–319, 2013.
67. Mendiguchia J, Martinez-Ruiz E, Edouard P, Morin JB, Martinez-Martinez F, Idoate F, Mendez-Villanueva A. A multifactorial, criteria-based progressive algorithm for hamstring injury treatment. Med Sci Sport Exer 49: 1482–1492, 2017.
68. Mendiguchia J, Samozino P, Martínez-Ruiz E, Brughelli M, Schmikli S, Morin JB, Méndez-Villanueva A. Progression of mechanical properties during on-field sprint running after returning to sports from a hamstring muscle injury in soccer players. Int J Sports Med 35: 690–695, 2014.
69. Milani P, Coccetta CA, Rabini A, Sciarra T, Massazza G, Ferriero G. Mobile smartphone applications for body position measurement in rehabilitation: A review of goniometric tools. PM R 6: 1038–1043, 2014.
70. Mitchell K, Graff M, Hedt C, Simmons J. Reliability and validity of a smartphone pulse rate application for the assessment of resting and elevated pulse rate. Physiother Theor Pract 32: 494–499, 2016.
71. Mitchell K, Gutierrez SB, Sutton S, Morton S, Morgenthaler A. Reliability and validity of goniometric iPhone applications for the assessment of active shoulder external rotation. Physiother Theor Pract 30: 521–525, 2014.
72. Mohamed H, Vaeyens R, Matthys S, Multael M, Lefevre J, Lenoir M, Philppaerts R. Anthropometric and performance measures for the development of a talent detection and identification model in youth handball. J Sports Sci 27: 257–266, 2009.
73. Moore IS. Is there an economical running technique? A review of modifiable biomechanical factors affecting running economy. Sports Med 46: 793–807, 2016.
74. Morin JB, Dalleau G, Kyröläinen H, Jeannin T, Belli A. A simple method for measuring stiffness during running. J Appl Biomech 21: 167–180, 2005.
75. Morin JB, Slawinski J, Dorel S, Saez de Villarreal E, Couturier A, Samozino P, Brughelli M, Rabita G. Acceleration capability in elite sprinters and ground impulse: Push more, brake less? J Biomech 48: 3149–3154, 2015.
76. Muñoz-López M, Marchante D, Cano-Ruiz MA, Chicharro JL, Balsalobre-Fernández C. Load, force and power-velocity relationships in the prone pull-up exercise. Int J Sport Physiol Perf 2: 1–22, 2017.
77. Naclerio F, Larumbe-Zabala E. Relative load prediction by velocity and the omni-res 0–10 scale in parallel squat. J Strength Cond Res 31: 1585–1591, 2017.
78. Nagahara R, Botter A, Rejc E, Koido M, Shimizu T, Samozino P, Morin JB. Concurrent validity of GPS for deriving mechanical properties of sprint acceleration. Int J Sport Physiol Perf 12: 129–132, 2016.
79. Nam Y, Kong Y, Reyes B, Reljin N, Chon KH. Monitoring of heart and breathing rates using dual cameras on a smartphone. PLoS One 11: e0151013, 2016.
80. Nielsen RO, Buist I, Parner ET, Nohr EA, Sørensen H, Lind M, Rasmussen S. Foot pronation is not associated with increased injury risk in novice runners wearing a neutral shoe: A 1-year prospective cohort study. Br J Sport Med 48: 440–447, 2014.
81. Nunan D, Jakovljevic DG, Donovan G, Hodges LD, Sandercock GR, Brodie DA. Levels of agreement for RR intervals and short-term heart rate variability obtained from the Polar S810 and an alternative system. Eur J Appl Physiol 103: 529–537, 2008.
82. Ockendon M, Gilbert RE. Validation of a novel smartphone accelerometer-based knee goniometer. J Knee Surg 25: 341–346, 2012.
83. Peart DJ, Shaw MP, Rowley CG. Validity of freely available mobile applications for recording resting heart rate. Ann Biol Res 5: 11–15, 2014.
84. Peart DJ, Shaw MP, Rowley CG. An investigation into a contactless photoplethysmographic mobile application to record heart rate post-exercise: Implications for field testing. Biomed Hum Kinet 7: 95–99, 2015.
85. Pelegris P, Banitsas K, Orbach T, Marias K. A novel method to detect heart beat rate using a mobile phone. Conf Proc IEEE Eng Med Biol Soc 2010: 5488–5491, 2010.
86. Picerno P, Iannetta D, Comotto S, Donati M, Pecoraro F, Zok M, Tollis G, Figura M, Varalda C, Di Muzio D, Patrizio F, Piacentini MF. 1RM prediction: A novel methodology based on the force–velocity and load–velocity relationships. Eur J Appl Physiol 116: 1–9, 2016.
87. Plews DJ, Laursen PB, Stanley J, Kilding AE, Buchheit M. Training adaptation and heart rate variability in elite endurance athletes: Opening the door to effective monitoring. Sports Med 43: 773–781, 2013.
88. Plews DJ, Scott B, Altini M, Wood M, Kilding AE, Laursen PB. Comparison of heart rate variability recording with smart phone photoplethysmographic, Polar H7 chest strap and electrocardiogram methods. Int J Sport Physiol Perf 14: 1–17, 2017.
89. Poh MZ, McDuff DJ, Picard RW. Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Opt Express 18: 10762–10774, 2010.
90. Popescu AL, Ionescu RT, Popescu D. Cardiowatch: A solution for monitoring the heart rate on a mobile device. UPB Scientific Bull 78: 63–74, 2016.
91. Reyes B, Reljin N, Kong Y, Nam Y, Chon K. Tidal volume and instantaneous respiration rate estimation using a smartphone camera. IEEE J Biomed Health Inform 21: 764–777, 2016.
92. Reyes BA, Reljin N, Kong Y, Nam Y, Ha S, Chon KH. Employing an incentive spirometer to calibrate tidal volumes estimated from a smartphone camera. Sensors (Basel) 16: 397, 2016.
93. Rogers SA, Whatman CS, Pearson SN, Kilding AE. Assessments of mechanical stiffness and relationships to performance determinants in middle-distance runners. Int J Sport Physiol Perf 14: 1–23, 2017.
94. Romero-Franco N, Jiménez-Reyes P, Castaño-Zambudio A, Capelo-Ramírez F, Rodríguez-Juan JJ, González-Hernández J, Toscano-Bendala FJ, Cuadrado-Peñafiel V, Balsalobre-Fernández C. Sprint performance and mechanical outputs computed with an iPhone app: Comparison with existing reference methods. Eur J Sport Sci 17: 1–7, 2016.
95. Samozino P, Morin JB, Hintzy F, Belli A. A simple method for measuring force, velocity and power output during squat jump. J Biomech 41: 2940–2945, 2008.
96. Samozino P, Rabita G, Dorel S, Slawinski J, Peyrot N, Saez de Villarreal E, Morin JB. A simple method for measuring power, force, velocity properties, and mechanical effectiveness in sprint running. Scand J Med Sci Sports 26: 648–658, 2015.
97. Samozino P, Rejc E, Di Prampero PE, Belli A, Morin JB. Optimal force-velocity profile in ballistic movements–altius: Citius or fortius? Med Sci Sports Exerc 44: 313–322, 2012.
98. Sardana M, Saczynski J, Esa N, Floyd K, Chon K, Chong JW, McManus D. Performance and usability of a novel smartphone application for atrial fibrillation detection in an ambulatory population referred for cardiac monitoring. J Am Coll Cardiol 67: 844, 2016.
99. Schoenfeld BJ. Is there a minimum intensity threshold for resistance training-induced hypertrophic adaptations? Sports Med 43: 1279–1288, 2013.
100. Scully CG, Lee J, Meyer J, Gorbach AM, Granquist-Fraser D, Mendelson Y, Chon KH. Physiological parameter monitoring from optical recordings with a mobile phone. IEEE Trans Biomed Eng 59: 303–306, 2012.
101. Shao D, Yang Y, Liu C, Tsow F, Yu H, Tao N. Noncontact monitoring breathing pattern, exhalation flow rate and pulse transit time. IEEE Trans Biomed Eng 61: 2760–2767, 2014.
102. Shaw MP, Robinson J, Peart DJ. Comparison of a mobile application to estimate percentage body fat to other non-laboratory based measurements. Biomed Hum Kinet 9: 94–98, 2017.
103. Stanton R, Kean CO, Scanlan AT. My Jump for vertical jump assessment. Br J Sport Med 49: 1157, 2015.
104. Stanton R, Wintour S-A, Kean CO. Validity and intra-rater reliability of MyJump app on iPhone 6s in jump performance. J Sci Med Sport 20: 518–523, 2017.
105. Sun Y, Hu S, Azorin-Peris V, Greenwald S, Chambers J, Zhu Y. Motion-compensated noncontact imaging photoplethysmography to monitor cardiorespiratory status during exercise. J Biomed Opt 16: 077010–077019, 2011.
106. Taylor JL, Amann M, Duchateau J, Meeusen R, Rice CL, Taylor J. Neural contributions to muscle fatigue: From the brain to the muscle and back again. Med Sci Sport Exer 48: 2294–2306, 2016.
107. Thompson WR. Worldwide survey of fitness trends for 2017. ACSMs Health Fit J 20: 8–17, 2016.
108. Vanderlei L, Silva R, Pastre C, Azevedo FMd, Godoy M. Comparison of the Polar S810i monitor and the ECG for the analysis of heart rate variability in the time and frequency domains. Braz J Med Biol Res 41: 854–859, 2008.
109. Wackel P, Beerman L, West L, Arora G. Tachycardia detection using smartphone applications in pediatric patients. J Pediatr 164: 1133–1135, 2014.