US20160058378A1 - System and method for providing an interpreted recovery score - Google Patents
System and method for providing an interpreted recovery score
- Publication number
- US20160058378A1 (application No. US14/934,084)
- Authority
- US
- United States
- Prior art keywords
- user
- recovery
- activity
- score
- interpreted
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1118—Determining activity level
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6814—Head
- A61B5/6815—Ear
- A61B5/6817—Ear canal
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/024—Measuring pulse rate or heart rate
- A61B5/02405—Determining heart rate variability
- A61B5/02416—Measuring pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
- A61B5/02427—Details of sensor
- A61B5/48—Other medical applications
- A61B5/486—Biofeedback
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
Definitions
- the present disclosure relates generally to fitness monitoring devices, and more particularly to a system and method for providing an interpreted recovery score.
- Previous-generation heart rate monitors and fitness tracking devices generally enabled only monitoring of a user's heart rate.
- Currently available fitness tracking devices now add functionality that measures the user's heart rate variability.
- One issue with currently available fitness tracking devices and heart rate monitors is that they do not account for the performance or recovery state of the user in a scientific, user-specific way. In other words, currently available solutions do not normalize the heart rate variability measurement to be specific to the user.
- Another issue is that currently available solutions do not learn how the user's normal recovery levels are reflected in measurements of the user's heart rate variability.
- Embodiments of the present disclosure include systems and methods for providing an interpreted recovery score.
- the apparatus includes a fatigue level module that detects a fatigue level.
- the apparatus also includes a dynamic recovery profile module that creates and updates a dynamic recovery profile based on an archive.
- the archive includes historical information about the fatigue level.
- the apparatus includes an interpreted recovery score module that creates and updates an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
- the interpreted recovery score is specific to a measuring period.
- the apparatus for providing an interpreted recovery score, in one embodiment, also includes an initial recovery profile module that creates an initial recovery profile.
- the initial recovery profile is based on a comparison of the user information to normative group information.
- the dynamic recovery profile module creates and updates the dynamic recovery profile further based on the initial recovery profile.
- the apparatus includes a recovery status module that provides a recovery status based on the interpreted recovery score.
- the recovery status, in one instance, is one of the following: fatigued, recovered, or optimal.
- the interpreted recovery score module performs a comparison of the interpreted recovery score to the fatigue level, and tracks the comparison over time.
- the apparatus, in one embodiment, includes a recovery recommendation module that provides an activity recommendation based on the interpreted recovery score.
- At least one of the fatigue level module, the dynamic recovery profile module, and the interpreted recovery score module is embodied in a wearable sensor.
- one or more of these modules may be embodied in a biometric sensor (e.g. heartrate sensor, motion sensor, etc.) mechanically coupled to a pair of earphones that can be worn in a user's ears.
- the earphones may further be configured to communicate with a computing device to provide and/or display an interpreted recovery score to a user.
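The interplay of the modules described above can be sketched in Python. This is a hedged illustration only: the class names, the archive structure, and the normalization formula are assumptions made for the example, not the disclosure's actual implementation.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Archive:
    """Historical fatigue-level measurements, one per measuring period."""
    fatigue_levels: list = field(default_factory=list)

class DynamicRecoveryProfileModule:
    """Learns the user's typical fatigue level from the archive."""
    def __init__(self, archive):
        self.archive = archive

    def baseline(self):
        return mean(self.archive.fatigue_levels)

class InterpretedRecoveryScoreModule:
    """Normalizes a detected fatigue level against the user's own profile."""
    def score(self, fatigue_level, profile):
        base = profile.baseline()
        # 100 = fully recovered relative to this user's norm; 0 = maximally fatigued.
        return max(0.0, min(100.0, 100.0 * (1.0 - fatigue_level / (2.0 * base))))

archive = Archive()
for level in (40.0, 50.0, 60.0):   # fatigue levels from past measuring periods
    archive.fatigue_levels.append(level)
profile = DynamicRecoveryProfileModule(archive)
print(InterpretedRecoveryScoreModule().score(50.0, profile))  # fatigue at baseline -> 50.0
```

The key point the sketch captures is that the same raw fatigue level yields different interpreted scores for different users, because each score is scaled by that user's own archived history.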
- One embodiment of the present disclosure involves a method for providing an interpreted recovery score.
- the method includes detecting a fatigue level.
- the method includes creating and updating a dynamic recovery profile based on an archive.
- the archive includes historical information about the fatigue level.
- the method also includes creating and updating an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
- the method for providing an interpreted recovery score includes creating an initial recovery profile based on a comparison of the user information to normative group information. In another embodiment, creating and updating the dynamic recovery profile is further based on the initial recovery profile.
- the dynamic recovery profile, in one embodiment, phases out the initial recovery profile as the amount of the historical information in the archive increases.
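One way such a phase-out could be realized is a simple linear blend between the normative initial profile and the user's own archived measurements. The linear weighting and the `full_history` threshold below are illustrative assumptions, not the disclosed mechanism.

```python
def blended_baseline(initial_baseline, archived_levels, full_history=30):
    """Blend a normative initial recovery profile with the user's own
    archived fatigue levels, phasing the initial profile out linearly as
    the archive grows toward `full_history` entries."""
    n = len(archived_levels)
    weight = min(1.0, n / full_history)   # weight given to personal history
    personal = sum(archived_levels) / n if n else 0.0
    return (1.0 - weight) * initial_baseline + weight * personal

print(blended_baseline(50.0, []))            # no history -> purely normative: 50.0
print(blended_baseline(50.0, [60.0] * 15))   # half history -> 55.0
print(blended_baseline(50.0, [60.0] * 30))   # full history -> purely personal: 60.0
```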
- the method includes providing a recovery status.
- the recovery status is based on the interpreted recovery score.
- the recovery status, in one instance, is one of the following: fatigued, recovered, or optimal.
- the method includes performing a comparison of the interpreted recovery score to the fatigue level, and the method includes tracking the comparison over time.
- the method includes providing an activity recommendation based on the interpreted recovery score.
- the method includes receiving an external interpreted recovery score and comparing the external interpreted recovery score to the interpreted recovery score.
- the method includes comparing the interpreted recovery score to a past interpreted recovery score.
- the interpreted recovery score is associated with a measuring period and the past interpreted recovery score is associated with a past measuring period.
- At least one of the operations of detecting the fatigue level, creating and updating the dynamic recovery profile, and creating and updating the interpreted recovery score includes using a sensor configured to be attached to the body of a user.
- these operations may be implemented by using a biometric sensor (e.g. heartrate sensor, motion sensor, etc.) mechanically coupled to a pair of earphones.
- the earphones are configured to be worn in a user's ears and may be further configured to communicate (e.g. transmit detected biometric data, receive audio signals, etc.) with another computing device.
- One embodiment of the disclosure includes a system for providing an interpreted recovery score.
- the system includes a processor and at least one computer program residing on the processor.
- the computer program is stored on a non-transitory computer readable medium having computer executable program code embodied thereon.
- the computer executable program code is configured to detect a fatigue level.
- the computer executable program code is configured to create and update a dynamic recovery profile based on an archive.
- the archive includes historical information about the fatigue level.
- the computer executable program code is further configured to create and update an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
- FIG. 1 illustrates a perspective view of an example communications environment in which embodiments of the disclosed technology may be implemented.
- FIG. 2A illustrates a perspective view of an example pair of biometric earphones that, in some embodiments, is the activity monitoring device used to implement the technology disclosed herein.
- FIG. 2B illustrates an example architecture for circuitry of the biometric earphones of FIG. 2A .
- FIG. 3A illustrates a perspective view of a particular embodiment of a biometric earphone, including an optical heartrate sensor that may be used to implement the technology disclosed herein.
- FIG. 3B illustrates a side perspective view of placement of the optical heartrate sensor of the earphones of FIG. 3A when they are worn by a user.
- FIG. 3C illustrates a frontal perspective view of placement of the optical heartrate sensor of the biometric earphone of FIG. 3A when it is worn by a user.
- FIG. 3D illustrates a cross-sectional view of an over-the-ear configuration of dual-fit earphones in accordance with the disclosed technology.
- FIG. 3E illustrates a cross-sectional view of an over-the-ear configuration of the dual-fit earphones of FIG. 3D .
- FIG. 3F illustrates a cross-sectional view of an under-the-ear configuration of the dual-fit earphones of FIG. 3D .
- FIG. 4 is a block diagram illustrating an example computing device that may be used to implement embodiments of the disclosed technology.
- FIG. 5 illustrates modules of an example activity monitoring application that may be used to implement embodiments of the disclosed technology.
- FIG. 6 is an operational flow diagram illustrating a method of prompting a user to adjust the placement of earphones in the user's ear to ensure accurate biometric data collection by the earphones' biometric sensors.
- FIG. 7 illustrates an example system for providing an interpreted recovery score.
- FIG. 8 illustrates an example apparatus for providing an interpreted recovery score.
- FIG. 9 illustrates another example apparatus for providing an interpreted recovery score.
- FIG. 10A is an operational flow diagram illustrating an example of a method for creating and updating an interpreted recovery score.
- FIG. 10B is an example of a metabolic loading table.
- FIG. 10C is an example of an activity intensity library.
- FIG. 10D is an example of an archive table.
- FIG. 11 is an operational flow diagram illustrating an example of a method for providing an interpreted recovery score including providing a recovery status.
- FIG. 12 is an operational flow diagram illustrating an example of a method for providing an interpreted recovery score including comparing the interpreted recovery score to an external interpreted recovery score.
- FIG. 13 illustrates an activity display that may be associated with an activity display module of the activity monitoring application of FIG. 5 .
- FIG. 14 illustrates a sleep display that may be associated with a sleep display module of the activity monitoring application of FIG. 5 .
- FIG. 15 illustrates an activity recommendation and fatigue level display that may be associated with an activity recommendation and fatigue level display module of the activity monitoring application of FIG. 5 .
- FIG. 16 illustrates a biological data and intensity recommendation display that may be associated with a biological data and intensity recommendation display module of the activity monitoring application of FIG. 5 .
- FIG. 17 illustrates an example computing module that may be used to implement various features of the systems and methods disclosed herein.
- the present disclosure is directed toward systems and methods for providing an interpreted recovery score.
- the disclosure is directed toward various embodiments of such systems and methods.
- the systems and methods are directed to a device that provides an interpreted recovery score.
- the device may be a pair of earphones including one or more biometric sensors (e.g. heartrate sensor, motion sensor, etc.) mechanically coupled thereto.
- the earphones may be configured with electronic components and circuitry for processing detected user biometric data and providing user biometric data to another computing device (e.g. smartphone, laptop, desktop, tablet, etc.).
- FIGS. 1-6 illustrate, by way of example, embodiments that utilize such biometric earphones.
- the systems and methods of the present disclosure may be implemented using various activity monitoring devices. Indeed, the figures are not intended to be exhaustive or to limit the disclosure to the precise form disclosed.
- FIG. 1 illustrates an example communications environment in accordance with an embodiment of the technology disclosed herein, the embodiment employing biometric earphones.
- earphones 100 communicate biometric and audio data with computing device 200 over a communication link 300 .
- the biometric data is measured by one or more sensors (e.g., heart rate sensor, accelerometer, gyroscope) of earphones 100 .
- computing device 200 may comprise any computing device (smartphone, tablet, laptop, smartwatch, desktop, etc.) configured to transmit audio data to earphones 100 , receive biometric data from earphones 100 (e.g., heartrate and motion data), and process the biometric data collected by earphones 100 .
- computing device 200 itself may collect additional biometric information that is provided for display. For example, if computing device 200 is a smartphone, it may use built-in accelerometers, gyroscopes, and a GPS to collect additional biometric information.
- Computing device 200 additionally includes a graphical user interface (GUI) to perform functions such as accepting user input and displaying processed biometric data to the user.
- the GUI may be provided by various operating systems known in the art, such as, for example, iOS, Android, Windows Mobile, Windows, Mac OS, Chrome OS, Linux, Unix, a gaming platform OS, etc.
- the biometric information displayed to the user can include, for example a summary of the user's activities, a summary of the user's fitness levels, activity recommendations for the day, the user's heart rate and heart rate variability (HRV), and other activity related information.
- User input that can be accepted on the GUI can include inputs for interacting with an activity tracking application further described below.
- the communication link 300 is a wireless communication link based on one or more wireless communication protocols such as BLUETOOTH, ZIGBEE, 802.11 protocols, Infrared (IR), Radio Frequency (RF), etc.
- the communications link 300 may be a wired link (e.g., using any one or a combination of an audio cable, a USB cable, etc.).
- FIG. 2A is a diagram illustrating a perspective view of exemplary earphones 100 .
- FIG. 2A will be described in conjunction with FIG. 2B , which is a diagram illustrating an example architecture for circuitry of earphones 100 .
- Earphones 100 comprise a left earphone 110 with tip 116 , a right earphone 120 with tip 126 , a controller 130 and a cable 140 .
- Cable 140 electrically couples the left earphone 110 to the right earphone 120 , and both earphones 110 - 120 to controller 130 .
- each earphone may optionally include a fin or ear cushion 117 that contacts folds in the outer ear anatomy to further secure the earphone to the wearer's ear.
- earphones 100 may be constructed with different dimensions, including different diameters, widths, and thicknesses, in order to accommodate different human ear sizes and different preferences.
- the housing of each earphone 110 , 120 is a rigid shell that surrounds the electronic components.
- the electronic components may include motion sensor 121 , optical heartrate sensor 122 , audio-electronic components such as drivers 113 , 123 and speakers 114 , 124 , and other circuitry (e.g., processors 160 , 165 , and memories 170 , 175 ).
- the rigid shell may be made with plastic, metal, rubber, or other materials known in the art.
- the housing may be cubic shaped, prism shaped, tubular shaped, cylindrical shaped, or otherwise shaped to house the electronic components.
- the tips 116 , 126 may be shaped to be rounded, parabolic, and/or semi-spherical, such that each tip comfortably and securely fits within a wearer's ear, with the distal end of the tip contacting an outer rim of the wearer's outer ear canal.
- the tip may be removable such that it may be exchanged with alternate tips of varying dimensions, colors, or designs to accommodate a wearer's preference and/or to more closely match the radial profile of the wearer's outer ear canal.
- the tip may be made with softer materials such as rubber, silicone, fabric, or other materials as would be appreciated by one of ordinary skill in the art.
- controller 130 may provide various controls (e.g., buttons and switches) related to audio playback, such as, for example, volume adjustment, track skipping, audio track pausing, and the like. Additionally, controller 130 may include various controls related to biometric data gathering, such as, for example, controls for enabling or disabling heart rate and motion detection. In a particular embodiment, controller 130 may be a three button controller.
- the circuitry of earphones 100 includes processors 160 and 165 , memories 170 and 175 , wireless transceiver 180 , circuitry for earphone 110 and earphone 120 , and a battery 190 .
- earphone 120 includes a motion sensor 121 (e.g., an accelerometer or gyroscope), an optical heartrate sensor 122 , and a right speaker 124 and corresponding driver 123 .
- Earphone 110 includes a left speaker 114 and corresponding driver 113 .
- earphone 110 may also include a motion sensor (e.g., an accelerometer or gyroscope), and/or an optical heartrate sensor.
- a biometric processor 165 comprises logical circuits dedicated to receiving, processing and storing biometric information collected by the biometric sensors of the earphones. More particularly, as illustrated in FIG. 2B , processor 165 is electrically coupled to motion sensor 121 and optical heartrate sensor 122 , and receives and processes electrical signals generated by these sensors. These processed electrical signals represent biometric information such as the earphone wearer's motion and heartrate. Processor 165 may store the processed signals as biometric data in memory 175 , which may be subsequently made available to a computing device using wireless transceiver 180 . In some embodiments, sufficient memory is provided to store biometric data for transmission to a computing device for further processing.
- optical heartrate sensor 122 uses a photoplethysmogram (PPG) to optically obtain the user's heart rate.
- optical heartrate sensor 122 includes a pulse oximeter that detects blood oxygenation level changes as changes in coloration at the surface of a user's skin. More particularly, in this embodiment, the optical heartrate sensor 122 illuminates the skin of the user's ear with a light-emitting diode (LED). The light penetrates through the epidermal layers of the skin to underlying blood vessels. A portion of the light is absorbed and a portion is reflected back.
- the optical sensor may be positioned on one of the earphones such that it is proximal to the interior side of a user's tragus when the earphones are worn.
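As a simplified sketch of how a heart rate could be recovered from such a reflected-light waveform, one can count pulse peaks over a known sampling window. The peak-counting approach, function name, and sampling parameters below are illustrative assumptions, not the sensor's actual algorithm.

```python
import math

def heart_rate_from_ppg(samples, fs_hz):
    """Estimate heart rate in beats per minute from a PPG waveform by
    counting local maxima that rise above the signal's mean level."""
    mean_level = sum(samples) / len(samples)
    peaks = 0
    for i in range(1, len(samples) - 1):
        if (samples[i] > mean_level
                and samples[i] > samples[i - 1]
                and samples[i] >= samples[i + 1]):
            peaks += 1
    duration_min = len(samples) / fs_hz / 60.0
    return peaks / duration_min

# Synthetic 1 Hz pulse sampled at 50 Hz for 10 seconds -> 60 bpm.
ppg = [math.sin(2.0 * math.pi * t / 50.0) for t in range(500)]
print(heart_rate_from_ppg(ppg, 50.0))
```

A production sensor would additionally filter out motion artifacts and ambient-light noise before peak detection; the sketch omits those steps for clarity.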
- optical heartrate sensor 122 may also be used to estimate the heart rate variability (HRV), i.e. the variation in time interval between consecutive heartbeats, of the user of earphones 100 .
- processor 165 may calculate the HRV using the data collected by sensor 122 based on time-domain methods, frequency-domain methods, and other methods known in the art that calculate HRV from data such as the mean heart rate, the change in pulse rate over a time interval, and other data used in the art to estimate HRV.
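For instance, one common time-domain HRV statistic that such a calculation might employ is the RMSSD, the root mean square of successive differences between beat-to-beat (RR) intervals. The sketch below is a generic illustration of that statistic, not the patent's specific method.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between consecutive
    RR (beat-to-beat) intervals, in milliseconds; higher values generally
    indicate greater heart rate variability."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# RR intervals from four consecutive heartbeats, in milliseconds.
print(round(rmssd([800.0, 810.0, 790.0, 805.0]), 2))
```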
- logic circuits of processor 165 may further detect, calculate, and store metrics such as the amount of physical activity, sleep, or rest over a period of time, or the amount of time without physical activity over a period of time.
- the logic circuits may use the HRV, the metrics, or some combination thereof to calculate a recovery score.
- the recovery score may indicate the user's physical condition and aptitude for further physical activity for the current day.
- the logic circuits may detect the amount of physical activity and the amount of sleep a user experienced over the last 48 hours, combine those metrics with the user's HRV, and calculate a recovery score.
- the calculated recovery score may be based on any scale or range, such as, for example, a range between 1 and 10, a range between 1 and 100, or a range between 0% and 100%.
- the logic circuits may use the HRV, the metrics, or some combination thereof to calculate an interpreted recovery score as described in more detail in connection with FIGS. 7-12 .
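A hedged sketch of how such metrics might be combined into a score on a 1-to-100 scale follows; the weights, saturation points, and normalization constants are illustrative assumptions, not the disclosed formula.

```python
def recovery_score(hrv_ms, baseline_hrv_ms, sleep_hours_48h, active_hours_48h):
    """Combine current HRV (relative to the user's baseline), sleep over the
    last 48 hours, and activity load over the last 48 hours into a score
    on a 1-100 scale."""
    hrv_component = min(hrv_ms / baseline_hrv_ms, 1.5) / 1.5   # 0..1, saturates at 1.5x baseline
    sleep_component = min(sleep_hours_48h / 16.0, 1.0)         # 0..1, 16 h assumed fully rested
    rest_component = 1.0 - min(active_hours_48h / 8.0, 1.0)    # 0..1, less recent load -> higher
    raw = 0.5 * hrv_component + 0.3 * sleep_component + 0.2 * rest_component
    return max(1, min(100, round(100.0 * raw)))

print(recovery_score(60.0, 60.0, 16.0, 0.0))   # HRV at baseline, well rested
print(recovery_score(90.0, 60.0, 16.0, 0.0))   # elevated HRV, well rested -> 100
```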
- earphones 100 wirelessly receive audio data using wireless transceiver 180 .
- the audio data is processed by logic circuits of audio processor 160 into electrical signals that are delivered to respective drivers 113 and 123 of left speaker 114 and right speaker 124 of earphones 110 and 120 .
- the electrical signals are then converted to sound using the drivers.
- Any driver technologies known in the art or later developed may be used. For example, moving coil drivers, electrostatic drivers, electret drivers, orthodynamic drivers, and other transducer technologies may be used to generate playback sound.
- the wireless transceiver 180 is configured to communicate biometric and audio data using available wireless communications standards.
- the wireless transceiver 180 may be a BLUETOOTH transmitter, a ZIGBEE transmitter, a Wi-Fi transmitter, a GPS transmitter, a cellular transmitter, or some combination thereof.
- although FIG. 2B illustrates a single wireless transceiver 180 for both transmitting biometric data and receiving audio data, in alternative embodiments a transmitter dedicated to transmitting only biometric data to a computing device may be used.
- the transmitter may be a low energy transmitter such as a near field communications (NFC) transmitter or a BLUETOOTH low energy (LE) transmitter.
- a separate wireless receiver may be provided for receiving high fidelity audio data from an audio source.
- alternatively, a wired interface (e.g., micro-USB) may be used.
- FIG. 2B also shows that the electrical components of earphones 100 are powered by a battery 190 coupled to power circuitry 191 .
- Any suitable battery or power supply technologies known in the art or later developed may be used.
- a lithium-ion battery, aluminum-ion battery, piezo or vibration energy harvesters, photovoltaic cells, or other like devices can be used.
- battery 190 may be enclosed in earphone 110 or earphone 120 .
- battery 102 may be enclosed in controller 130 .
- the circuitry may be configured to enter a low-power or inactive mode when earphones 100 are not in use.
- mechanisms such as, for example, an on/off switch, a BLUETOOTH transmission disabling button, or the like may be provided on controller 130 such that a user may manually control the on/off state of power-consuming components of earphones 100 .
- processors 160 and 165 , memories 170 and 175 , wireless transceiver 180 , and battery 190 may be enclosed in and distributed throughout any one or more of earphone 110 , earphone 120 , and controller 130 .
- processor 165 and memory 175 may be enclosed in earphone 120 along with optical heartrate sensor 122 and motion sensor 121 .
- these four components are electrically coupled to the same printed circuit board (PCB) enclosed in earphone 120 .
- audio processor 160 and biometric processor 165 are illustrated in this exemplary embodiment as separate processors, in an alternative embodiment the functions of the two processors may be integrated into a single processor.
- FIG. 3A illustrates a perspective view of one embodiment of an earphone 120 , including an optical heartrate sensor 122 , in accordance with the technology disclosed herein.
- FIG. 3A will be described in conjunction with FIGS. 3B-3C , which are perspective views illustrating placement of heartrate sensor 122 when earphone 120 is worn in a user's ear 350 .
- earphone 120 includes a body 125 , tip 126 , ear cushion 127 , and an optical heartrate sensor 122 .
- Optical heartrate sensor 122 protrudes from a frontal side of body 125 , proximal to tip 126 and where the earphone's nozzle (not shown) is present.
- FIGS. 3B-3C illustrate the optical sensor and ear interface 340 when earphone 120 is worn in a user's ear 350 .
- optical heartrate sensor 122 is proximal to the interior side of a user's tragus 360 .
- optical heartrate sensor 122 illuminates the skin of the interior side of the ear's tragus 360 with a light-emitting diode (LED).
- the light penetrates through the epidermal layers of the skin to underlying blood vessels. A portion of the light is absorbed and a portion is reflected back. The light reflected back through the skin is then obtained with a receiver (e.g., a photodiode) of optical heartrate sensor 122 and used to determine changes in the user's blood flow, thereby permitting measurement of the user's heart rate and HRV.
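The reflected-light processing described above can be sketched as follows; the simple local-maximum peak detection, the sampling parameters, and the function names are illustrative assumptions, not the disclosed implementation:

```python
import math

def detect_beats(samples, fs_hz, threshold):
    """Return beat times (ms) at local maxima of the reflected-light signal."""
    beats = []
    for i in range(1, len(samples) - 1):
        if samples[i] > threshold and samples[i - 1] < samples[i] >= samples[i + 1]:
            beats.append(i * 1000.0 / fs_hz)
    return beats

def heart_rate_bpm(ibis_ms):
    """Mean heart rate implied by a series of inter-beat intervals (ms)."""
    return 60000.0 / (sum(ibis_ms) / len(ibis_ms))

def rmssd(ibis_ms):
    """Root mean square of successive differences, a common HRV measure."""
    diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

Successive differences between `detect_beats` timestamps yield the inter-beat intervals from which both heart rate and HRV are derived.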
- earphones 100 may be dual-fit earphones shaped to comfortably and securely be worn in either an over-the-ear configuration or an under-the-ear configuration.
- the secure fit provided by such embodiments keeps the optical heartrate sensor 122 in place on the interior side of the ear's tragus 360 , thereby ensuring accurate and consistent measurements of a user's heartrate.
- FIGS. 3D and 3E are cross-sectional views illustrating one such embodiment of dual-fit earphones 600 being worn in an over-the-ear configuration.
- FIG. 3F illustrates dual-fit earphones 600 in an under-the-ear configuration.
- earphone 600 includes housing 610 , tip 620 , strain relief 630 , and cord or cable 640 .
- the proximal end of tip 620 mechanically couples to the distal end of housing 610 .
- the distal end of strain relief 630 mechanically couples to a side (e.g., the top side) of housing 610 .
- the distal end of cord 640 is disposed within and secured by the proximal end of strain relief 630 .
- the longitudinal axis of the housing, Hx, forms angle θ1 with respect to the longitudinal axis of the tip, Tx.
- the longitudinal axis of the strain relief, Sy, aligns with the proximal end of strain relief 630 and forms angle θ2 with respect to the axis Hx.
- θ1 is greater than 0 degrees (e.g., Tx extends at a non-straight angle from Hx; in other words, the tip 620 is angled with respect to the housing 610 ).
- θ1 is selected to approximate the ear canal angle of the wearer. For example, θ1 may range between 5 degrees and 15 degrees.
- θ2 is less than 90 degrees (e.g., Sy extends at a non-orthogonal angle from Hx; in other words, the strain relief 630 is angled with respect to a perpendicular orientation with housing 610 ).
- θ2 may be selected to direct the distal end of cord 640 closer to the wearer's ear.
- θ2 may range between 75 degrees and 85 degrees
- x1 represents the distance between the distal end of tip 620 and the intersection of strain relief longitudinal axis Sy and housing longitudinal axis Hx.
- the dimension x1 may be selected based on several parameters, including the desired fit to a wearer's ear based on the average human ear anatomical dimensions, the types and dimensions of electronic components (e.g., optical sensor, motion sensor, processor, memory, etc.) that must be disposed within the housing and the tip, and the specific placement of the optical sensor.
- x1 may be at least 18 mm. However, in other examples, x1 may be smaller or greater based on the parameters discussed above.
- x2 represents the distance between the proximal end of strain relief 630 and the surface of the wearer's ear.
- θ2 may be selected to reduce x2, as well as to direct the cord 640 towards the wearer's ear, such that cord 640 may rest in the crevice formed where the top of the wearer's ear meets the side of the wearer's head.
- θ2 may range between 75 degrees and 85 degrees.
- strain relief 630 may be made of a flexible material such as rubber, silicone, or soft plastic such that it may be further bent towards the wearer's ear.
- strain relief 630 may comprise a shape memory material such that it may be bent inward and retain the shape.
- strain relief 630 may be shaped to curve inward towards the wearer's ear.
- the proximal end of tip 620 may flexibly couple to the distal end of housing 610 , enabling a wearer to adjust θ1 to most closely accommodate the fit of tip 620 into the wearer's ear canal (e.g., by closely matching the ear canal angle).
- earphones 100 in various embodiments may gather biometric user data that may be used to track a user's activities and activity level. That data may then be made available to another computing device (e.g. smartphone, tablet), which may provide a GUI for interacting with the data using a software activity tracking application installed on the computing device.
- FIG. 4 is a block diagram illustrating example components of one such computing device 200 including an installed activity tracking application 210 .
- computing device 200 comprises a connectivity interface 201 , storage 202 with activity tracking application 210 , processor 204 , a graphical user interface (GUI) 205 including display 206 , and a bus 207 for transferring data between the various components of computing device 200 .
- Connectivity interface 201 connects computing device 200 to earphones 100 through a communication medium.
- the medium may comprise a wireless network system such as a BLUETOOTH system, a ZIGBEE system, an Infrared (IR) system, a Radio Frequency (RF) system, a cellular network, a satellite network, a wireless local area network, or the like.
- the medium may additionally comprise a wired component such as a USB system.
- Storage 202 may comprise volatile memory (e.g. RAM), non-volatile memory (e.g. flash storage), or some combination thereof.
- storage 202 may store biometric data collected by earphones 100 .
- storage 202 stores an activity tracking application 210 that, when executed by processor 204 , allows a user to interact with the collected biometric information.
- a user may interact with activity tracking application 210 via a GUI 205 including a display 206 , such as, for example, a touchscreen display that accepts various hand gestures as inputs.
- activity tracking application 210 may process the biometric information collected by earphones 100 and present it via display 206 of GUI 205 .
- earphones 100 may filter and/or process the collected biometric information prior to transmitting the biometric information to computing device 200 . Accordingly, although the embodiments disclosed herein are described with reference to activity tracking application 210 processing the received biometric information, in various implementations various processing and/or preprocessing operations may be performed by a processor 160 , 165 of earphones 100 .
- activity tracking application 210 may be initially configured/setup (e.g., after installation on a smartphone) based on a user's self-reported biological information, sleep information, and activity preference information. For example, during setup a user may be prompted via display 206 for biological information such as the user's gender, height, age, and weight. Further, during setup the user may be prompted for sleep information such as the amount of sleep needed by the user and the user's regular bed time.
- this self-reported information may be used in tandem with the information collected by earphones 100 to display activity monitoring information using various modules.
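The self-reported setup information above might be modeled as a simple profile record; the field names below are illustrative, not taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """Self-reported data gathered when activity tracking is first configured."""
    gender: str
    height_cm: float
    age_years: int
    weight_kg: float
    sleep_needed_hours: float
    regular_bed_time: str  # e.g. "22:30"
```

Such a record would be stored once at setup and then combined with the sensor data collected by the earphones.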
- activity tracking application 210 may be used by a user to monitor and define how active the user wants to be on a day-to-day basis based on the biometric information (e.g., accelerometer information, optical heart rate sensor information, etc.) collected by earphones 100 .
- activity tracking application 210 may comprise various display modules, including an activity display module 211 , a sleep display module 212 , an activity recommendation and fatigue level display module 213 , and a biological data and intensity recommendation display module 214 .
- activity tracking application 210 may comprise various processing modules 215 for processing the activity monitoring information (e.g., optical heartrate information, accelerometer information, gyroscope information, etc.) collected by the earphones or the biological information entered by the users. These modules may be implemented separately or in combination. For example, in some embodiments activity processing modules 215 may be directly integrated with one or more of display modules 211 - 214 .
- each of display modules 211 - 214 may be associated with a unique display provided by activity tracking app 210 via display 206 . That is, activity display module 211 may have an associated activity display, sleep display module 212 may have an associated sleep display, activity recommendation and fatigue level display module 213 may have an associated activity recommendation and fatigue level display, and biological data and intensity recommendation display module 214 may have an associated biological data and intensity recommendation display.
- application 210 may be used to display to the user an instruction for wearing and/or adjusting earphones 100 if it is determined that optical heartrate sensor 122 and/or motion sensor 121 are not accurately gathering motion data and heart rate data.
- FIG. 6 is an operational flow diagram illustrating one such method 400 of an earphone adjustment feedback loop with a user that ensures accurate biometric data collection by earphones 100 .
- execution of application 210 may cause display 206 to display an instruction to the user on how to wear earphones 100 to obtain an accurate and reliable signal from the biometric sensors.
- operation 410 may occur once after installing application 210 , once a day (e.g., when user first wears the earphones 100 for the day), or at any custom and/or predetermined interval.
- feedback is provided to the user regarding the quality of the signal received from the biometric sensors based on the particular position that earphones 100 are being worn.
- display 206 may display a signal quality bar or other graphical element.
- application 210 may cause display 206 to display to the user advice on how to adjust the earphones to improve the signal, and operations 420 and decision 430 may subsequently be repeated. For example, advice on adjusting the strain relief of the earphones may be displayed. Otherwise, if the signal quality is satisfactory, at operation 450 , application may cause display 206 to display to the user confirmation of good signal quality and/or good earphone position. Subsequently, application 210 may proceed with normal operation (e.g., display modules 211 - 214 ).
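The adjustment feedback loop of method 400 might be sketched as below, with hypothetical stand-ins for the sensor and display interfaces and an assumed quality threshold:

```python
def earphone_fit_feedback(read_signal_quality, show, threshold=0.8, max_attempts=5):
    """Display wearing instructions, check biometric signal quality, and advise
    adjustment until the signal is satisfactory (or attempts run out).

    `read_signal_quality` returns a quality score in [0, 1] and `show` displays
    a message; both are stand-ins for the device's sensor and display.
    """
    show("Position the earphone so the optical sensor rests against the tragus.")
    for _ in range(max_attempts):
        if read_signal_quality() >= threshold:
            show("Good signal quality - earphone position confirmed.")
            return True
        show("Weak signal - try adjusting the strain relief and refitting.")
    return False
```

A signal-quality bar, as mentioned above, could display the same quality score graphically on each pass through the loop.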
- FIGS. 13-16 illustrate a particular exemplary implementation of a GUI for app 210 comprising displays associated with each of display modules 211 - 214 .
- FIG. 7 is a schematic block diagram illustrating an example of a system 700 for providing an interpreted recovery score.
- System 700 includes apparatus for providing interpreted recovery score 702 , communication medium 704 , server 706 , and computing device 708 .
- apparatus for providing an interpreted recovery score 702 may in some embodiments be earphones 100 of FIGS. 2-3E
- computing device 708 may in some embodiments be the same as computing device 200 in FIG. 4
- embodiments of the present disclosure may also take other forms, as has been noted.
- FIGS. 7-9 are described in terms of apparatus 702 and computing device 708 to convey that the systems and methods disclosed herein may also be implemented using other various devices without departing from the scope of the technology disclosed herein.
- Communication medium 704 may be implemented in a variety of forms.
- communication medium 704 may be an Internet connection, such as a local area network (“LAN”), a wide area network (“WAN”), a fiber optic network, internet over power lines, a hard-wired connection (e.g., a bus), and the like, or any other kind of network connection.
- Communication medium 704 may be implemented using any combination of routers, cables, modems, switches, fiber optics, wires, radio, and the like.
- Communication medium 704 may be implemented using various wireless standards, such as Bluetooth, Wi-Fi, 4G LTE, etc.
- One of skill in the art will recognize other ways to implement communication medium 704 for communications purposes.
- Server 706 directs communications made over communication medium 704 .
- Server 706 may be, for example, an Internet server, a router, a desktop or laptop computer, a smartphone, a tablet, a processor, a module, or the like.
- server 706 directs communications between communication medium 704 and computing device 708 .
- server 706 may update information stored on computing device 708 , or server 706 may send information to computing device 708 in real time.
- Computing device 708 may take a variety of forms, such as a desktop or laptop computer, a smartphone, a tablet, a processor, a module, or the like.
- computing device 708 may be a processor or module embedded in a wearable sensor, a pair of earphones, a bracelet, a smart-watch, a piece of clothing, an accessory, and so on.
- computing device 708 may be substantially similar to devices embedded in or otherwise coupled to earphones 100 .
- Computing device 708 may communicate with other devices over communication medium 704 with or without the use of server 706 .
- computing device 708 includes apparatus 702 .
- apparatus 702 is used to perform various processes described herein.
- FIG. 8 is a schematic block diagram illustrating one embodiment of an apparatus for providing an interpreted recovery score 800 .
- Apparatus 800 includes apparatus 702 with fatigue level module 804 , dynamic recovery profile module 806 , and interpreted recovery score module 808 .
- a movement monitoring module (not shown) monitors a movement to create a metabolic activity score based on the movement and user information.
- the movement monitoring module will be described below in further detail with regard to various processes.
- Fatigue level module 804 detects a fatigue level. Fatigue level module 804 will be described below in further detail with regard to various processes.
- Dynamic recovery profile module 806 creates and updates a dynamic recovery profile based on an archive.
- the archive includes historical information about the fatigue level.
- the archive includes historical information about the movement and the metabolic activity score. Dynamic recovery profile module 806 will be described below in further detail with regard to various processes.
- Interpreted recovery score module 808 creates and updates an interpreted recovery score based on the fatigue level and the dynamic recovery profile. Interpreted recovery score module 808 will be described below in further detail with regard to various processes.
- FIG. 9 is a schematic block diagram illustrating one embodiment of apparatus for providing an interpreted recovery score 900 .
- Apparatus 900 includes apparatus for providing an interpreted recovery score 702 with fatigue level module 804 , dynamic recovery profile module 806 , and interpreted recovery score module 808 .
- Apparatus 900 also includes initial recovery profile module 902 , recovery status module 904 , and recovery recommendation module 906 .
- Initial recovery profile module 902 , recovery status module 904 , and recovery recommendation module 906 will be described below in further detail with regard to various processes.
- apparatus 900 also includes the movement monitoring module (not shown) described above with respect to FIG. 8 .
- At least one of fatigue level module 804 , dynamic recovery profile module 806 , interpreted recovery score module 808 , initial recovery profile module 902 , recovery status module 904 , and recovery recommendation module 906 is embodied in a wearable sensor, such as biometric earphones 100 .
- any of the modules described herein may be embodied in biometric earphones 100 and connect to other modules described herein via communication medium 704 .
- one or more of the modules are embodied in various other forms of hardware, such as the hardware of computing device 708 or computing device 200 .
- FIG. 10A is an operational flow diagram illustrating example method 1000 for providing an interpreted recovery score in accordance with an embodiment of the present disclosure.
- the operations of method 1000 create and update an interpreted recovery score based on a user's personalized fatigue levels, as recorded over time.
- the fatigue level is based on a measured heart rate variability for the user and is a function of recovery.
- the operations of method 1000 take into account not only the user's current fatigue level, but also the relationship between current and past fatigue levels to create an interpreted recovery score that accurately reflects the user's physical condition and performance capabilities. This aids in providing a personalized metric by which the user can attain peak performance.
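One hedged sketch of relating current and past fatigue levels (the centring constant, the 0-100 fatigue scale, and the weighting are illustrative assumptions, not taken from the disclosure):

```python
def interpreted_recovery_score(current_fatigue, fatigue_history, weight=0.6):
    """Blend today's fatigue level (0-100, higher = more fatigued) with the
    user's historical baseline; the interpreted score is higher when the
    user is fresher than usual relative to their own history.
    """
    baseline = sum(fatigue_history) / len(fatigue_history)
    relative = baseline - current_fatigue  # positive when fresher than usual
    return max(0, min(100, 50 + weight * relative))
```

Because the score is anchored to the user's own archive rather than population norms, the same measured fatigue can yield different interpreted scores for different users.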
- apparatus 702 , biometric earphones 100 , and/or computing device 200 , 708 perform various operations of method 1000 .
- movement is monitored to create a metabolic activity score based on the movement and user information.
- the metabolic activity score, in one embodiment, is created from a set of metabolic loadings.
- the metabolic loadings may be determined by identifying a user activity type from a set of reference activity types and by identifying a user activity intensity from a set of reference activity intensities.
- the metabolic loadings may be determined based on information provided by a user (user information).
- User information may include, for example, an individual's height, weight, age, gender, and geographic and environmental conditions.
- the user may provide the user information via, for example, a user interface of computing device 708 or a controller 130 of biometric earphones 100 .
- User information may be determined based on various measurements—for example, measurements of the user's body-fat content or body type.
- the user information may be determined, for example, by an altimeter or GPS, which may be used to determine the user's elevation, weather conditions in the user's environment, etc.
- apparatus 702 obtains user information from the user indirectly.
- apparatus 702 may collect the user information from a social media account, from a digital profile, or the like.
- computing device 708 obtains user information from the user indirectly.
- computing device 708 may collect the user information from a social media account, from a digital profile, or the like.
- the user information includes a user lifestyle selected from a set of reference lifestyles.
- apparatus 702 may prompt the user for information about the user's lifestyle (e.g., via a user interface or controller).
- Apparatus 702 may prompt the user to determine how active the user's lifestyle is.
- the user may be prompted to select a user lifestyle from a set of reference lifestyles.
- the reference lifestyles may include a range of lifestyles, for example, ranging from inactive, on one end, to highly active on the other end. In such a case, the reference lifestyles that the user selects from may include sedentary, mildly active, moderately active, and heavily active.
- the user lifestyle is determined from the user as an initial matter. For example, upon initiation, apparatus 702 may prompt the user to provide a user lifestyle. In a further embodiment, the user is prompted periodically to select a user lifestyle. In this fashion, the user lifestyle selected may be aligned with the user's actual activity level as the user's activity level varies over time. In another embodiment, the user lifestyle is updated without intervention from the user.
- the metabolic loadings are numerical values and may represent a rate of calories burned per unit weight per unit time (e.g., having units of kcal per kilogram per hour).
- the metabolic loadings may be represented in units of oxygen uptake (e.g., in milliliters per kilogram per minute).
- the metabolic loadings may also represent a ratio of the metabolic rate during activity (e.g., the metabolic rate associated with a particular activity type and/or an activity intensity) to the metabolic rate during rest.
- the metabolic loadings may, for example be represented in a metabolic table, such as metabolic table 1050 , illustrated in FIG. 10B .
- the metabolic loadings are specific to the user information. For example, a metabolic loading may increase for a heavier user, or for an increased elevation, but may decrease for a lighter user or for a decreased elevation.
- the set of metabolic loadings are determined based on the user lifestyle, in addition to the other user information. For example, the metabolic loadings for a user with a heavily active lifestyle may differ from the metabolic loadings for a user with a sedentary lifestyle. In this fashion, there may be a greater coupling between the metabolic loadings and the user's characteristics.
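A worked example of the kcal/kg/hour units described above; the user-specific multiplier is an illustrative assumption standing in for adjustments such as elevation or lifestyle:

```python
def calories_burned(metabolic_loading_kcal_kg_h, weight_kg, duration_h,
                    user_factor=1.0):
    """Total energy for an activity: loading (kcal/kg/hour) x weight x time.

    `user_factor` is an illustrative user-specific multiplier (e.g. slightly
    above 1.0 at higher elevation or for a heavier loading profile).
    """
    return metabolic_loading_kcal_kg_h * weight_kg * duration_h * user_factor
```

For instance, a 70 kg user sustaining a 5.0 kcal/kg/hour activity for two hours expends 700 kcal.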
- in some embodiments, the metabolic loadings are determined by a device (e.g., computing device 708 ) or a module (e.g., biometric earphones 100 or a module therein).
- the metabolic loadings may be maintained or provided by server 706 or over communication medium 704 .
- a system administrator provides the metabolic loadings based on a survey, publicly available data, scientifically determined data, compiled user data, or any other source of data.
- a movement monitoring module performs the above-described operations.
- the movement monitoring module includes a metabolic loading module and a metabolic table module that determine the metabolic loading associated with the movement.
- a metabolic table is maintained based on the user information.
- the metabolic loadings in the metabolic table may be based on the user information.
- the metabolic table is maintained based on a set of standard user information, in place of or in addition to user information from the user.
- the standard user information may include, for example, the average fitness characteristics of all individuals being the same age as the user, the same height as the user, etc.
- maintaining the metabolic table is delayed until the user information is obtained.
- Metabolic table 1050 may be stored in computing device 708 (e.g. computing device 200 ) or apparatus 702 (e.g. biometric earphones 100 ), and may include information such as reference activity types (RATs) 1054 , reference activity intensities (RAIs) 1052 , and/or metabolic loadings (MLs) 1060 .
- RATs 1054 are arranged as rows 1058 in metabolic table 1050 .
- Each of a set of rows 1058 corresponds to different RATs 1054 , and each row 1058 is designated by a row index number.
- the first RAT row 1058 may be indexed as RAT_0, the second as RAT_1, and so on for as many rows as metabolic table 1050 may include.
- the reference activity types may include typical activities, such as running, walking, sleeping, swimming, bicycling, skiing, surfing, resting, working, and so on.
- the reference activity types may also include a catch-all category, for example, general exercise.
- the reference activity types may also include atypical activities, such as skydiving, SCUBA diving, and gymnastics.
- a user defines a user-defined activity by programming computing device 708 (e.g., by an interface on computing device 708 , such as GUI 205 in the example of computing device 200 ) with information about the user-defined activity, such as pattern of movement, frequency of pattern, and intensity of movement.
- the typical reference activities may be provided, for example, by metabolic table 1050 .
- reference activity intensities 1052 are arranged as columns in metabolic table 1050 , and metabolic table 1050 includes columns 1056 , each corresponding to different RAIs 1052 .
- Each column 1056 is designated by a different column index number.
- the first RAI column 1056 is indexed as RAI_0
- the reference activity intensities include, in one embodiment, a numeric scale.
- the reference activity intensities may include numbers ranging from one to ten (representing increasing activity intensity).
- the reference activity intensities may also be represented as a range of letters, colors, and the like.
- the reference activity intensities may be associated with the vigorousness of an activity.
- the reference activity intensities may be represented by ranges of heart rates or breathing rates.
- metabolic table 1050 includes metabolic loadings 1060 .
- Each metabolic loading 1060 corresponds to a reference activity type 1058 of the reference activity types 1054 and a reference activity intensity 1056 of the reference activity intensities 1052 .
- Each metabolic loading 1060 corresponds to a unique combination of reference activity type 1054 and reference activity intensity 1052 .
- one of the reference activity types 1054 of a series of rows 1058 of reference activity types, and one of the reference activity intensities 1052 of a series of columns 1056 of reference activity intensities correspond to a particular metabolic loading 1060 .
- each metabolic loading 1060 may be identifiable by only one combination of reference activity type 1058 and reference activity intensity 1056 .
- each metabolic loading 1060 is designated using a two-dimensional index, with the first index dimension corresponding to the row 1058 number and the second index dimension corresponding to the column 1056 number of the metabolic loading 1060 .
- ML_2,3 has a first dimension index of 2 and a second dimension index of 3.
- ML_2,3 corresponds to the row 1058 for RAT_2 and the column 1056 for RAI_3.
- Any combination of RAT_M and RAI_N may identify a corresponding ML_M,N in metabolic table 1050 , where M is any number corresponding to a row 1058 number in metabolic table 1050 and N is any number corresponding to a column 1056 number in metabolic table 1050 .
- the reference activity type RAT_3 may be "surfing," and the reference activity intensity RAI_3 may be "4."
- This combination in metabolic table 1050 corresponds to metabolic loading 1060 ML_3,3, which may, for example, represent 5.0 kcal/kg/hour (a typical value for surfing).
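The table lookup can be sketched as a mapping keyed by (reference activity type, reference activity intensity); the entries below are hypothetical values for illustration, apart from the surfing figure given above:

```python
# Hypothetical excerpt of metabolic table 1050: (RAT, RAI) -> kcal/kg/hour.
METABOLIC_TABLE = {
    ("walking", 2): 3.0,
    ("running", 5): 9.8,
    ("surfing", 4): 5.0,
    ("sleeping", 1): 0.95,
}

def metabolic_loading(activity_type, activity_intensity):
    """Look up the metabolic loading for a (reference activity type,
    reference activity intensity) combination; each pairing is unique."""
    key = (activity_type, activity_intensity)
    if key not in METABOLIC_TABLE:
        raise KeyError(f"no metabolic loading for {key}")
    return METABOLIC_TABLE[key]
```

Using a dictionary keyed by the (RAT, RAI) pair mirrors the two-dimensional ML_M,N indexing of the table: each loading is reachable by exactly one combination.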
- some of the above-described operations are performed by the movement monitoring module and some of the operations are performed by the metabolic table module.
- the movement is monitored by location tracking (e.g., Global Positioning Satellites (GPS), or a location-tracking device connected to a network via communication medium 704 ).
- the general location of the user, as well as specific movements of the user's body, are monitored.
- the movement of the user's leg in x, y, and z directions may be monitored (e.g., by an accelerometer or gyroscope).
- apparatus 702 receives an instruction regarding which body part is being monitored.
- apparatus 702 may receive an instruction that the movement of a user's wrist, ankle, head, or torso is being monitored.
- the movement of the user is monitored and a pattern of the movement (pattern) is determined.
- the pattern may be detected by an accelerometer or gyroscope.
- the pattern may be a repetition of a motion, or a similar motion, monitored by the method 1000 ; for example, the pattern may be a geometric shape (e.g., a circle, line, or oval) traced by repeated movement.
- in some cases, the repetition of a motion in a geometric shape is not perfectly consistent over time but is maintained for a substantial proportion of the repetitions. For instance, one occurrence of elliptical motion within a repetitive occurrence (or pattern) of ten circular motions may be monitored and still determined to be a pattern of circular motion.
- the geometric shape of the pattern of movement is a three dimensional (3D) shape.
- the pattern associated with the head of a user rowing a canoe, or a wrist of a person swimming the butterfly stroke may be monitored and analyzed into a geometric shape in three dimensions.
- the pattern may be complicated, but it may be described in a form that can be recognized by method 1000 .
- Such a form may include computer code that describes the spatial relationship of a set of points, along with changes in acceleration forces that are experienced along those points as, for example, a sensor travels throughout the pattern.
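One illustrative way to recognize a geometric pattern from sampled sensor points (not the disclosed method) is to score how circular a trace is via the spread of distances from its centroid:

```python
import math

def circularity(points):
    """Score how circular one repetition of a movement trace is.

    `points` is a list of (x, y) samples along the pattern. Distances from
    the centroid are near-constant for a circle, so std/mean of those
    distances is near 0 for circular motion and larger for elongated
    shapes such as lines or narrow ellipses.
    """
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    variance = sum((r - mean_r) ** 2 for r in radii) / len(radii)
    return math.sqrt(variance) / mean_r
```

A threshold on this score (tuned per activity) could classify a repeated trace as "circular" versus "linear"; a 3D variant would use distances in x, y, and z.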
- monitoring the pattern includes monitoring the frequency with which the pattern is repeated (or pattern frequency).
- the pattern frequency may be derived from a repetition period of the pattern (or pattern repetition period).
- the pattern repetition period may be the length of time elapsing from when a device or sensor passes through a certain point in a pattern and when the device or sensor returns to that point when the pattern is repeated.
- the sensor may be at point x, y, z at time t_0.
- the device may then move along the trajectory of the pattern, eventually returning to point x, y, z at time t_1.
- the pattern repetition period would be the difference between t_1 and t_0 (e.g., measured in seconds).
- the pattern frequency may be the reciprocal of the pattern repetition period, and may have units of cycles per second. When the pattern repetition period is, for example, two seconds, the pattern frequency would be 0.5 cycles per second.
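The period-to-frequency relationship above, as a direct computation:

```python
def pattern_frequency_hz(t0_s, t1_s):
    """Pattern frequency as the reciprocal of the repetition period: the time
    between passing the same point on consecutive repetitions of the pattern."""
    period_s = t1_s - t0_s
    if period_s <= 0:
        raise ValueError("t1 must be later than t0")
    return 1.0 / period_s
```

A two-second repetition period thus yields 0.5 cycles per second, matching the example above.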
- monitoring the movement may include monitoring the velocity at which the user is moving (or the user velocity).
- the user velocity may, for example, have units of kilometers per hour.
- the user's location information is monitored to determine user velocity. This may be done by GPS, through communication medium 704 , and so on.
- the user velocity may be distinguished from the speed of the pattern (or pattern speed). For example, the user may be running at a user velocity of 10 km/hour, but the pattern speed of the user's wrist may be 20 km/hour at a given point (e.g., as the wrist moves from behind the user to in front of the user).
- the pattern speed may be monitored using, for example, an accelerometer or gyroscope.
- the user velocity may also be distinguished from the pattern speed of a user's head, albeit a subtle distinction, as the user's head rocks slightly forward and backward when running.
- the user's altitude is monitored. This may be done, for example, using an altimeter, user location information, information entered by the user, etc.
- the impact the user has with an object (e.g., the impact of the user's feet with the ground) may be monitored.
- the ambient temperature is measured (e.g., by apparatus 702 ).
- Apparatus 702 may associate a group of reference activity types with bands of ambient temperature. For example, when the ambient temperature is zero degrees Celsius, activities such as skiing, sledding, and ice climbing are appropriate selections for reference activity types, whereas surfing, swimming, and beach volleyball may be inappropriate.
- the ambient humidity may also be measured (e.g., by a hygrometer).
- the pattern duration (i.e., the length of time for which a particular movement pattern is sustained) is measured.
- monitoring the movement is accomplished using sensors configured to be attached to a user's body (e.g. earphones 100 ).
- sensors may include a gyroscope or accelerometer to detect movement, and a heart-rate sensor, each of which may be embedded in a pair of earphones that a user can wear on the user's head, such as biometric earphones 100 .
- various modules and sensors that may be used to perform the above-described operations may be embedded in biometric earphones 100.
- any one or more of the described sensors may be embedded in computing device 200 .
- various modules and sensors that may be used to perform the above-described operations may be embedded in computing device 200 .
- the data from sensors and/or modules embedded in earphones 100 and the data from sensors and/or modules embedded in computing device 200 are used in combination to implement the above-described operations and computations. In various embodiments, the above-described operations are performed by the movement monitoring module.
- Method 1000 involves determining the user activity type from the set of reference activity types. Once detected, the pattern may be used to determine the user activity type from a set of reference activity types. Each reference activity type is associated with a reference activity type pattern. The user activity type may be determined to be the reference activity type that has a reference activity type pattern that matches the pattern measured by method 1000 .
- the pattern that matches the reference activity type pattern will not be an exact match, but will be substantially similar. In other cases, the patterns will not even be substantially similar, but it may be determined that the patterns match because they are the most similar of any patterns available.
- the reference activity type may be determined such that the difference between the reference activity type pattern and the measured pattern of movement is less than a predetermined range or ratio.
- the pattern is looked up (for a match) in a reference activity type library.
- the reference activity type library may be included in the metabolic table.
- the reference type library may include rows in a table such as the RAT rows 1058 .
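The best-match lookup of a measured pattern against a reference activity type library might be sketched as below. The library contents, point counts, and distance metric are assumptions for illustration only; the disclosure requires only that the most similar reference pattern be selected.

```python
import math

# Hypothetical reference activity type (RAT) library: each entry maps an
# activity name to a simplified reference pattern, here a short sequence
# of 3D points normalized to one repetition of the movement.
RAT_LIBRARY = {
    "running": [(0.0, 0.0, 0.0), (0.1, 0.3, 0.0), (0.0, 0.0, 0.1)],
    "walking": [(0.0, 0.0, 0.0), (0.05, 0.1, 0.0), (0.0, 0.0, 0.05)],
}

def pattern_distance(a, b):
    """Mean point-to-point Euclidean distance between two patterns."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def match_activity_type(measured):
    """Return the reference activity type whose pattern is most similar
    to the measured pattern (a best match, not necessarily exact)."""
    return min(RAT_LIBRARY,
               key=lambda name: pattern_distance(measured, RAT_LIBRARY[name]))

measured = [(0.0, 0.0, 0.0), (0.09, 0.28, 0.0), (0.0, 0.0, 0.1)]
print(match_activity_type(measured))  # "running"
```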
- method 1000 involves using the pattern frequency to determine the user activity type from the set of reference activity types.
- reference activity types may be associated with similar patterns (e.g., because the head moves in a similar pattern when running versus walking).
- the pattern frequency is used to determine the activity type (e.g., because the pattern frequency for running is higher than the pattern frequency for walking).
- Method 1000 involves using additional information to determine the activity type of the user.
- the pattern for walking may be similar to the pattern for running.
- the reference activity of running may be associated with higher user velocities and the reference activity of walking with lower user velocities. In this way, the velocity measured may be used to distinguish two reference activity types having similar patterns.
- method 1000 involves monitoring the impact the user has with the ground and determining that, because the impact is larger, the activity type, for example, is running rather than walking. If there is no impact, the activity type may be determined to be cycling (or other activity where there is no impact). In some cases, the humidity is measured to determine whether the activity is a water sport (i.e., whether the activity is being performed in the water). The reference activity types may be narrowed to those that are performed in the water, from which narrowed set of reference activity types the user activity type may be determined. In other cases, the temperature measured is used to determine the activity type.
- Method 1000 may entail instructing the user to confirm the user activity type.
- a user interface is provided such that the user can confirm whether a displayed user activity type is correct, or select the user activity type from a group of activity types.
- a statistical likelihood for each of the choices for user activity type is determined.
- the possible user activity types are then provided to the user in such a sequence that the most likely user activity type is listed first (and then in descending order of likelihood). For example, it may be determined, based on the pattern, the pattern frequency, the temperature, and so on, that there is an 80% chance the user activity type is running, a 15% chance the user activity type is walking, and a 5% chance the user activity is dancing. Via a user interface, a list of these possible user activities may be provided such that the user may select the activity type the user is performing. In various embodiments, some of the above-described operations are performed by the metabolic loading module.
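The likelihood-ordered confirmation list can be sketched as follows (a minimal illustration; the probabilities are the example figures above):

```python
def rank_candidates(likelihoods):
    """Order candidate user activity types most-likely first, for display
    in a confirmation user interface.  `likelihoods` maps each activity
    name to its estimated probability."""
    return sorted(likelihoods, key=likelihoods.get, reverse=True)

candidates = {"walking": 0.15, "running": 0.80, "dancing": 0.05}
print(rank_candidates(candidates))  # ['running', 'walking', 'dancing']
```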
- Method 1000 also includes determining the user activity intensity from a set of reference activity intensities.
- the user activity intensity may be determined in a variety of ways.
- the repetition period (or pattern frequency) and user activity type (UAT) may be associated with a reference activity intensity library to determine the user activity intensity that corresponds to a reference activity intensity.
- FIG. 10C illustrates one embodiment whereby this aspect of method 1000 is accomplished, including reference activity intensity library 1080 .
- Reference activity intensity library 1080 is organized by rows 1088 of reference activity types 1084 and columns 1086 of pattern frequencies 1082 .
- reference activity library 1080 is implemented in a table. Reference activity library 1080 may, however, be implemented in other ways.
- the reference activity intensity 1090 is RAI_0,0.
- UAT 1084 may correspond to the reference activity type for running, and a pattern frequency 1082 of 0.5 cycles per second may be determined for the user activity type.
- Reference activity intensity library 1080 may determine that the UAT 1084 of running at a pattern frequency 1082 of 0.5 cycles per second corresponds to an RAI 1090 of five on a scale of ten.
- the reference activity intensity 1090 is independent of the activity type. For example, the repetition period may be five seconds, and this may correspond to an intensity level of two on a scale of ten.
- Reference activity intensity library 1080 is included in metabolic table 1050 .
- the measured repetition period (or pattern frequency) does not correspond exactly to a repetition period for a reference activity intensity in metabolic table 1050 .
- the correspondence may be a best-match fit, or may be a fit within a tolerance.
- a tolerance may be defined by the user or by a system administrator, for example.
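The intensity lookup keyed by reference activity type and pattern frequency, with a best-match fit within a tolerance, might be sketched as below. The library entries and the tolerance value are hypothetical placeholders, not values from the disclosure.

```python
# Hypothetical reference activity intensity (RAI) library, keyed by
# reference activity type and pattern frequency (cycles per second).
RAI_LIBRARY = {
    ("running", 0.25): 3,
    ("running", 0.5): 5,   # running at 0.5 cycles/s -> intensity 5 of 10
    ("running", 1.0): 8,
}

def lookup_intensity(activity, frequency, tolerance=0.1):
    """Best-match fit: pick the library entry for this activity whose
    pattern frequency is closest to the measured one, accepting the
    match only if it falls within the tolerance."""
    entries = [(abs(f - frequency), rai)
               for (act, f), rai in RAI_LIBRARY.items() if act == activity]
    gap, rai = min(entries)
    return rai if gap <= tolerance else None

print(lookup_intensity("running", 0.52))  # 5 (closest entry, within tolerance)
print(lookup_intensity("running", 2.0))   # None (no entry within tolerance)
```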
- method 1000 involves supplementing the measurement of pattern frequency to help determine the user activity intensity from the reference activity intensities. For example, if the user activity type is skiing, it may be difficult to determine the user activity intensity because the pattern frequency may be erratic or otherwise immeasurable.
- the user velocity, the user's heart rate, and other indicators (e.g., breathing rate) may be used to supplement the pattern frequency measurement.
- the reference activity intensity is associated with a pattern speed (i.e., the speed or velocity at which the sensor is progressing through the pattern). A higher pattern speed may correspond to a higher user activity intensity.
- Method 1000 determines the user activity type and the user activity intensity by using sensors configured to be attached to the user's body.
- sensors may include, for example, a gyroscope or accelerometer to detect movement, and a heart-rate sensor, each of which may be mechanically coupled to a pair of earphones that a user can wear in the user's ears, such as earphones 100 .
- various sensors and modules that may be used to perform the above-described operations of method 1000 may be embedded in or otherwise coupled to biometric earphones 100 or other hardware (e.g., hardware of computing device 200). In various embodiments, the above-described operations are performed by the movement monitoring module.
- method 1000 includes creating and updating a metabolic activity score based on the movement and the user information.
- Method 1000 may also include determining a metabolic loading associated with the user and the movement.
- a duration of the activity type at a particular activity intensity (e.g., in seconds, minutes, or hours) may also be determined.
- the metabolic activity score may be created and updated by, for example, multiplying the metabolic loading by the duration of the user activity type at a particular user activity intensity. If the user activity intensity changes, the new metabolic loading (associated with the new user activity intensity) may be multiplied by the duration of the user activity type at the new user activity intensity.
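The loading-times-duration update described above can be sketched as follows (the loading values and durations are illustrative only):

```python
def update_metabolic_activity_score(score, metabolic_loading, duration_minutes):
    """Supplement the metabolic activity score with a new segment of
    activity: the metabolic loading (a rate) multiplied by the duration
    spent at that user activity type and user activity intensity."""
    return score + metabolic_loading * duration_minutes

score = 0.0
score = update_metabolic_activity_score(score, 10.0, 30)  # 30 min at loading 10
score = update_metabolic_activity_score(score, 14.0, 15)  # intensity changes
print(score)  # 510.0
```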
- the activity score is represented as a numerical value.
- the metabolic activity score may be updated by continually supplementing the metabolic activity score as new activities are undertaken by the user. In this way, the metabolic activity score continually increases as the user participates in more and more activities.
- the metabolic activity score is based on score periods. Monitoring the movement may include determining, during a score period, the metabolic loading associated with the movement. Score periods may include segments of time. The user activity type, user activity intensity, and the corresponding metabolic loading, in one embodiment, are measured (or determined) during each score period, and the metabolic activity score may be calculated for that score period. As the movement changes over time, the varying characteristics of the movement are captured by the score periods.
- Method 1000 includes, in one embodiment, creating and updating a set of periodic activity scores.
- Each period activity score is based on the movement monitored during a set of score periods, and each period activity score is associated with a particular score period of the set of score periods.
- the metabolic activity score is created and updated as an aggregate of period activity scores, and the metabolic activity score may represent a running sum total of the period activity scores.
- method 1000 includes applying a score period multiplier to the score period to create an adjusted period activity score.
- the metabolic activity score in such an example is an aggregation of adjusted period activity scores.
- Score period multipliers may be associated with certain score periods, such that the certain score periods contribute more or less to the metabolic activity score than other score periods during which the same movement is monitored. For example, if the user is performing a sustained activity, a score period multiplier may be applied to the score periods that occur during the sustained activity. By contrast, a multiplier may not be applied to score periods that are part of intermittent, rather than sustained, activity. As a result of the score period multiplier, the user's sustained activity may contribute more to the metabolic activity score than the user's intermittent activity.
- the score period multiplier may allow consideration of the increased demand of sustained, continuous activity relative to intermittent activity.
- the score period multiplier in one instance, is directly proportional to the number of continuous score periods over which a type and intensity of the movement is maintained.
- the adjusted period activity score may be greater than or less than the period activity score, depending on the score period multiplier. For example, for intermittent activity, the score period multiplier may be less than 1.0, whereas for continuous, sustained activity, the score period multiplier may be greater than 1.0.
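The score period multiplier mechanism above might be sketched as follows; the multiplier values (1.2 for sustained, 0.9 for intermittent activity) are hypothetical, chosen only to satisfy the greater-than-1.0 and less-than-1.0 cases described.

```python
def adjusted_period_scores(period_scores, sustained_flags,
                           sustained_mult=1.2, intermittent_mult=0.9):
    """Apply a score period multiplier to each period activity score so
    that sustained activity contributes more to the metabolic activity
    score than intermittent activity."""
    return [s * (sustained_mult if flag else intermittent_mult)
            for s, flag in zip(period_scores, sustained_flags)]

periods = [100.0, 100.0, 100.0]
flags = [True, True, False]  # first two score periods are sustained activity
adjusted = adjusted_period_scores(periods, flags)
print(sum(adjusted))  # the metabolic activity score as a running sum: 330.0
```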
- method 1000 entails decreasing the metabolic activity score when the user consumes calories. For example, if the user goes running and generates a metabolic activity score of 1,000 as a result, but then the user consumes calories, the metabolic activity score may be decreased by 200 points, or any number of points. The decrease in the number of points may be proportional to the number of calories consumed. In other embodiments, information about specific aspects of the user's diet is obtained, and metabolic activity score points are awarded for healthy eating (e.g., fiber) and subtracted for unhealthy eating (e.g., excessive fat consumption).
- the user in one embodiment, is pushed to work harder, or not as hard, depending on the user lifestyle. This may be done, for example, by adjusting the metabolic loadings based on the user lifestyle.
- a user with a highly active lifestyle may be associated with metabolic loadings that result in a lower metabolic activity score when compared to a user with a less active lifestyle performing the same movements. This results in requiring the more active user to, for example, work (or perform movement) at a higher activity intensity or for a longer duration to achieve the same metabolic activity score as the less active user participating in the same activity type (or movements).
- the metabolic activity score is reset every twenty-four hours.
- the metabolic activity score may be continually incremented and decremented throughout a measuring period, but may be reset to a value (e.g., zero) at the end of twenty-four hours.
- the metabolic activity score may be reset after any given length of time (or measuring period)—for example, the activity score may be continually updated over the period of one week, or one month.
- the metabolic activity score is reset to a number greater than zero. As such, the user effectively receives a credit for a particularly active day, allowing the user to be less active the next day without receiving a lower metabolic activity score for the next day.
- the metabolic activity score is reset to a value less than zero. The user effectively receives a penalty for that day, and would have to make up for a particularly inactive or overly consumptive day by increasing the user's activity levels the next day.
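The end-of-period reset with a carry-over credit or penalty might be sketched as below; the daily target and carry rate are hypothetical parameters, since the disclosure does not specify how the nonzero reset value is computed.

```python
def reset_score(final_score, target=1000.0, carry_rate=0.1):
    """Reset the metabolic activity score at the end of a measuring
    period.  A fraction of the surplus over (or deficit under) a daily
    target carries over as a credit (>0) or a penalty (<0)."""
    return carry_rate * (final_score - target)

print(reset_score(1500.0))  # positive: credit for a particularly active day
print(reset_score(600.0))   # negative: penalty for an inactive day
print(reset_score(1000.0))  # 0.0: plain reset to zero
```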
- creating and updating the metabolic activity score is performed by a movement monitoring module or by a metabolic activity score module.
- operation 1006 involves detecting a fatigue level.
- the fatigue level is the fatigue level of the user.
- the fatigue level in one embodiment, is a function of recovery.
- the fatigue level may be detected in various ways.
- the fatigue level is detected by using heartrate measurements detected by earphones 100 to estimate a heart rate variability (HRV) of a user by using logic circuits of processor 165 (discussed above with reference to FIGS. 2B-3F) and based at least in part on the recovery measured.
- representations of fatigue level are described above (e.g., numerical, descriptive, etc.).
- when HRV is more sporadic (i.e., the amount of time between heartbeats varies widely), the fatigue level may be higher; in other words, the body is less fresh and well-rested.
- HRV may be measured in a number of ways (discussed above with reference to FIGS. 2B-3F). Measuring HRV, in one embodiment, involves an estimation of HRV based solely on heartrate data detected by optical heartrate sensor 122 of earphones 100. In other embodiments, HRV may be measured using a combination of data from optical heartrate sensor 122 of earphones 100 and a finger biosensor embedded in either earphones 100 or computing device 200. Optical heartrate sensor 122 may measure the heartrate at the tragus of the user's ear while the finger biosensor measures the heartrate in a finger of the opposite hand.
- this combination allows the sensors, which in one embodiment are conductive, to measure an electrical potential through the body.
- Information about the electrical potential provides cardiac information (e.g., HRV, fatigue level, heart rate information, and so on), and such information is processed at operation 1006 .
- the HRV is measured using sensors that monitor other parts of the user's body, rather than the tragus, finger, etc.
- sensors may be employed to monitor the ankle, leg, arm, or torso.
- the HRV is measured by a module that is not attached to the body (e.g. in computing device 200 , 708 ), but is a standalone module.
- the fatigue level is detected based solely on the HRV measured.
- the fatigue level may be based on other measurements (e.g., measurements monitored by method 1000 ).
- the fatigue level may be based on the amount of sleep that is measured for the previous night, the duration and type of user activity, and the intensity of the activity determined for a previous time period (e.g., exercise activity level in the last twenty-four hours).
- these factors may include stress-related activities such as work and driving in traffic, which may generally cause a user to become fatigued.
- the fatigue level is detected by comparing the HRV measured to a reference HRV. This reference HRV may be based on information gathered from a large number of people from the general public. In another embodiment, the reference HRV is based on past measurements of the user's HRV.
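The disclosure does not specify a particular HRV estimator; the sketch below assumes RMSSD (root mean square of successive differences) over beat-to-beat intervals, and follows the convention stated above that more sporadic intervals, relative to the user's reference HRV, correspond to a higher fatigue level. The scaling is illustrative only.

```python
import math

def rmssd(rr_intervals_ms):
    """One common HRV estimate: the root mean square of successive
    differences between beat-to-beat (RR) intervals, in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def fatigue_level(hrv, reference_hrv, scale=10.0):
    """Per the convention above, measured HRV that is more sporadic than
    the user's reference suggests a higher fatigue level."""
    return max(0.0, scale * (hrv / reference_hrv - 1.0))

sporadic = [800, 850, 790, 860, 810]    # intervals vary widely
consistent = [800, 805, 798, 803, 800]  # intervals nearly uniform
print(rmssd(sporadic) > rmssd(consistent))  # True
```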
- the fatigue level is detected once every twenty-four hours. This provides information about the user's fatigue level each day so that the user's activity levels may be directed according to the fatigue level. In various embodiments, the fatigue level is detected more or less often. Using the fatigue level, a user may determine whether or not an activity is necessary (or desirable), the appropriate activity intensity, and the appropriate activity duration. For example, in deciding whether to go on a run, or how long to run, the user may want to use operation 1006 to assess the user's current fatigue level. Then, the user may, for example, run for a shorter time if the user is more fatigued, or for a longer time if the user is less fatigued. In some cases, it may be beneficial to detect the fatigue level in the morning, upon the user's waking up. This may provide the user a reference for how the day's activities should proceed.
- operation 1008 involves creating and updating a dynamic recovery profile based on an archive.
- the archive includes historical information about the fatigue level (which is described above with reference to operation 1006 ).
- the archive includes historical information about the movement and the metabolic activity score.
- the archive may include, for example, information about past user activity types, past user activity intensities, and past fatigue levels, as well as the relationships between each of these (e.g., if fatigue levels are particularly high after a certain user activity type or after a user achieves a particular metabolic activity score).
- the archive may also include historical information relative to particular score periods and score period multipliers.
- the archive in various embodiments, is embedded or stored in apparatus 702 or computing device 708 .
- the dynamic recovery profile is created and updated based on the archive.
- the dynamic recovery profile is specific to the user's personal fatigue characteristics and responses.
- the dynamic recovery profile may reflect information indicating that the user typically has a very high fatigue level when the user gets less than six hours of sleep.
- the dynamic recovery profile may indicate that the user typically has a very high fatigue level following a day in which the user achieves a metabolic activity score above a certain amount (or a particular user activity intensity that is sustained over a particular amount of time).
- the user's fatigue levels may not follow typical trends, and the archive can account for this.
- the archive may reflect that the user has recorded a fatigue level of 6 when rested.
- the archive provides a means for the fatigue level measurement to be normalized to the user's specific HRV and fatigue levels.
- the dynamic recovery profile learns the fatigue tendencies of the user by compiling, by way of the archive, data about the user. Moreover, the dynamic recovery profile provides a contoured baseline that is continually adjusted as the user's performance, fatigue, and recovery tendencies change over time. In one embodiment, the dynamic recovery profile represents a range of fatigue levels that are normal for the user. For example, based on data in the archive, the dynamic recovery profile may indicate that fatigue levels between 40 and 60 are typical for the user.
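The contoured baseline of typical fatigue levels might be derived from the archive as below. Computing the band as mean plus or minus one standard deviation is an assumption for illustration; the disclosure only requires that a user-specific normal range be maintained.

```python
def normal_fatigue_range(archived_levels, spread=1.0):
    """Contoured baseline from the archive: the band of fatigue levels
    typical for this user, here mean +/- `spread` standard deviations
    of past measurements."""
    n = len(archived_levels)
    mean = sum(archived_levels) / n
    sd = (sum((x - mean) ** 2 for x in archived_levels) / n) ** 0.5
    return (mean - spread * sd, mean + spread * sd)

history = [45, 50, 55, 60, 40, 50]  # hypothetical archived fatigue levels
low, high = normal_fatigue_range(history)
print(low, high)  # a band near 40-60 for this user
```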
- the dynamic recovery profile accounts for changes in the historical information over time by updating the dynamic recovery profile on a periodic basis. In a further embodiment, the user programs the dynamic recovery profile to refresh periodically to capture recent historical information. Updates to the dynamic recovery profile, in one instance, are based on rates or amounts of change that may occur over time to the historical information in the archive.
- the dynamic recovery profile in one embodiment, is implemented in conjunction with an archive table that represents data and relationships of parameters relative to that data.
- the archive table uses the parameters of metabolic activity score (MAS), date, fatigue level, sleep time, and average user activity intensity (UAI) to organize the data and extract relational information.
- MAS metabolic activity score
- UAI average user activity intensity
- FIG. 10D provides archive table 1020 (which may be embodied in the archive).
- Archive table 1020 includes the parameters of date 1022 , MAS 1024 , average UAI 1026 , sleep time 1028 , and fatigue level 1030 .
- archive table 1020 may include only information about the user's measured fatigue levels.
- archive table 1020 includes any other parameters that are monitored, determined, or created by method 1000 .
- archive table 1020 includes analytics.
- Such analytics include statistical relationships of the various parameters in archive table 1020 .
- archive 1020 may include analytics such as mean ratio of fatigue level to MAS, mean ratio of sleep to MAS, mean fatigue level by day of the week, and so on. These analytics allow the dynamic recovery profile to derive optimal performance regimens specific to the user.
- the dynamic recovery profile may determine (from archive table 1020 ) that the user has a mean fatigue level of 7 following a day when sleep to MAS ratio is 6 to 2,000, and may determine that the user typically achieves a below average MAS on days when the fatigue level is 7 or higher.
- the dynamic recovery profile may indicate that the user should get more sleep, or should strive for a lower MAS, to avoid becoming overly fatigued.
- the dynamic recovery profile in one embodiment, reflects information about the user's optimal fatigue scenarios; that is, fatigue levels at which the user tends to historically achieve a high MAS.
- the optimal fatigue scenario may be specific to the user (e.g., some users may have greater capacity for activity when more fatigued, etc.).
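The analytics extracted from archive table 1020 might be computed as below. The rows are hypothetical example data shaped after the parameters named above (date, MAS, average UAI, sleep time, fatigue level).

```python
# Hypothetical rows of archive table 1020:
# (date, MAS, average UAI, sleep hours, fatigue level)
archive = [
    ("2015-11-01", 2000, 5, 6.0, 7),
    ("2015-11-02", 1400, 4, 8.0, 4),
    ("2015-11-03", 2200, 6, 6.5, 7),
]

def mean(values):
    return sum(values) / len(values)

# Analytics such as the mean ratio of fatigue level to MAS, and the
# mean fatigue level, extracted as relational information from the table.
mean_fatigue_to_mas = mean([row[4] / row[1] for row in archive])
mean_fatigue = mean([row[4] for row in archive])
print(mean_fatigue)  # 6.0
```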
- operation 1010 involves creating and updating an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
- the interpreted recovery score, because it is based on both the fatigue level detected and on actual, historical results (as incorporated into the dynamic recovery profile), provides higher resolution and additional perspective into the user's current performance state.
- the interpreted recovery score supplements the fatigue level with information to account for the user's past activities (e.g., from the archive).
- the interpreted recovery score may be, for example, a number selected from a range of numbers.
- the interpreted recovery score may be proportional to the fatigue level (e.g., higher fatigue corresponds to higher interpreted recovery score).
- a typical interpreted recovery score ranges from 40 to 60.
- by way of the dynamic recovery profile (which is based on the archive), the interpreted recovery score, in one embodiment, has available information about the user activity type, the user activity intensity, and the duration of the user's recent activities, as well as analytics of historical information pertaining to the user's activities.
- the interpreted recovery score may use this information, in addition to the current fatigue level, to provide higher resolution into the user's capacity for activity. For example, if the user slept poorly, but for some reason this lack of sleep is not captured in the fatigue level measurement (e.g., if the HRV is consistent rather than sporadic), the interpreted recovery score may be adjusted to account for the user's lack of sleep. In this example, the lack of sleep information would be available via archived activity type detection and movement monitoring. In other embodiments, the interpreted recovery score will be based only on historic fatigue levels specific to the user. In various embodiments, operation 1010 is performed by interpreted recovery score module 808 .
- FIG. 11 is an operational flow diagram illustrating an example method 1100 for providing an interpreted recovery score in accordance with an embodiment of the present disclosure.
- apparatus 702 , earphones 100 , computing device 200 and/or computing device 708 perform various operations of method 1100 .
- method 1100 may include, at operation 1102 , various operations from method 1000 .
- method 1100 involves creating an initial recovery profile.
- the initial recovery profile is based on a comparison of the user information to normative group information.
- the normative group may include information collected from a group of people other than the user.
- the normative group information may be averaged and used as a baseline for the initial recovery profile (an expectation of user activity levels) before any historical information is generated.
- the normative group information is adjusted according to different possible sets of user information.
- the normative group information may be collected and averaged (or otherwise statistically analyzed).
- a user information multiplier may be created based on a comparison of the normative group information and the user information.
- the user information multiplier may be applied to the normative group information to adjust the normative group information such that the normative group information becomes specific to the user's information and characteristics. For example, an average value of the normative group information may be increased if the user is younger than the average group member, or decreased if the user is less active than the average group member.
- This adjustment results in an initial recovery profile that is based on the normative group information but is specific to the user information (and the user).
- the initial recovery profile may represent a user-specific expectation for activity level (e.g., for MAS).
- the initial recovery profile may also represent a user-specific expectation for fatigue level.
- operation 1104 is performed by initial recovery profile module 902 .
- creating and updating the dynamic recovery profile is further based on the initial recovery profile.
- when actual measurements differ from the initial recovery profile, the dynamic recovery profile is updated in a way that reflects this discrepancy. For example, based on actual fatigue levels detected, the dynamic recovery profile may expect a higher fatigue level than indicated by the initial recovery profile.
- the dynamic recovery profile learns over time what fatigue level or range of fatigue levels is normal for the user.
- the dynamic recovery profile may include a blend of information from the archive and the initial recovery profile.
- the dynamic recovery profile, in such an embodiment, more heavily weights the information from the archive as the archive gathers information that is increasingly complete. For example, before taking any fatigue measurements, the dynamic recovery profile may be based entirely on the initial recovery profile (which is derived from normative data). Then, for example, after detecting and storing in the archive two weeks' worth of fatigue level information from the user, the dynamic recovery profile may weight the information from the archive more heavily (e.g., base the dynamic recovery profile 50% on the archive and 50% on the initial recovery profile).
- the dynamic recovery profile may phase out the initial recovery profile entirely. That is, the dynamic recovery profile may be entirely based on the archive. In other words, the dynamic recovery profile, in such an embodiment, phases out the initial recovery profile as the amount of information in the archive increases.
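The weighting and phase-out described above might be sketched as follows. The 28-day horizon for fully phasing out the initial recovery profile is a hypothetical parameter; the disclosure only requires that the archive's weight grow with its completeness.

```python
def blended_profile(initial_value, archive_values, full_weight_days=28):
    """Blend the initial recovery profile (normative) with the archive,
    weighting the archive more heavily as it fills, and phasing the
    initial profile out entirely once enough history accumulates."""
    w = min(len(archive_values) / full_weight_days, 1.0)  # archive weight
    archive_mean = sum(archive_values) / len(archive_values) if archive_values else 0.0
    return (1.0 - w) * initial_value + w * archive_mean

print(blended_profile(50.0, [60.0] * 14))  # 55.0: a 50/50 blend at two weeks
print(blended_profile(50.0, [60.0] * 28))  # 60.0: initial profile phased out
```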
- the historical information about the user activity type or user activity intensity (or MAS) may differ from the initial recovery profile in a way that warrants a shift in expected activity levels.
- the initial recovery profile may expect a higher or lower amount of user activity intensity (or MAS) than is in reality measured. This discrepancy may be resolved by updating the dynamic recovery profile based on the archive. For example, the dynamic recovery profile may be decreased because the user is not performing at the level (e.g., MAS) initially expected (or indicated by the initial recovery profile).
- the user information may change in a way that causes the initial recovery profile, created at operation 1104 , to lose its accuracy.
- the dynamic recovery profile may be updated to reflect such changes, such that the dynamic recovery profile is more accurate. For example, the user's weight or age may change. As a result, the normative group data used to generate the initial recovery profile may become stale. This may be resolved by updating the dynamic recovery profile (e.g., with the user's actual weight).
- the dynamic recovery profile may function as a version of the initial recovery profile adjusted according to the historical information in the archive.
- method 1100 includes operation 1106 , which involves providing a recovery status based on the interpreted recovery score.
- the recovery status may be based on various thresholds of the interpreted recovery score.
- the recovery status may be represented on a numerical, descriptive, or color scale, or the like.
- the recovery status is directly proportional to the interpreted recovery score.
- the recovery status, in such an example, may indicate the user's need to rest from strenuous activity or high levels of activity.
- a negative recovery status may indicate that the user is over-rested
- a positive recovery status may indicate that rest is needed
- a small recovery status (i.e., near-zero) may indicate that the user's rest and activity levels are in balance
- the recovery status includes the following: fatigued, recovered, and optimal. If the interpreted recovery score is below a lowest threshold, in the descriptive recovery status example, the recovery status will be “recovered.” This indicates that the user is fully rested. In some instances, “recovered” is distinguished from “optimal” because “recovered” indicates that the user is too rested and has less capacity for activity. Further illustrating the descriptive recovery status example, if the interpreted recovery score is above the lowest threshold but below the highest threshold, the recovery status will be “optimal.” This indicates that the user has peak capacity for activity. “Optimal” recovery status may be associated with the scenario in which the user is rested, but not overly so.
- the recovery status (in this example) will be “fatigued.” This indicates that the user has minimal capacity for activity because the user needs to rest.
- the recovery status is based on any number of thresholds and may be further stratified to provide higher granularity into the user's recovery status.
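The threshold-based descriptive status above can be sketched as a small mapping function. The `low` and `high` threshold values below are illustrative assumptions; as noted, any number of thresholds may be used:

```python
def recovery_status(score, low=33.0, high=66.0):
    """Map an interpreted recovery score to a descriptive recovery status.

    Threshold values are illustrative, not specified by the disclosure.
    """
    if score < low:
        return "recovered"   # fully (perhaps overly) rested
    if score <= high:
        return "optimal"     # peak capacity for activity
    return "fatigued"        # rest is needed
```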
- Method 1100 includes operation 1108 , as illustrated in FIG. 11 .
- Operation 1108 involves providing an activity recommendation based on the interpreted recovery score. For example, if the interpreted recovery score is high, indicating that the user is more fatigued, lower user activity intensities may be recommended. If the interpreted recovery score is low, indicating that the user is well-rested, higher activity intensities may be recommended. This example applies to recommended activity durations in a similar fashion (e.g., longer durations if less fatigued, etc.).
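The inverse relationship in operation 1108 might be sketched as follows, assuming for illustration that the interpreted recovery score lies on a 0-100 scale and that recommended intensity and duration both scale down linearly with fatigue (the disclosure fixes neither the scale nor the mapping):

```python
def recommend_activity(score, max_intensity=10, max_minutes=90):
    """Recommend an activity intensity and duration from an interpreted
    recovery score on an assumed 0-100 scale: higher scores mean more
    fatigue, so both recommendations scale down linearly. The maximum
    intensity and duration are hypothetical parameters.
    """
    rest_fraction = score / 100.0
    intensity = round(max_intensity * (1.0 - rest_fraction))
    minutes = round(max_minutes * (1.0 - rest_fraction))
    return intensity, minutes
```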
- method 1100 includes operation 1110 , which involves comparing the interpreted recovery score to a past interpreted recovery score.
- the interpreted recovery score is associated with a measuring period and the past interpreted recovery score is associated with a past measuring period.
- Interpreted recovery scores may be stored and associated with past measuring periods (i.e., the measured period during which the interpreted recovery score was created). In this way, past interpreted recovery scores and information associated therewith may be used to inform the user's current activity.
- comparing the scores may include providing a simple numerical readout of both scores (e.g., side by side).
- information about the time of day associated with the past interpreted recovery score is presented. For example, the time of day at which the past interpreted recovery score was created may be presented. This may inform the user of how the user's current interpreted recovery score relates to the past interpreted recovery score, allowing the user to gauge how the interpreted recovery score may correlate to the user's physical state or feeling.
- the past interpreted recovery score is displayed on a graph (e.g., a line or bar graph) as a function of time (e.g., comparing against other past interpreted recovery scores from past measuring periods).
- the graph may be overlaid with a graph of the current interpreted recovery score.
- operation 1110 is performed by interpreted recovery score module 808 .
- FIG. 12 is an operational flow diagram illustrating an example method 1200 for providing an interpreted recovery score in accordance with an embodiment of the present disclosure.
- apparatus 702 (e.g., biometric earphones 100 or computing device 200)
- computing device 708 (e.g., computing device 200)
- method 1200 involves performing a comparison of the interpreted recovery score to the fatigue level.
- Operation 1206 in another embodiment, involves tracking the comparison over time.
- the fatigue level may be associated with physical phenomena, including HRV, while the interpreted recovery score is based on actual, historical information (via the dynamic recovery profile), including past fatigue levels for the user.
- tracking the comparison over time provides insight into how lifestyle choices affect performance capacity and fatigue levels. For example, the comparison may provide a normalization for the user's typical fatigue levels as they change over time relative to past fatigue levels.
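The normalization described above can be sketched as comparing each day's fatigue level against a trailing mean of the user's own past fatigue levels. This is a minimal sketch; the window size is illustrative and the actual comparison method is not specified by the disclosure:

```python
def fatigue_vs_expected(fatigue_history, window=30):
    """Express each fatigue measurement relative to the trailing mean of
    prior measurements, so the comparison tracks how 'today' relates to
    the user's own recent norm. `window` is an illustrative parameter.
    """
    out = []
    for i, fatigue in enumerate(fatigue_history):
        past = fatigue_history[max(0, i - window):i]
        baseline = sum(past) / len(past) if past else fatigue
        out.append(fatigue - baseline)   # positive: more fatigued than usual
    return out
```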
- method 1200 involves receiving an external interpreted recovery score.
- the external interpreted recovery score may be received in a number of ways (e.g., via communication medium 704 ).
- the external interpreted recovery score may be created and updated in a manner similar to the creating and updating of the interpreted recovery score (operation 1010).
- the external interpreted recovery score may be from a second user, i.e., any user other than the first user. The second user may be a friend or associate of the first user.
- operation 1208 is performed by interpreted recovery score module 808 .
- an embodiment of method 1200 involves comparing the external interpreted recovery score to the interpreted recovery score.
- the external interpreted recovery score may be compared to the interpreted recovery score in a fashion substantially similar to the comparison performed in operation 1110 .
- Operation 1210 allows the user to compare the user's interpreted recovery score (based on the user's fatigue level) to the interpreted recovery score of another user (based on the other user's fatigue level).
- operation 1210 is performed by interpreted recovery score module 808 .
- the operations of method 1000 , method 1100 , and method 1200 are performed using sensors configured to be attached to the body (e.g., the biometric earphones worn in the ears of a user).
- sensors may include a gyroscope or accelerometer to detect movement, and a heart-rate sensor (e.g. optical heartrate sensor 122 ), each of which may be embedded in a pair of earphones that a user can wear in the user's ears, such as earphones 100 , or a device or module such as computing device 200 .
- Such sensors may be used to perform the operations of monitoring the movement, detecting the fatigue level, creating and updating the dynamic recovery profile, and creating and updating the interpreted recovery score, or any other operation disclosed herein.
- sensors used to perform these operations may be standalone sensors, and may not attach to the body (e.g. coupled to computing device 200 , or other computing device).
- FIGS. 13-16 are provided to depict example user interfaces that may be used to display, and allow a user to interact with, the various data detected and computed in accordance with the above described systems and methods. Although not all data that can be provided by the systems and methods disclosed herein are depicted in FIGS. 13-16 , the figures nevertheless provide context for conveying how such data may be provided to a user.
- FIG. 13 illustrates an activity display 1300 that may be associated with an activity display module, such as activity display module 211 of activity tracking application 210 .
- activity display 1300 may visually present to a user a record of the user's activity.
- activity display 1300 may comprise a display navigation area 1301 , activity icons 1302 , activity goal section 1303 , live activity chart 1304 , and activity timeline 1305 .
- display navigation area 1301 allows a user to navigate between the various displays associated with modules 211 - 214 by selecting “right” and “left” arrows depicted at the top of the display on either side of the display screen title. An identification of the selected display may be displayed at the center of the navigation area 1301 .
- Other selectable displays may be displayed on the left and right sides of navigation area 1301 .
- the activity display 1300 includes the identification “ACTIVITY” at the center of the navigation area. If the user wishes to navigate to a sleep display in this embodiment, the user may select the left arrow.
- navigation between the displays may be accomplished via finger swiping gestures. For example, in one embodiment a user may swipe the screen right or left to navigate to a different display screen. In another embodiment, a user may press the left or right arrows to navigate between the various display screens.
- activity icons 1302 may be displayed on activity display 1300 based on the user's predicted or self-reported activity. For example, in this particular embodiment activity icons 1302 are displayed for the activities of walking, running, swimming, sport, and biking, indicating that the user has performed these five activities.
- one or more modules of application 210 may estimate the activity being performed (e.g., sleeping, walking, running, or swimming) by comparing the data collected by a biometric earphone's sensors to pre-loaded or learned activity profiles. For example, accelerometer data, gyroscope data, heartrate data, or some combination thereof may be compared to preloaded activity profiles of what the data should look like for a generic user that is running, walking, or swimming.
- the preloaded activity profiles for each particular activity may be adjusted over time based on a history of the user's activity, thereby improving the activity predictive capability of the system.
- activity display 1300 allows a user to manually select the activity being performed (e.g., via touch gestures), thereby enabling the system to accurately adjust an activity profile associated with the user-selected activity. In this way, the system's activity estimating capabilities will improve over time as the system learns how particular activity profiles match an individual user. Particular methods of implementing this activity estimation and activity profile learning capability are described in U.S.
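The profile-matching estimation described above might be sketched as a nearest-profile classifier with a simple running-average update toward the user's observed data. The feature set, profile values, and learning rate below are all hypothetical; the actual estimation and learning methods are described in the referenced application:

```python
import math

# Hypothetical pre-loaded activity profiles: mean sensor features per
# activity (accelerometer magnitude, gyroscope magnitude, heart rate).
PROFILES = {
    "sleeping": (0.02, 0.01, 55.0),
    "walking":  (1.10, 0.40, 95.0),
    "running":  (2.80, 0.90, 150.0),
}

def classify_activity(features, profiles=PROFILES):
    """Return the activity whose profile is nearest (Euclidean distance)
    to the observed feature vector."""
    return min(profiles, key=lambda name: math.dist(features, profiles[name]))

def adapt_profile(profiles, activity, features, rate=0.1):
    """Nudge a profile toward the user's observed features (a simple
    exponential running average) so future predictions better match
    this individual user."""
    old = profiles[activity]
    profiles[activity] = tuple(
        (1.0 - rate) * o + rate * f for o, f in zip(old, features)
    )
```

For example, when the user manually selects "walking", `adapt_profile` would shift the stored walking profile toward the sensor data just observed.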
- an activity goal section 1303 may display various activity metrics such as a percentage activity goal providing an overview of the status of an activity goal for a timeframe (e.g., day or week), an activity score or other smart activity score associated with the goal, and activities for the measured timeframe (e.g., day or week).
- the display may provide a user with a current activity score for the day versus a target activity score for the day.
- the percentage activity goal may be selected by the user (e.g., by a touch tap) to display to the user an amount of a particular activity (e.g., walking or running) needed to complete the activity goal (e.g., reach 100%).
- activities for the timeframe may be individually selected to display metrics of the selected activity such as points, calories, duration, or some combination thereof.
- activity goal section 1303 displays that 100% of the activity goal for the day has been accomplished.
- activity goal section 1303 displays that activities of walking, running, biking, and no activity (sedentary) were performed during the day. This is also displayed as a numerical activity score 5000/5000.
- a breakdown of metrics for each activity e.g., activity points, calories, and duration
- a live activity chart 1304 may also display an activity trend of the aforementioned metrics (or other metrics) as a dynamic graph at the bottom of the display.
- the graph may be used to show when the user has been most active during the day (e.g., burning the most calories or otherwise engaged in an activity).
- An activity timeline 1305 may be displayed as a collapsed bar at the bottom of display 1300 .
- activity timeline 1305 may display a more detailed breakdown of daily activity, including, for example, an activity performed at a particular time with associated metrics, total active time for the measuring period, total inactive time for the measuring period, total calories burned for the measuring period, total distance traversed for the measuring period, and other metrics.
- FIG. 14 illustrates a sleep display 1400 that may be associated with a sleep display module 212 .
- sleep display 1400 may visually present to a user a record of the user's sleep history and sleep recommendations for the day. It is worth noting that in various embodiments one or more modules of the activity tracking application 210 may automatically determine or estimate when a user is sleeping (and awake) based on a pre-loaded or learned activity profile for sleep, in accordance with the activity profiles described above. Alternatively, the user may interact with the sleep display 1400 or other display to indicate that the current activity is sleep, enabling the system to better learn the individualized activity profile associated with sleep.
- the modules may also use data collected from the earphones, including fatigue level and activity score trends, to calculate a recommended amount of sleep.
- the modules may also use collected and processed data from the earphones, including some or all of the data related to fatigue level, recovery recommendations, dynamic recovery profile, interpreted recovery score, recovery status, initial recovery profile, activity score trends, HRV, etc. to calculate an interpreted recovery score as has been described in detail above.
- Systems and methods for implementing this functionality are further described in U.S. patent application Ser. No. 14/568,835, filed Dec. 12, 2014, and titled “System and Method for Creating a Dynamic Activity Profile”, and U.S. patent application Ser. No. 14/137,942, filed Dec. 20, 2013, titled “System and Method for Providing an Interpreted Recovery Score,” both of which are incorporated herein by reference in their entirety.
- sleep display 1400 may comprise a display navigation area 1401 , a center sleep display area 1402 , a textual sleep recommendation 1403 , and a sleeping detail or timeline 1404 .
- Display navigation area 1401 allows a user to navigate between the various displays associated with modules 211 - 214 as described above.
- the sleep display 1400 includes the identification “SLEEP” at the center of the navigation area 1401 .
- Center sleep display area 1402 may display sleep metrics such as the user's recent average level of sleep or sleep trend 1402 A, a recommended amount of sleep for the night 1402 B, and an ideal average sleep amount 1402 C.
- these sleep metrics may be displayed in units of time (e.g., hours and minutes) or other suitable units.
- a user may compare a recommended sleep level for the user (e.g., metric 1402 B) against the user's historical sleep level (e.g., metric 1402 A).
- the sleep metrics 1402 A- 1402 C may be displayed as a pie chart showing the recommended and historical sleep times in different colors.
- sleep metrics 1402 A- 1402 C may be displayed as a curvilinear graph showing the recommended and historical sleep times as different colored, concentric lines.
- This particular embodiment is illustrated in example sleep display 1400 , which illustrates an inner concentric line for recommended sleep metric 1402 B and an outer concentric line for average sleep metric 1402 A.
- the lines are concentric about a numerical display of the sleep metrics.
- a textual sleep recommendation 1403 may be displayed at the bottom or other location of display 1400 based on the user's recent sleep history.
- a sleeping detail or timeline 1404 may also be displayed as a collapsed bar at the bottom of sleep display 1400 .
- when a user selects sleeping detail 1404 it may display a more detailed breakdown of daily sleep metrics, including, for example, total time slept, bedtime, and wake time. In particular implementations of these embodiments, the user may edit the calculated bedtime and wake time.
- the selected sleeping detail 1404 may graphically display a timeline of the user's movements during the sleep hours, thereby providing an indication of how restless or restful the user's sleep is during different times, as well as the user's sleep cycles.
- the user's movements may be displayed as a histogram plot charting the frequency and/or intensity of movement during different sleep times.
- FIG. 15 illustrates an activity recommendation and fatigue level display 1500 that may be associated with an activity recommendation and fatigue level display module 213 .
- display 1500 may visually present to a user the user's current fatigue level and a recommendation of whether or not to engage in activity.
- one or more modules of activity tracking application 210 may track fatigue level based on data received from the earphones 100 , and make an activity level recommendation. For example, HRV data tracked at regular intervals may be compared with other biometric or biological data to determine how fatigued the user is. Additionally, the HRV data may be compared to pre-loaded or learned fatigue level profiles, as well as a user's specified activity goals. Particular systems and methods for implementing this functionality are described in greater detail in U.S.
- display 1500 may comprise a display navigation area 1501 (as described above), a textual activity recommendation 1502 , and a center fatigue and activity recommendation display 1503 .
- Textual activity recommendation 1502 may, for example, display a recommendation as to whether a user is too fatigued for activity, and thus must rest, or if the user should be active.
- Center display 1503 may display an indication to a user to be active (or rest) 1503 A (e.g., “go”), an overall score 1503 B indicating the body's overall readiness for activity, and an activity goal score 1503 C indicating an activity goal for the day or other period.
- indication 1503 A may be displayed as a result of a binary decision—for example, telling the user to be active, or “go”—or on a scaled indicator—for example, a circular dial display showing that a user should be more or less active depending on where a virtual needle is pointing on the dial.
- display 1500 may be generated by measuring the user's HRV at the beginning of the day (e.g., within 30 minutes of waking up). For example, the user's HRV may be automatically measured using the optical heartrate sensor 122 after the user wears the earphones in a position that generates a good signal as described in method 400 .
- computing device 200 may display any one of the following: an instruction to remain relaxed while the variability in the user's heart signal (i.e., HRV) is being measured, an amount of time remaining until the HRV has been sufficiently measured, and an indication that the user's HRV is detected.
- one or more processing modules of computing device 200 may determine the user's fatigue level for the day and a recommended amount of activity for the day. Activity recommendation and fatigue level display 1500 is generated based on this determination.
- the user's HRV may be automatically measured at predetermined intervals throughout the day using optical heartrate sensor 122 .
- activity recommendation and fatigue level display 1500 may be updated based on the updated HRV received throughout the day. In this manner, the activity recommendations presented to the user may be adjusted throughout the day.
- FIG. 16 illustrates a biological data and intensity recommendation display 1600 that may be associated with a biological data and intensity recommendation display module 214 .
- display 1600 may guide a user of the activity monitoring system through various fitness cycles of high-intensity activity followed by lower-intensity recovery based on the user's body fatigue and recovery level, thereby boosting the user's level of fitness and capacity on each cycle.
- display 1600 may include a textual recommendation 1601 , a center display 1602 , and a historical plot 1603 indicating the user's transition between various fitness cycles.
- textual recommendation 1601 may display a current recommended level of activity or training intensity based on current fatigue levels, current activity levels, user goals, pre-loaded profiles, activity scores, smart activity scores, historical trends, and other bio-metrics of interest.
- Center display 1602 may display a fitness cycle target 1602 A (e.g., intensity, peak, fatigue, or recovery), an overall score 1602 B indicating the body's overall readiness for activity, an activity goal score 1602 C indicating an activity goal for the day or other period, and an indication to a user to be active (or rest) 1602 D (e.g., “go”).
- the data of center display 1602 may be displayed, for example, on a virtual dial, as text, or some combination thereof.
- recommended transitions between various fitness cycles (e.g., intensity and recovery) may be indicated by the dial transitioning between predetermined markers.
- display 1600 may display a historical plot 1603 that indicates the user's historical and current transitions between various fitness cycles over a predetermined period of time (e.g., 30 days).
- the fitness cycles may include, for example, a fatigue cycle, a performance cycle, and a recovery cycle.
- Each of these cycles may be associated with a predetermined score range (e.g., overall score 1602 B).
- a fatigue cycle may be associated with an overall score range of 0 to 33
- a performance cycle may be associated with an overall score range of 34 to 66
- a recovery cycle may be associated with an overall score range of 67 to 100.
- the transitions between the fitness cycles may be demarcated by horizontal lines intersecting the historical plot 1603 at the overall score range boundaries.
- the illustrated historical plot 1603 includes two horizontal lines intersecting the historical plot.
- measurements below the lowest horizontal line indicate a first fitness cycle (e.g., fatigue cycle)
- measurements between the two horizontal lines indicate a second fitness cycle (e.g., performance cycle)
- measurements above the highest horizontal line indicate a third fitness cycle (e.g., recovery cycle).
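Using the example score ranges above (0-33, 34-66, 67-100), the cycle assignment is a simple threshold mapping, sketched here in Python:

```python
def fitness_cycle(overall_score):
    """Map an overall readiness score (0-100) to a fitness cycle using
    the example ranges from the disclosure: 0-33 fatigue, 34-66
    performance, 67-100 recovery."""
    if overall_score <= 33:
        return "fatigue"
    if overall_score <= 66:
        return "performance"
    return "recovery"
```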
- FIG. 17 illustrates an example computing module that may be used to implement various features of the systems and methods disclosed herein.
- the computing module includes a processor and a set of computer programs residing on the processor.
- the set of computer programs is stored on a non-transitory computer readable medium having computer executable program code embodied thereon.
- the computer executable code is configured to detect a fatigue level.
- the computer executable code is also configured to create and update a dynamic recovery profile based on an archive.
- the archive includes historical information about the fatigue level.
- the computer executable code is further configured to create and update an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
- the example computing module may be used to implement these various features in a variety of ways, as described above with reference to the methods and tables illustrated in FIGS. 10A, 10B, 10C, 10D, 11, and 12, and as will be appreciated by one of ordinary skill in the art upon reading this disclosure.
- the term “module” might describe a given unit of functionality that can be performed in accordance with one or more embodiments of the present application.
- a module might be implemented utilizing any form of hardware, software, or a combination thereof.
- processors, controllers, ASICs, PLAs, PALs, CPLDs, FPGAs, logical components, software routines or other mechanisms might be implemented to make up a module.
- the various modules described herein might be implemented as discrete modules or the functions and features described can be shared in part or in total among one or more modules.
- computing module 1700 may represent, for example, computing or processing capabilities found within desktop, laptop, notebook, and tablet computers; hand-held computing devices (tablets, PDA's, smart phones, cell phones, palmtops, smart-watches, smart-glasses etc.); mainframes, supercomputers, workstations or servers; or any other type of special-purpose or general-purpose computing devices as may be desirable or appropriate for a given application or environment.
- Computing module 1700 might also represent computing capabilities embedded within or otherwise available to a given device.
- a computing module might be found in other electronic devices such as, for example, digital cameras, navigation systems, cellular telephones, portable computing devices, modems, routers, WAPs, terminals and other electronic devices that might include some form of processing capability.
- Computing module 1700 might include, for example, one or more processors, controllers, control modules, or other processing devices, such as a processor 1704 .
- Processor 1704 might be implemented using a general-purpose or special-purpose processing engine such as, for example, a microprocessor, controller, or other control logic.
- processor 1704 is connected to a bus 1702 , although any communication medium can be used to facilitate interaction with other components of computing module 1700 or to communicate externally.
- Computing module 1700 might also include one or more memory modules, simply referred to herein as main memory 1708 .
- main memory 1708 , preferably random access memory (RAM) or other dynamic memory, might be used for storing information and instructions to be executed by processor 1704 .
- Main memory 1708 might also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1704 .
- Computing module 1700 might likewise include a read only memory (“ROM”) or other static storage device coupled to bus 1702 for storing static information and instructions for processor 1704 .
- the computing module 1700 might also include one or more various forms of information storage mechanism 1710 , which might include, for example, a media drive 1712 and a storage unit interface 1720 .
- the media drive 1712 might include a drive or other mechanism to support fixed or removable storage media 1714 .
- a hard disk drive, a solid state drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive might be provided.
- storage media 1714 might include, for example, a hard disk, a solid state drive, magnetic tape, cartridge, optical disk, a CD or DVD, or other fixed or removable medium that is read by, written to or accessed by media drive 1712 .
- the storage media 1714 can include a computer usable storage medium having stored therein computer software or data.
- information storage mechanism 1710 might include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 1700 .
- Such instrumentalities might include, for example, a fixed or removable storage unit 1722 and a storage interface 1720 .
- Examples of such storage units 1722 and storage interfaces 1720 can include a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 1722 and storage interfaces 1720 that allow software and data to be transferred from the storage unit 1722 to computing module 1700 .
- Computing module 1700 might also include a communications interface 1724 .
- Communications interface 1724 might be used to allow software and data to be transferred between computing module 1700 and external devices.
- Examples of communications interface 1724 might include a modem or softmodem, a network interface (such as an Ethernet, network interface card, WiMedia, IEEE 802.XX or other interface), a communications port (such as for example, a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface.
- Software and data transferred via communications interface 1724 might typically be carried on signals, which can be electronic, electromagnetic (which includes optical) or other signals capable of being exchanged by a given communications interface 1724 . These signals might be provided to communications interface 1724 via a channel 1728 .
- This channel 1728 might carry signals and might be implemented using a wired or wireless communication medium.
- Some examples of a channel might include a phone line, a cellular link, an RF link, an optical link, a network interface, a local or wide area network, and other wired or wireless communications channels.
- “computer program medium” and “computer usable medium” are used to generally refer to transitory or non-transitory media such as, for example, memory 1708 , storage unit 1722 , media 1714 , and channel 1728 .
- These and other various forms of computer program media or computer usable media may be involved in carrying one or more sequences of one or more instructions to a processing device for execution.
- Such instructions embodied on the medium are generally referred to as “computer program code” or a “computer program product” (which may be grouped in the form of computer programs or other groupings). When executed, such instructions might enable the computing module 1700 to perform features or functions of the present application as discussed herein.
- module does not imply that the components or functionality described or claimed as part of the module are all configured in a common package. Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed in multiple groupings or packages or across multiple locations.
Abstract
A system for providing an interpreted recovery score includes an apparatus for providing an interpreted recovery score. The apparatus includes a fatigue level module that detects a fatigue level. In addition, the apparatus includes a dynamic recovery profile module that creates and updates a dynamic recovery profile based on an archive. The archive includes historical information about the fatigue level. The apparatus also includes an interpreted recovery score module that creates and updates an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
Description
- This application is a continuation-in-part of and claims the benefit of U.S. patent application Ser. No. 14/137,942, filed Dec. 20, 2013, titled “System and Method for Providing an Interpreted Recovery Score,” which is a continuation-in-part of U.S. patent application Ser. No. 14/137,734, filed Dec. 20, 2013, titled “System and Method for Providing a Smart Activity Score,” which is a continuation-in-part of U.S. patent application Ser. No. 14/062,815, filed Oct. 24, 2013, titled “Wristband with Removable Activity Monitoring Device.” The contents of each of the Ser. No. 14/137,942 application, the Ser. No. 14/137,734 application and the Ser. No. 14/062,815 application are incorporated herein by reference.
- The present disclosure relates generally to fitness monitoring devices, and more particularly to a system and method for providing an interpreted recovery score.
- Previous generation heart rate monitors and fitness tracking devices generally enabled only monitoring of a user's heart rate. Currently available fitness tracking devices now add functionality that measures the user's heart rate variability. One issue with currently available fitness tracking devices and heart rate monitors is that they do not account for the performance or recovery state of the user in a scientific, user-specific way. In other words, currently available solutions do not normalize the heart rate variability measurement to be specific to the user. Another issue is that currently available solutions do not learn how the user's normal recovery levels are reflected in measurements of the user's heart rate variability.
- In view of the above drawbacks, there exists a long-felt need for fitness tracking devices and heart rate monitors that detect a fatigue level in a scientific way and provide user-specific recovery feedback based on actual, historical data about the user's fatigue levels or heart rate variability. Further, there is a need for fitness tracking devices and heart rate monitors that learn how the user's normal recovery levels are reflected in measurements of the user's heart rate variability.
- Embodiments of the present disclosure include systems and methods for providing an interpreted recovery score.
- One embodiment involves an apparatus for providing an interpreted recovery score. The apparatus includes a fatigue level module that detects a fatigue level. The apparatus also includes a dynamic recovery profile module that creates and updates a dynamic recovery profile based on an archive. The archive includes historical information about the fatigue level. In addition, the apparatus includes an interpreted recovery score module that creates and updates an interpreted recovery score based on the fatigue level and the dynamic recovery profile. In one embodiment, the interpreted recovery score is specific to a measuring period.
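- The disclosure does not fix a formula for combining the fatigue level with the dynamic recovery profile. As a purely illustrative sketch (the function name, the z-score mapping, and the 0-100 scale are assumptions, not the patented method), a measuring period's fatigue level can be interpreted against the user's own historical distribution:

```python
# Hypothetical sketch: interpret a raw fatigue level (e.g., HRV-derived)
# against a dynamic recovery profile built from the user's archive.
# The z-score mapping and 0-100 scale are illustrative assumptions.
from statistics import mean, stdev

def interpreted_recovery_score(fatigue_level, archive):
    """archive: historical fatigue levels; higher HRV -> better recovery."""
    mu, sigma = mean(archive), stdev(archive)
    z = (fatigue_level - mu) / sigma if sigma else 0.0
    # clamp a +/-3 sigma band onto a 0-100 interpreted score
    return max(0.0, min(100.0, 50.0 + (50.0 / 3.0) * z))

history = [62, 58, 65, 60, 61, 59, 63]   # e.g., nightly RMSSD values (ms)
print(round(interpreted_recovery_score(64, history), 1))  # score near 70
```

Because the score is normalized to the user's own archive, the same raw fatigue level can yield different interpreted scores for different users, which is the user-specific behavior the disclosure emphasizes.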
- The apparatus for providing an interpreted recovery score, in one embodiment, also includes an initial recovery profile module that creates an initial recovery profile. The initial recovery profile is based on a comparison of the user information to normative group information. In another embodiment, the dynamic recovery profile module creates and updates the dynamic recovery profile further based on the initial recovery profile.
- In a further embodiment, the apparatus includes a recovery status module that provides a recovery status based on the interpreted recovery score. The recovery status, in one instance, is one of the following: fatigued, recovered, and optimal. In another example, the interpreted recovery score module performs a comparison of the interpreted recovery score to the fatigue level, and tracks the comparison over time.
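- The thresholds separating these statuses are not given in the disclosure; a minimal sketch of the mapping, with cutoff values chosen purely for illustration, might look like:

```python
# Hypothetical mapping from an interpreted recovery score (assumed 0-100)
# to the three recovery statuses named in the disclosure. The cutoff
# values 40 and 75 are illustrative assumptions, not from the patent.
def recovery_status(score):
    if score < 40:
        return "fatigued"
    if score < 75:
        return "recovered"
    return "optimal"

for s in (25, 60, 90):
    print(s, recovery_status(s))
```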
- The apparatus, in one embodiment, includes a recovery recommendation module that provides an activity recommendation based on the interpreted recovery score. At least one of the fatigue level module, the dynamic recovery profile module, and the interpreted recovery score module is embodied in a wearable sensor. For example, one or more of these modules may be embodied in a biometric sensor (e.g. heartrate sensor, motion sensor, etc.) mechanically coupled to a pair of earphones that can be worn in a user's ears. The earphones may further be configured to communicate with a computing device to provide and/or display an interpreted recovery score to a user.
- One embodiment of the present disclosure involves a method for providing an interpreted recovery score. The method includes detecting a fatigue level. In addition, the method includes creating and updating a dynamic recovery profile based on an archive. The archive includes historical information about the fatigue level. The method also includes creating and updating an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
- The method for providing an interpreted recovery score, in one embodiment, includes creating an initial recovery profile based on a comparison of the user information to normative group information. In another embodiment, creating and updating the dynamic recovery profile is further based on the initial recovery profile. The dynamic recovery profile, in one embodiment, phases out the initial recovery profile as an amount of the historical information in the archive increases.
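- The phase-out described above could be realized in many ways; one hedged sketch (the linear weighting and the 30-sample horizon are assumptions) blends the normative initial profile with the archive mean using a weight that grows with the amount of history:

```python
# Hypothetical sketch: blend an initial (normative) recovery profile with
# statistics computed from the user's archive, weighting the normative
# baseline less as more history accumulates. Weights are illustrative.
from statistics import mean

def dynamic_profile(initial_baseline, archive, full_weight_at=30):
    """Returns the expected 'normal' fatigue level for this user.

    initial_baseline: normative-group baseline (e.g., typical HRV, ms)
    archive: this user's historical fatigue levels
    full_weight_at: archive size at which the initial profile is phased out
    """
    if not archive:
        return initial_baseline
    w = min(len(archive) / full_weight_at, 1.0)   # 0 -> all normative
    return (1.0 - w) * initial_baseline + w * mean(archive)

print(dynamic_profile(55.0, []))               # no history: normative 55.0
print(dynamic_profile(55.0, [65.0] * 15))      # half-phased: 60.0
print(dynamic_profile(55.0, [65.0] * 40))      # fully user-specific: 65.0
```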
- In a further embodiment, the method includes providing a recovery status. The recovery status is based on the interpreted recovery score. The recovery status, in one instance, is one of the following: fatigued, recovered, and optimal. In another example, the method includes performing a comparison of the interpreted recovery score to the fatigue level, and the method includes tracking the comparison over time.
- The method, in one embodiment, includes providing an activity recommendation based on the interpreted recovery score. In one instance, the method includes receiving an external interpreted recovery score and comparing the external interpreted recovery score to the interpreted recovery score. In another example, the method includes comparing the interpreted recovery score to a past interpreted recovery score. In such an example, the interpreted recovery score is associated with a measuring period and the past interpreted recovery score is associated with a past measuring period.
- In various embodiments, at least one of the operations of detecting the fatigue level, creating and updating the dynamic recovery profile, and creating and updating the interpreted recovery score includes using a sensor configured to be attached to the body of a user. For example, one or more of these operations may be implemented by using a biometric sensor (e.g. heartrate sensor, motion sensor, etc.) mechanically coupled to a pair of earphones. In exemplary embodiments employing biometric earphones, the earphones are configured to be worn in a user's ears and may be further configured to communicate (e.g. transmit detected biometric data, receive audio signals, etc.) with another computing device.
- One embodiment of the disclosure includes a system for providing an interpreted recovery score. The system includes a processor and at least one computer program residing on the processor. The computer program is stored on a non-transitory computer readable medium having computer executable program code embodied thereon. The computer executable program code is configured to detect a fatigue level. In addition, the computer executable program code is configured to create and update a dynamic recovery profile based on an archive. The archive includes historical information about the fatigue level. The computer executable program code is further configured to create and update an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
- Other features and aspects of the disclosure will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the features in accordance with embodiments of the disclosure. The summary is not intended to limit the scope of the disclosure, which is defined solely by the claims attached hereto.
- The present disclosure, in accordance with one or more various embodiments, is described in detail with reference to the following figures. The figures are provided for purposes of illustration only and merely depict typical or example embodiments of the disclosure.
-
FIG. 1 illustrates a perspective view of an example communications environment in which embodiments of the disclosed technology may be implemented. -
FIG. 2A illustrates a perspective view of an example pair of biometric earphones that, in some embodiments, is the activity monitoring device used to implement the technology disclosed herein. -
FIG. 2B illustrates an example architecture for circuitry of the biometric earphones of FIG. 2A. -
FIG. 3A illustrates a perspective view of a particular embodiment of a biometric earphone, including an optical heartrate sensor that may be used to implement the technology disclosed herein. -
FIG. 3B illustrates a side perspective view of placement of the optical heartrate sensor of the earphones of FIG. 3A when they are worn by a user. -
FIG. 3C illustrates a frontal perspective view of placement of the optical heartrate sensor of the biometric earphone of FIG. 3A when worn by a user. -
FIG. 3D illustrates a cross-sectional view of an over-the-ear configuration of dual-fit earphones in accordance with the disclosed technology. -
FIG. 3E illustrates a cross-sectional view of an over-the-ear configuration of the dual-fit earphones of FIG. 3D. -
FIG. 3F illustrates a cross-sectional view of an under-the-ear configuration of the dual-fit earphones of FIG. 3D. -
FIG. 4 is a block diagram illustrating an example computing device that may be used to implement embodiments of the disclosed technology. -
FIG. 5 illustrates modules of an example activity monitoring application that may be used to implement embodiments of the disclosed technology. -
FIG. 6 is an operational flow diagram illustrating a method of prompting a user to adjust the placement of earphones in the user's ear to ensure accurate biometric data collection by the earphones' biometric sensors. -
FIG. 7 illustrates an example system for providing an interpreted recovery score. -
FIG. 8 illustrates an example apparatus for providing an interpreted recovery score. -
FIG. 9 illustrates another example apparatus for providing an interpreted recovery score. -
FIG. 10A is an operational flow diagram illustrating an example of a method for creating and updating an interpreted recovery score. -
FIG. 10B is an example of a metabolic loading table. -
FIG. 10C is an example of an activity intensity library. -
FIG. 10D is an example of an archive table. -
FIG. 11 is an operational flow diagram illustrating an example of a method for providing an interpreted recovery score including providing a recovery status. -
FIG. 12 is an operational flow diagram illustrating an example of a method for providing an interpreted recovery score including comparing the interpreted recovery score to an external interpreted recovery score. -
FIG. 13 illustrates an activity display that may be associated with an activity display module of the activity monitoring application of FIG. 5. -
FIG. 14 illustrates a sleep display that may be associated with a sleep display module of the activity monitoring application of FIG. 5. -
FIG. 15 illustrates an activity recommendation and fatigue level display that may be associated with an activity recommendation and fatigue level display module of the activity monitoring application of FIG. 5. -
FIG. 16 illustrates a biological data and intensity recommendation display that may be associated with a biological data and intensity recommendation display module of the activity monitoring application of FIG. 5. -
FIG. 17 illustrates an example computing module that may be used to implement various features of the systems and methods disclosed herein. - The figures are not intended to be exhaustive or to limit the disclosure to the precise form disclosed. It should be understood that the disclosure can be practiced with modification and alteration, and that the disclosure is limited only by the claims and the equivalents thereof.
- The present disclosure is directed toward systems and methods for providing an interpreted recovery score. The disclosure is directed toward various embodiments of such systems and methods. In one such embodiment, the systems and methods are directed to a device that provides an interpreted recovery score. According to some embodiments of the disclosure, the device may be a pair of earphones including one or more biometric sensors (e.g. heartrate sensor, motion sensor, etc.) mechanically coupled thereto. The earphones may be configured with electronic components and circuitry for processing detected user biometric data and providing user biometric data to another computing device (e.g. smartphone, laptop, desktop, tablet, etc.).
FIGS. 1-6 illustrate, by way of example, embodiments that utilize such biometric earphones. However, one of ordinary skill in the art will recognize that the systems and methods of the present disclosure may be implemented using various activity monitoring devices. Indeed, the figures are not intended to be exhaustive or to limit the disclosure to the precise form disclosed. -
FIG. 1 illustrates an example communications environment in accordance with an embodiment of the technology disclosed herein, the embodiment employing biometric earphones. In this embodiment, earphones 100 communicate biometric and audio data with computing device 200 over a communication link 300. The biometric data is measured by one or more sensors (e.g., heart rate sensor, accelerometer, gyroscope) of earphones 100. Although a smartphone is illustrated, computing device 200 may comprise any computing device (smartphone, tablet, laptop, smartwatch, desktop, etc.) configured to transmit audio data to earphones 100, receive biometric data from earphones 100 (e.g., heartrate and motion data), and process the biometric data collected by earphones 100. In additional embodiments, computing device 200 itself may collect additional biometric information that is provided for display. For example, if computing device 200 is a smartphone, it may use built-in accelerometers, gyroscopes, and a GPS to collect additional biometric data. -
Computing device 200 additionally includes a graphical user interface (GUI) to perform functions such as accepting user input and displaying processed biometric data to the user. The GUI may be provided by various operating systems known in the art, such as, for example, iOS, Android, Windows Mobile, Windows, Mac OS, Chrome OS, Linux, Unix, a gaming platform OS, etc. The biometric information displayed to the user can include, for example, a summary of the user's activities, a summary of the user's fitness levels, activity recommendations for the day, the user's heart rate and heart rate variability (HRV), and other activity-related information. User input that can be accepted on the GUI can include inputs for interacting with an activity tracking application further described below. - In embodiments, the
communication link 300 is a wireless communication link based on one or more wireless communication protocols such as BLUETOOTH, ZIGBEE, 802.11 protocols, Infrared (IR), Radio Frequency (RF), etc. Alternatively, the communications link 300 may be a wired link (e.g., using any one or a combination of an audio cable, a USB cable, etc.). - With specific reference now to
earphones 100, FIG. 2A is a diagram illustrating a perspective view of exemplary earphones 100. FIG. 2A will be described in conjunction with FIG. 2B, which is a diagram illustrating an example architecture for circuitry of earphones 100. Earphones 100 comprise a left earphone 110 with tip 116, a right earphone 120 with tip 126, a controller 130 and a cable 140. Cable 140 electrically couples the left earphone 110 to the right earphone 120, and both earphones 110-120 to controller 130. Additionally, each earphone may optionally include a fin or ear cushion 117 that contacts folds in the outer ear anatomy to further secure the earphone to the wearer's ear. - In embodiments,
earphones 100 may be constructed with different dimensions, including different diameters, widths, and thicknesses, in order to accommodate different human ear sizes and different preferences. In some embodiments of earphones 100, the housing of each earphone may be a rigid shell that encloses electronic components (e.g., motion sensor 121, optical heartrate sensor 122, audio-electronic components such as drivers 113 and 123 and speakers 114 and 124, and processors 160 and 165 with memories 170 and 175). The rigid shell may be made with plastic, metal, rubber, or other materials known in the art. The housing may be cubic shaped, prism shaped, tubular shaped, cylindrical shaped, or otherwise shaped to house the electronic components. - The
tips - In embodiments,
controller 130 may provide various controls (e.g., buttons and switches) related to audio playback, such as, for example, volume adjustment, track skipping, audio track pausing, and the like. Additionally, controller 130 may include various controls related to biometric data gathering, such as, for example, controls for enabling or disabling heart rate and motion detection. In a particular embodiment, controller 130 may be a three-button controller. - The circuitry of
earphones 100 includes processors 160 and 165, memories 170 and 175, a wireless transceiver 180, circuitry of earphone 110 and earphone 120, and a battery 190. In this embodiment, earphone 120 includes a motion sensor 121 (e.g., an accelerometer or gyroscope), an optical heartrate sensor 122, and a right speaker 124 and corresponding driver 123. Earphone 110 includes a left speaker 114 and corresponding driver 113. In additional embodiments, earphone 110 may also include a motion sensor (e.g., an accelerometer or gyroscope), and/or an optical heartrate sensor. - A
biometric processor 165 comprises logical circuits dedicated to receiving, processing and storing biometric information collected by the biometric sensors of the earphones. More particularly, as illustrated in FIG. 2B, processor 165 is electrically coupled to motion sensor 121 and optical heartrate sensor 122, and receives and processes electrical signals generated by these sensors. These processed electrical signals represent biometric information such as the earphone wearer's motion and heartrate. Processor 165 may store the processed signals as biometric data in memory 175, which may subsequently be made available to a computing device using wireless transceiver 180. In some embodiments, sufficient memory is provided to store biometric data for transmission to a computing device for further processing. - During operation,
optical heartrate sensor 122 uses a photoplethysmogram (PPG) to optically obtain the user's heart rate. In one embodiment, optical heartrate sensor 122 includes a pulse oximeter that detects blood oxygenation level changes as changes in coloration at the surface of a user's skin. More particularly, in this embodiment, the optical heartrate sensor 122 illuminates the skin of the user's ear with a light-emitting diode (LED). The light penetrates through the epidermal layers of the skin to underlying blood vessels. A portion of the light is absorbed and a portion is reflected back. The light reflected back through the skin of the user's ear is then obtained with a receiver (e.g., a photodiode) and used to determine changes in the user's blood oxygen saturation (SpO2) and pulse rate, thereby permitting calculation of the user's heart rate using algorithms known in the art (e.g., using processor 165). In this embodiment, the optical sensor may be positioned on one of the earphones such that it is proximal to the interior side of a user's tragus when the earphones are worn. - In various embodiments,
optical heartrate sensor 122 may also be used to estimate heart rate variability (HRV), i.e., the variation in the time interval between consecutive heartbeats, of the user of earphones 100. For example, processor 165 may calculate the HRV using the data collected by sensor 122 based on time domain methods, frequency domain methods, and other methods known in the art that calculate HRV based on data such as the mean heart rate, the change in pulse rate over a time interval, and other data used in the art to estimate HRV. - In further embodiments, logic circuits of
processor 165 may further detect, calculate, and store metrics such as the amount of physical activity, sleep, or rest over a period of time, or the amount of time without physical activity over a period of time. The logic circuits may use the HRV, the metrics, or some combination thereof to calculate a recovery score. In various embodiments, the recovery score may indicate the user's physical condition and aptitude for further physical activity for the current day. For example, the logic circuits may detect the amount of physical activity and the amount of sleep a user experienced over the last 48 hours, combine those metrics with the user's HRV, and calculate a recovery score. In various embodiments, the calculated recovery score may be based on any scale or range, such as, for example, a range between 1 and 10, a range between 1 and 100, or a range between 0% and 100%. Further, the logic circuits may use the HRV, the metrics, or some combination thereof to calculate an interpreted recovery score as described in more detail in connection with FIGS. 7-12. - During audio playback,
earphones 100 wirelessly receive audio data using wireless transceiver 180. The audio data is processed by logic circuits of audio processor 160 into electrical signals that are delivered to respective drivers 113 and 123 of left speaker 114 and right speaker 124 of earphones 100. - The wireless transceiver 180 is configured to communicate biometric and audio data using available wireless communications standards. For example, in some embodiments, the wireless transceiver 180 may be a BLUETOOTH transmitter, a ZIGBEE transmitter, a Wi-Fi transmitter, a GPS transmitter, a cellular transmitter, or some combination thereof. Although
FIG. 2B illustrates a single wireless transceiver 180 for both transmitting biometric data and receiving audio data, in an alternative embodiment, a transmitter dedicated to transmitting only biometric data to a computing device may be used. In this alternative embodiment, the transmitter may be a low energy transmitter such as a near field communications (NFC) transmitter or a BLUETOOTH low energy (LE) transmitter. In implementations of this particular embodiment, a separate wireless receiver may be provided for receiving high fidelity audio data from an audio source. In yet additional embodiments, a wired interface (e.g., micro-USB) may be used for communicating data stored in memories 170 and 175. -
FIG. 2B also shows that the electrical components of earphones 100 are powered by a battery 190 coupled to power circuitry 191. Any suitable battery or power supply technologies known in the art or later developed may be used. For example, a lithium-ion battery, aluminum-ion battery, piezo or vibration energy harvesters, photovoltaic cells, or other like devices can be used. In embodiments, battery 190 may be enclosed in earphone 110 or earphone 120. Alternatively, battery 190 may be enclosed in controller 130. In embodiments, the circuitry may be configured to enter a low-power or inactive mode when earphones 100 are not in use. For example, mechanisms such as, for example, an on/off switch, a BLUETOOTH transmission disabling button, or the like may be provided on controller 130 such that a user may manually control the on/off state of power-consuming components of earphones 100. - It should be noted that in various embodiments,
processors 160 and 165, memories 170 and 175, and battery 190 may be enclosed in and distributed throughout any one or more of earphone 110, earphone 120, and controller 130. For example, in one particular embodiment, processor 165 and memory 175 may be enclosed in earphone 120 along with optical heartrate sensor 122 and motion sensor 121. In this particular embodiment, these four components are electrically coupled to the same printed circuit board (PCB) enclosed in earphone 120. It should also be noted that although audio processor 160 and biometric processor 165 are illustrated in this exemplary embodiment as separate processors, in an alternative embodiment the functions of the two processors may be integrated into a single processor. -
FIG. 3A illustrates a perspective view of one embodiment of an earphone 120, including an optical heartrate sensor 122, in accordance with the technology disclosed herein. FIG. 3A will be described in conjunction with FIGS. 3B-3C, which are perspective views illustrating placement of heartrate sensor 122 when earphone 120 is worn in a user's ear 350. As illustrated, earphone 120 includes a body 125, tip 126, ear cushion 127, and an optical heartrate sensor 122. Optical heartrate sensor 122 protrudes from a frontal side of body 125, proximal to tip 126 and where the earphone's nozzle (not shown) is present. FIGS. 3B-3C illustrate the optical sensor and ear interface 340 when earphone 120 is worn in a user's ear 350. When earphone 120 is worn, optical heartrate sensor 122 is proximal to the interior side of a user's tragus 360. - In this embodiment,
optical heartrate sensor 122 illuminates the skin of the interior side of the ear's tragus 360 with a light-emitting diode (LED). The light penetrates through the epidermal layers of the skin to underlying blood vessels. A portion of the light is absorbed and a portion is reflected back. The light reflected back through the skin is then obtained with a receiver (e.g., a photodiode) of optical heartrate sensor 122 and used to determine changes in the user's blood flow, thereby permitting measurement of the user's heart rate and HRV. - In various embodiments,
earphones 100 may be dual-fit earphones shaped to be comfortably and securely worn in either an over-the-ear configuration or an under-the-ear configuration. The secure fit provided by such embodiments keeps the optical heartrate sensor 122 in place on the interior side of the ear's tragus 360, thereby ensuring accurate and consistent measurements of a user's heartrate. -
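To make the optical measurement chain concrete (PPG waveform to pulse peaks, peaks to heart rate, beat-to-beat intervals to a time-domain HRV statistic), the following is a hedged sketch. The peak-detection heuristic and all names are illustrative assumptions; the disclosure leaves the actual algorithms, run, e.g., on processor 165, to "algorithms known in the art."

```python
# Hypothetical sketch of the measurement chain: detect pulse peaks in a
# PPG signal, derive heart rate from peak spacing, then compute a
# time-domain HRV statistic (RMSSD) from the beat-to-beat intervals.
import math

def detect_peaks(ppg, fs):
    """Indices of pulse peaks; simple threshold + refractory heuristic."""
    mean = sum(ppg) / len(ppg)
    threshold = mean + 0.5 * (max(ppg) - mean)
    min_gap = int(0.3 * fs)               # refractory: caps rate ~200 BPM
    peaks, last = [], -min_gap
    for i in range(1, len(ppg) - 1):
        if (ppg[i] > threshold and ppg[i] >= ppg[i - 1]
                and ppg[i] > ppg[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    return peaks

def heart_rate_and_rmssd(ppg, fs):
    peaks = detect_peaks(ppg, fs)
    rr_ms = [(b - a) * 1000.0 / fs for a, b in zip(peaks, peaks[1:])]
    bpm = 60000.0 / (sum(rr_ms) / len(rr_ms))
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bpm, rmssd

# Synthetic 1.25 Hz (75 BPM) pulse train sampled at 50 Hz
fs = 50
ppg = [math.sin(2 * math.pi * 1.25 * n / fs) for n in range(10 * fs)]
bpm, rmssd = heart_rate_and_rmssd(ppg, fs)
print(round(bpm), round(rmssd, 1))   # ~75 BPM, ~0 ms for a perfect sine
```

A real implementation would band-pass filter the PPG signal and reject motion artifacts (e.g., using motion sensor 121) before peak detection. -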
FIGS. 3D and 3E are cross-sectional views illustrating one such embodiment of dual-fit earphones 600 being worn in an over-the-ear configuration. FIG. 3F illustrates dual-fit earphones 600 in an under-the-ear configuration. - As illustrated,
earphone 600 includes housing 610, tip 620, strain relief 630, and cord or cable 640. The proximal end of tip 620 mechanically couples to the distal end of housing 610. Similarly, the distal end of strain relief 630 mechanically couples to a side (e.g., the top side) of housing 610. Furthermore, the distal end of cord 640 is disposed within and secured by the proximal end of strain relief 630. The longitudinal axis of the housing, Hx, forms angle θ1 with respect to the longitudinal axis of the tip, Tx. The longitudinal axis of the strain relief, Sy, aligns with the proximal end of strain relief 630 and forms angle θ2 with respect to the axis Hx. In several embodiments, θ1 is greater than 0 degrees (e.g., Tx extends at a non-straight angle from Hx; in other words, the tip 620 is angled with respect to the housing 610). In some embodiments, θ1 is selected to approximate the ear canal angle of the wearer. For example, θ1 may range between 5 degrees and 15 degrees. Also, in several embodiments, θ2 is less than 90 degrees (e.g., Sy extends at a non-orthogonal angle from Hx; in other words, the strain relief 630 is angled with respect to a perpendicular orientation with housing 610). In some embodiments, θ2 may be selected to direct the distal end of cord 640 closer to the wearer's ear. For example, θ2 may range between 75 degrees and 85 degrees. - As illustrated,
tip 620 and the intersection of strain relief longitudinal axis Sy and housing longitudinal axis Hx. One of skill in the art would appreciate that the dimension x1 may be selected based on several parameters, including the desired fit to a wearer's ear based on the average human ear anatomical dimensions, the types and dimensions of electronic components (e.g., optical sensor, motion sensor, processor, memory, etc.) that must be disposed within the housing and the tip, and the specific placement of the optical sensor. In some examples, x1 may be at least 18 mm. However, in other examples, x1 may be smaller or greater based on the parameters discussed above. - Similarly, as illustrated, x2 represents the distance between the proximal end of
strain relief 630 and the surface of the wearer's ear. In the configuration illustrated, θ2 may be selected to reduce x2, as well as to direct the cord 640 towards the wearer's ear, such that cord 640 may rest in the crevice formed where the top of the wearer's ear meets the side of the wearer's head. In some embodiments, θ2 may range between 75 degrees and 85 degrees. In some examples, strain relief 630 may be made of a flexible material such as rubber, silicone, or soft plastic such that it may be further bent towards the wearer's ear. Similarly, strain relief 630 may comprise a shape memory material such that it may be bent inward and retain the shape. In some examples, strain relief 630 may be shaped to curve inward towards the wearer's ear. - In some embodiments, the proximal end of
tip 620 may flexibly couple to the distal end of housing 610, enabling a wearer to adjust θ1 to most closely accommodate the fit of tip 620 into the wearer's ear canal (e.g., by closely matching the ear canal angle). - As one having skill in the art would appreciate from the above description,
earphones 100 in various embodiments may gather biometric user data that may be used to track a user's activities and activity level. That data may then be made available to another computing device (e.g. smartphone, tablet), which may provide a GUI for interacting with the data using a software activity tracking application installed on the computing device. FIG. 4 is a block diagram illustrating example components of one such computing device 200 including an installed activity tracking application 210. - As illustrated in this example,
computing device 200 comprises a connectivity interface 201, storage 202 with activity tracking application 210, processor 204, a graphical user interface (GUI) 205 including display 206, and a bus 207 for transferring data between the various components of computing device 200. - Connectivity interface 201 connects
computing device 200 to earphones 100 through a communication medium. The medium may comprise a wireless network system such as a BLUETOOTH system, a ZIGBEE system, an Infrared (IR) system, a Radio Frequency (RF) system, a cellular network, a satellite network, a wireless local area network, or the like. The medium may additionally comprise a wired component such as a USB system. -
Storage 202 may comprise volatile memory (e.g. RAM), non-volatile memory (e.g. flash storage), or some combination thereof. In various embodiments, storage 202 may store biometric data collected by earphones 100. Additionally, storage 202 stores an activity tracking application 210 that, when executed by processor 204, allows a user to interact with the collected biometric information. - In various embodiments, a user may interact with
activity tracking application 210 via a GUI 205 including a display 206, such as, for example, a touchscreen display that accepts various hand gestures as inputs. In accordance with various embodiments, activity tracking application 210 may process the biometric information collected by earphones 100 and present it via display 206 of GUI 205. Before describing exemplary activity tracking application 210 in further detail, it is worth noting that in some embodiments earphones 100 may filter and/or process the collected biometric information prior to transmitting the biometric information to computing device 200. Accordingly, although the embodiments disclosed herein are described with reference to activity tracking application 210 processing the received biometric information, in various implementations various processing and/or preprocessing operations may be performed by a processor of earphones 100 (e.g., processor 160 or 165). - In various embodiments,
activity tracking application 210 may be initially configured/setup (e.g., after installation on a smartphone) based on a user's self-reported biological information, sleep information, and activity preference information. For example, during setup a user may be prompted via display 206 for biological information such as the user's gender, height, age, and weight. Further, during setup the user may be prompted for sleep information such as the amount of sleep needed by the user and the user's regular bed time. Further still, the user may be prompted during setup for a preferred activity level and activities the user desires to be tracked (e.g., running, walking, swimming, biking, etc.). In various embodiments, described below, this self-reported information may be used in tandem with the information collected by earphones 100 to display activity monitoring information using various modules. - Following setup,
activity tracking application 210 may be used by a user to monitor and define how active the user wants to be on a day-to-day basis based on the biometric information (e.g., accelerometer information, optical heart rate sensor information, etc.) collected by earphones 100. As illustrated in FIG. 5, activity tracking application 210 may comprise various display modules, including an activity display module 211, a sleep display module 212, an activity recommendation and fatigue level display module 213, and a biological data and intensity recommendation display module 214. Additionally, activity tracking application 210 may comprise various processing modules 215 for processing the activity monitoring information (e.g., optical heart rate information, accelerometer information, gyroscope information, etc.) collected by the earphones or the biological information entered by the user. These modules may be implemented separately or in combination. For example, in some embodiments activity processing modules 215 may be directly integrated with one or more of display modules 211-214. - As will be further described below, each of display modules 211-214 may be associated with a unique display provided by
activity tracking app 210 via display 206. That is, activity display module 211 may have an associated activity display, sleep display module 212 may have an associated sleep display, activity recommendation and fatigue level display module 213 may have an associated activity recommendation and fatigue level display, and biological data and intensity recommendation display module 214 may have an associated biological data and intensity recommendation display. - In embodiments,
application 210 may be used to display to the user an instruction for wearing and/or adjusting earphones 100 if it is determined that optical heartrate sensor 122 and/or motion sensor 121 are not accurately gathering motion data and heart rate data. FIG. 6 is an operational flow diagram illustrating one such method 400 of an earphone adjustment feedback loop with a user that ensures accurate biometric data collection by earphones 100. At operation 410, execution of application 210 may cause display 206 to display an instruction to the user on how to wear earphones 100 to obtain an accurate and reliable signal from the biometric sensors. In embodiments, operation 410 may occur once after installing application 210, once a day (e.g., when the user first wears earphones 100 for the day), or at any custom and/or predetermined interval. - At
operation 420, feedback is provided to the user regarding the quality of the signal received from the biometric sensors based on the particular position in which earphones 100 are being worn. For example, display 206 may display a signal quality bar or other graphical element. At decision 430, it is determined whether the biosensor signal quality is satisfactory for biometric data gathering and use of application 210. In various embodiments, this determination may be based on factors such as, for example, the frequency with which optical heartrate sensor 122 is collecting heart rate data, the variance in the measurements of optical heartrate sensor 122, dropouts in heart rate measurements by sensor 122, the signal-to-noise ratio approximation of optical heartrate sensor 122, the amplitude of the signals generated by the sensors, and the like. - If the signal quality is unsatisfactory, at
operation 440, application 210 may cause display 206 to display to the user advice on how to adjust the earphones to improve the signal, and operation 420 and decision 430 may subsequently be repeated. For example, advice on adjusting the strain relief of the earphones may be displayed. Otherwise, if the signal quality is satisfactory, at operation 450, application 210 may cause display 206 to display to the user confirmation of good signal quality and/or good earphone position. Subsequently, application 210 may proceed with normal operation (e.g., display modules 211-214). FIGS. 13-16 illustrate a particular exemplary implementation of a GUI for app 210 comprising displays associated with each of display modules 211-214. -
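The adjustment feedback loop of operations 410-450 can be sketched as follows. This is a minimal illustration, not the claimed implementation: the check_signal_quality() heuristic, its amplitude and variance thresholds, and the instruction strings are assumptions rather than values from the disclosure.

```python
# Hypothetical sketch of the FIG. 6 feedback loop (operations 410-450).
# Thresholds and messages are illustrative assumptions.

def check_signal_quality(samples, min_amplitude=0.1, max_variance=0.5):
    """Approximate decision 430: is the biosensor signal usable?"""
    if not samples:
        return False
    mean = sum(samples) / len(samples)
    variance = sum((s - mean) ** 2 for s in samples) / len(samples)
    amplitude = max(samples) - min(samples)
    return amplitude >= min_amplitude and variance <= max_variance

def adjustment_feedback_loop(read_sensor, show, max_attempts=5):
    """Operations 410-450: instruct, sample, check quality, advise until good."""
    show("Wear the earphones so the optical sensor rests against the tragus.")  # 410
    for _ in range(max_attempts):
        samples = read_sensor()            # 420: sample the biometric sensors
        if check_signal_quality(samples):  # 430: quality decision
            show("Good signal quality and earphone position.")  # 450
            return True
        show("Adjust the earphone strain relief and reseat the earbud.")  # 440
    return False
```

In practice the quality decision would draw on the factors listed above (measurement frequency, variance, dropouts, signal-to-noise ratio, amplitude); the sketch uses only amplitude and variance for brevity.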
FIG. 7 is a schematic block diagram illustrating an example of a system 700 for providing an interpreted recovery score. System 700 includes apparatus for providing an interpreted recovery score 702, communication medium 704, server 706, and computing device 708. Although apparatus for providing an interpreted recovery score 702 may in some embodiments be earphones 100 of FIGS. 2-3E, and although computing device 708 may in some embodiments be the same as computing device 200 in FIG. 4, embodiments of the present disclosure may also take other forms, as has been noted. The following detailed description for FIGS. 7-9 is presented in terms of apparatus 702 and computing device 708 to convey that the systems and methods disclosed herein may also be implemented using other various devices without departing from the scope of the technology disclosed herein. -
Communication medium 704 may be implemented in a variety of forms. For example, communication medium 704 may be an Internet connection, such as a local area network ("LAN"), a wide area network ("WAN"), a fiber optic network, internet over power lines, a hard-wired connection (e.g., a bus), and the like, or any other kind of network connection. Communication medium 704 may be implemented using any combination of routers, cables, modems, switches, fiber optics, wires, radio, and the like. Communication medium 704 may be implemented using various wireless standards, such as Bluetooth, Wi-Fi, 4G LTE, etc. One of skill in the art will recognize other ways to implement communication medium 704 for communications purposes. -
Server 706 directs communications made over communication medium 704. Server 706 may be, for example, an Internet server, a router, a desktop or laptop computer, a smartphone, a tablet, a processor, a module, or the like. In one embodiment, server 706 directs communications between communication medium 704 and computing device 708. For example, server 706 may update information stored on computing device 708, or server 706 may send information to computing device 708 in real time. -
Computing device 708 may take a variety of forms, such as a desktop or laptop computer, a smartphone, a tablet, a processor, a module, or the like. In addition, computing device 708 may be a processor or module embedded in a wearable sensor, a pair of earphones, a bracelet, a smart-watch, a piece of clothing, an accessory, and so on. For example, computing device 708 may be substantially similar to devices embedded in or otherwise coupled to earphones 100. Computing device 708 may communicate with other devices over communication medium 704 with or without the use of server 706. In one embodiment, computing device 708 includes apparatus 702. In various embodiments, apparatus 702 is used to perform various processes described herein. -
FIG. 8 is a schematic block diagram illustrating one embodiment of an apparatus for providing an interpreted recovery score 800. Apparatus 800 includes apparatus 702 with fatigue level module 804, dynamic recovery profile module 806, and interpreted recovery score module 808. - In one embodiment of
apparatus 800, a movement monitoring module (not shown) monitors a movement to create a metabolic activity score based on the movement and user information. The movement monitoring module will be described below in further detail with regard to various processes. -
Fatigue level module 804 detects a fatigue level. Fatigue level module 804 will be described below in further detail with regard to various processes. - Dynamic
recovery profile module 806 creates and updates a dynamic recovery profile based on an archive. The archive includes historical information about the fatigue level. In one embodiment, the archive includes historical information about the movement and the metabolic activity score. Dynamic recovery profile module 806 will be described below in further detail with regard to various processes. - Interpreted
recovery score module 808 creates and updates an interpreted recovery score based on the fatigue level and the dynamic recovery profile. Interpreted recovery score module 808 will be described below in further detail with regard to various processes. -
FIG. 9 is a schematic block diagram illustrating one embodiment of an apparatus for providing an interpreted recovery score 900. Apparatus 900 includes apparatus for providing an interpreted recovery score 702 with fatigue level module 804, dynamic recovery profile module 806, and interpreted recovery score module 808. Apparatus 900 also includes initial recovery profile module 902, recovery status module 904, and recovery recommendation module 906. Initial recovery profile module 902, recovery status module 904, and recovery recommendation module 906 will be described below in further detail with regard to various processes. In one embodiment, apparatus 900 also includes the movement monitoring module (not shown) described above with respect to FIG. 8. - In one embodiment, at least one of
fatigue level module 804, dynamic recovery profile module 806, interpreted recovery score module 808, initial recovery profile module 902, recovery status module 904, and recovery recommendation module 906 are embodied in a wearable sensor, such as biometric earphones 100. In various embodiments, any of the modules described herein are embodied in biometric earphones 100 and connect to other modules described herein via communication medium 704. In other cases, one or more of the modules are embodied in various other forms of hardware, such as the hardware of computing device 708 or computing device 200. -
FIG. 10A is an operational flow diagram illustrating example method 1000 for providing an interpreted recovery score in accordance with an embodiment of the present disclosure. The operations of method 1000 create and update an interpreted recovery score based on a user's personalized fatigue levels, as recorded over time. In various embodiments, the fatigue level is based on a measured heart rate variability for the user and is a function of recovery. Moreover, the operations of method 1000 take into account not only the user's current fatigue level, but also the relationship between current and past fatigue levels to create an interpreted recovery score that accurately reflects the user's physical condition and performance capabilities. This aids in providing a personalized metric by which the user can attain peak performance. In one embodiment, apparatus 702, biometric earphones 100, and/or computing device 708 performs method 1000. - In one embodiment, movement is monitored to create a metabolic activity score based on the movement and user information. The metabolic activity score, in one embodiment, is created from a set of metabolic loadings. The metabolic loadings may be determined by identifying a user activity type from a set of reference activity types and by identifying a user activity intensity from a set of reference activity intensities. In addition, the metabolic loadings may be determined based on information provided by a user (user information).
- User information may include, for example, an individual's height, weight, age, gender, and geographic and environmental conditions. The user may provide the user information by, for example, a user interface of
computing device 708, or a controller 130 of biometric earphones 100. User information may be determined based on various measurements, for example, measurements of the user's body-fat content or body type. In addition, the user information may be determined, for example, by an altimeter or GPS, which may be used to determine the user's elevation, weather conditions in the user's environment, etc. In one embodiment, apparatus 702 obtains user information from the user indirectly. For example, apparatus 702 may collect the user information from a social media account, from a digital profile, or the like. In another embodiment, computing device 708 obtains user information from the user indirectly. For example, computing device 708 may collect the user information from a social media account, from a digital profile, or the like. - The user information, in one embodiment, includes a user lifestyle selected from a set of reference lifestyles. For example,
apparatus 702 may prompt the user for information about the user's lifestyle (e.g., via a user interface or controller). Apparatus 702 may prompt the user to determine how active the user's lifestyle is. Additionally, the user may be prompted to select a user lifestyle from a set of reference lifestyles. The reference lifestyles may include a range of lifestyles, for example, ranging from inactive, on one end, to highly active on the other end. In such a case, the reference lifestyles that the user selects from may include sedentary, mildly active, moderately active, and heavily active. - In one instance, the user lifestyle is determined from the user as an initial matter. For example, upon initiation,
apparatus 702 may prompt the user to provide a user lifestyle. In a further embodiment, the user is prompted periodically to select a user lifestyle. In this fashion, the user lifestyle selected may be aligned with the user's actual activity level as the user's activity level varies over time. In another embodiment, the user lifestyle is updated without intervention from the user. - The metabolic loadings, in one embodiment, are numerical values and may represent a rate of calories burned per unit weight per unit time (e.g., having units of kcal per kilogram per hour). By way of example, the metabolic loadings may be represented in units of oxygen uptake (e.g., in milliliters per kilogram per minute). The metabolic loadings may also represent a ratio of the metabolic rate during activity (e.g., the metabolic rate associated with a particular activity type and/or an activity intensity) to the metabolic rate during rest. The metabolic loadings may, for example, be represented in a metabolic table, such as metabolic table 1050, illustrated in
FIG. 10B. In one embodiment, the metabolic loadings are specific to the user information. For example, a metabolic loading may increase for a heavier user, or for an increased elevation, but may decrease for a lighter user or for a decreased elevation. - In one embodiment, the set of metabolic loadings is determined based on the user lifestyle, in addition to the other user information. For example, the metabolic loadings for a user with a heavily active lifestyle may differ from the metabolic loadings for a user with a sedentary lifestyle. In this fashion, there may be a greater coupling between the metabolic loadings and the user's characteristics.
- In various embodiments, a device (e.g., computing device 708) or a module (e.g.,
biometric earphones 100 or a module therein) stores or provides the metabolic loadings. The metabolic loadings may be maintained or provided by server 706 or over communication medium 704. In one embodiment, a system administrator provides the metabolic loadings based on a survey, publicly available data, scientifically determined data, compiled user data, or any other source of data. In some instances, a movement monitoring module performs the above-described operations. In various embodiments, the movement monitoring module includes a metabolic loading module and a metabolic table module that determine the metabolic loading associated with the movement. - In one embodiment, a metabolic table is maintained based on the user information. The metabolic loadings in the metabolic table may be based on the user information. In some cases, the metabolic table is maintained based on a set of standard user information, in place of or in addition to user information from the user. The standard user information may include, for example, the average fitness characteristics of all individuals of the same age as the user, the same height as the user, etc. In another embodiment, instead of maintaining the metabolic table based on standard information, if the user has not provided user information, maintaining the metabolic table is delayed until the user information is obtained.
- As illustrated in
FIG. 10B, in one embodiment, the metabolic table is maintained as metabolic table 1050. Metabolic table 1050 may be stored in computing device 708 (e.g., computing device 200) or apparatus 702 (e.g., biometric earphones 100), and may include information such as reference activity types (RATs) 1054, reference activity intensities (RAIs) 1052, and/or metabolic loadings (MLs) 1060. As illustrated in FIG. 10B, in one embodiment, RATs 1054 are arranged as rows 1058 in metabolic table 1050. Each of a set of rows 1058 corresponds to different RATs 1054, and each row 1058 is designated by a row index number. For example, the first RAT row 1058 may be indexed as RAT_0, the second as RAT_1, and so on for as many rows as metabolic table 1050 may include. - The reference activity types may include typical activities, such as running, walking, sleeping, swimming, bicycling, skiing, surfing, resting, working, and so on. The reference activity types may also include a catch-all category, for example, general exercise. The reference activity types may also include atypical activities, such as skydiving, SCUBA diving, and gymnastics. In one embodiment, a user defines a user-defined activity by programming computing device 708 (e.g., by an interface on
computing device 708, such as GUI 205 in the example of computing device 200) with information about the user-defined activity, such as pattern of movement, frequency of pattern, and intensity of movement. The typical reference activities may be provided, for example, by metabolic table 1050. - In one embodiment,
reference activity intensities 1052 are arranged as columns in metabolic table 1050, and metabolic table 1050 includes columns 1056, each corresponding to different RAIs 1052. Each column 1056 is designated by a different column index number. For example, the first RAI column 1056 is indexed as RAI_0, the second as RAI_1, and so on for as many columns as metabolic table 1050 may include. - The reference activity intensities include, in one embodiment, a numeric scale. For example, the reference activity intensities may include numbers ranging from one to ten (representing increasing activity intensity). The reference activities may also be represented as a range of letters, colors, and the like. The reference activity intensities may be associated with the vigorousness of an activity. For example, the reference activity intensities may be represented by ranges of heart rates or breathing rates.
- In one embodiment, metabolic table 1050 includes
metabolic loadings 1060. Each metabolic loading 1060 corresponds to a reference activity type 1058 of the reference activity types 1054 and a reference activity intensity 1056 of the reference activity intensities 1052. Each metabolic loading 1060 corresponds to a unique combination of reference activity type 1054 and reference activity intensity 1052. For example, in the column and row arrangement discussed above, one of the reference activity types 1054 of a series of rows 1058 of reference activity types, and one of the reference activity intensities 1052 of a series of columns 1056 of reference activity intensities, correspond to a particular metabolic loading 1060. In such an arrangement, each metabolic loading 1060 may be identifiable by only one combination of reference activity type 1058 and reference activity intensity 1056. - This concept is illustrated in
FIG. 10B. As shown, each metabolic loading 1060 is designated using a two-dimensional index, with the first index dimension corresponding to the row 1058 number and the second index dimension corresponding to the column 1056 number of the metabolic loading 1060. For example, in FIG. 10B, ML_2,3 has a first dimension index of 2 and a second dimension index of 3. ML_2,3 corresponds to the row 1058 for RAT_2 and the column 1056 for RAI_3. Any combination of RAT_M and RAI_N may identify a corresponding ML_M,N in metabolic table 1050, where M is any number corresponding to a row 1058 number in metabolic table 1050 and N is any number corresponding to a column 1056 number in metabolic table 1050. By way of example, the reference activity type RAT_3 may be "surfing," and the reference activity intensity RAI_3 may be "4." This combination in metabolic table 1050 corresponds to metabolic loading 1060 ML_3,3, which may, for example, represent 5.0 kcal/kg/hour (a typical value for surfing). In various embodiments, some of the above-described operations are performed by the movement monitoring module and some of the operations are performed by the metabolic table module. - Referring again to
method 1000, in various embodiments, the movement is monitored by location tracking (e.g., the Global Positioning System (GPS), or a location-tracking device connected to a network via communication medium 704). The general location of the user, as well as specific movements of the user's body, are monitored. For example, the movement of the user's leg in x, y, and z directions may be monitored (e.g., by an accelerometer or gyroscope). In one embodiment, apparatus 702 receives an instruction regarding which body part is being monitored. For example, apparatus 702 may receive an instruction that the movement of a user's wrist, ankle, head, or torso is being monitored. - In various embodiments, the movement of the user is monitored and a pattern of the movement (pattern) is determined. For example, the pattern may be detected by an accelerometer or gyroscope. The pattern may be a repetition of a motion or a similar motion monitored by the
method 1000; for example, the pattern may be a geometric shape (e.g., a circle, line, or oval) of repeated movement that is monitored. In some cases, the repetition of a motion in a geometric shape is not repeated consistently over time, but is maintained for a substantial proportion of the repetitions of movement. For instance, one occurrence of elliptical motion in a repetitive occurrence (or pattern) of ten circular motions may be monitored and determined to be a pattern of circular motion. - In further embodiments, the geometric shape of the pattern of movement is a three-dimensional (3D) shape. To illustrate, the pattern associated with the head of a user rowing a canoe, or a wrist of a person swimming the butterfly stroke, may be monitored and analyzed into a geometric shape in three dimensions. The pattern may be complicated, but it may be described in a form that can be recognized by
method 1000. Such a form may include computer code that describes the spatial relationship of a set of points, along with changes in acceleration forces that are experienced along those points as, for example, a sensor travels throughout the pattern. - In various embodiments, monitoring the pattern includes monitoring the frequency with which the pattern is repeated (or pattern frequency). The pattern frequency may be derived from a repetition period of the pattern (or pattern repetition period). The pattern repetition period may be the length of time elapsing from when a device or sensor passes through a certain point in a pattern and when the device or sensor returns to that point when the pattern is repeated. For example, the sensor may be at point x, y, z at time t_0. The device may then move along the trajectory of the pattern, eventually returning to point x, y, z at time t_1. The pattern repetition period would be the difference between t_1 and t_0 (e.g., measured in seconds). The pattern frequency may be the reciprocal of the pattern repetition period, and may have units of cycles per second. When the pattern repetition period is, for example, two seconds, the pattern frequency would be 0.5 cycles per second.
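The relationship between the pattern repetition period and the pattern frequency described above can be expressed directly. This is an illustrative sketch; the function names are chosen here, not taken from the disclosure.

```python
# Pattern repetition period: time for the sensor to return to the same
# point x, y, z in the pattern. Pattern frequency: its reciprocal.

def pattern_repetition_period(t_0, t_1):
    """Seconds elapsed between successive passes through the same point."""
    return t_1 - t_0

def pattern_frequency(period_seconds):
    """Cycles per second, the reciprocal of the repetition period."""
    return 1.0 / period_seconds
```

Using the document's example, a sensor returning to the same point after two seconds yields a pattern frequency of 0.5 cycles per second.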
- In some embodiments, various other inputs are used to determine the activity type and activity intensity. For example, monitoring the movement may include monitoring the velocity at which the user is moving (or the user velocity). The user velocity may, for example, have units of kilometers per hour. In one embodiment, the user's location information is monitored to determine user velocity. This may be done by GPS, through
communication medium 704, and so on. The user velocity may be distinguished from the speed of the pattern (or pattern speed). For example, the user may be running at a user velocity of 10 km/hour, but the pattern speed of the user's wrist may be 20 km/hour at a given point (e.g., as the wrist moves from behind the user to in front of the user). The pattern speed may be monitored using, for example, an accelerometer or gyroscope. In another example, the user velocity may also be distinguished from the pattern speed of a user's head, albeit a subtle distinction, as the user's head rocks slightly forward and backward when running. - In one embodiment, the user's altitude is monitored. This may be done, for example, using an altimeter, user location information, information entered by the user, etc. In another embodiment, the impact the user has with an object (e.g., the impact of the user's feet with the ground) is monitored. This may be done using an accelerometer or gyroscope. In some cases, the ambient temperature is measured (e.g., by apparatus 702).
Apparatus 702 may associate a group of reference activity types with bands of ambient temperature. For example, when the ambient temperature is zero degrees Celsius, activities such as skiing, sledding, and ice climbing are appropriate selections for reference activity types, whereas surfing, swimming, and beach volleyball may be inappropriate. The ambient humidity may also be measured (e.g., by a hygrometer). In some cases, pattern duration (i.e., the length of time for which a particular movement pattern is sustained) is measured. - In one embodiment, monitoring the movement is accomplished using sensors configured to be attached to a user's body (e.g., earphones 100). Such sensors may include a gyroscope or accelerometer to detect movement, and a heart-rate sensor, each of which may be embedded in a pair of earphones that a user can wear on the user's head, such as
biometric earphones 100. Additionally, various modules and sensors that may be used to perform the above-described operations may be embedded in biometric earphones 100. In other embodiments, any one or more of the described sensors may be embedded in computing device 200. Additionally, various modules and sensors that may be used to perform the above-described operations may be embedded in computing device 200. In other embodiments, the data from sensors and/or modules embedded in earphones 100 and the data from sensors and/or modules embedded in computing device 200 are used in combination to implement the above-described operations and computations. In various embodiments, the above-described operations are performed by the movement monitoring module. -
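The metabolic table lookup described earlier in connection with FIG. 10B can be sketched as a two-key mapping from (reference activity type, reference activity intensity) to metabolic loading. The 5.0 kcal/kg/hour entry for surfing at intensity 4 comes from the text; the other values and the dictionary layout are hypothetical placeholders.

```python
# Hypothetical rendering of metabolic table 1050: metabolic loadings (MLs)
# keyed by reference activity type (RAT row) and reference activity
# intensity (RAI column), in kcal/kg/hour. Values other than surfing
# at intensity 4 are placeholders, not figures from the disclosure.

metabolic_table = {
    ("running", 5): 8.0,   # placeholder
    ("walking", 2): 2.8,   # placeholder
    ("surfing", 4): 5.0,   # ML_3,3 in the text: a typical value for surfing
}

def metabolic_loading(activity_type, activity_intensity):
    """Look up ML_M,N for a unique (RAT, RAI) combination."""
    return metabolic_table[(activity_type, activity_intensity)]
```

Each loading is identifiable by exactly one (RAT, RAI) pair, mirroring the two-dimensional ML_M,N indexing of FIG. 10B.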
Method 1000, in one embodiment, involves determining the user activity type from the set of reference activity types. Once detected, the pattern may be used to determine the user activity type from a set of reference activity types. Each reference activity type is associated with a reference activity type pattern. The user activity type may be determined to be the reference activity type that has a reference activity type pattern that matches the pattern measured by method 1000. - In some cases, the pattern that matches the reference activity type pattern will not be an exact match, but will be substantially similar. In other cases, the patterns will not even be substantially similar, but it may be determined that the patterns match because they are the most similar of any patterns available. For example, the reference activity type may be determined such that the difference between the pattern of movement corresponding to this reference activity type and the pattern of movement is less than a predetermined range or ratio. In one embodiment, the pattern is looked up (for a match) in a reference activity type library. The reference activity type library may be included in the metabolic table. For example, the reference activity type library may include rows in a table such as the
RAT rows 1058. - In further embodiments,
method 1000 involves using the pattern frequency to determine the user activity type from the set of reference activity types. Several reference activity types, however, may be associated with similar patterns (e.g., because the head moves in a similar pattern when running versus walking). In such cases, the pattern frequency is used to determine the activity type (e.g., because the pattern frequency for running is higher than the pattern frequency for walking). -
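The two-stage determination described above — match the pattern shape first, then disambiguate by pattern frequency — can be sketched as follows. The reference library, shape labels, and the 2.0 cycles/second running/walking split are assumptions for illustration, not values from the disclosure.

```python
# Hypothetical reference activity type library: each entry pairs a pattern
# shape with the minimum pattern frequency (cycles/s) for that activity.
# Shapes and frequency bands are illustrative assumptions.

REFERENCE_ACTIVITIES = [
    ("running",  "vertical-oval", 2.0),
    ("walking",  "vertical-oval", 0.0),
    ("swimming", "circle",        0.0),
]

def classify_activity(pattern_shape, pattern_frequency):
    """Match the pattern shape, then use pattern frequency to split
    activities with similar patterns (e.g., running vs. walking)."""
    candidates = [a for a in REFERENCE_ACTIVITIES if a[1] == pattern_shape]
    # Among shape matches, choose the highest frequency band that qualifies.
    best = max((a for a in candidates if pattern_frequency >= a[2]),
               key=lambda a: a[2])
    return best[0]
```

A head pattern at 2.6 cycles/second would classify as running, while the same shape at 1.0 cycles/second would classify as walking, reflecting the higher pattern frequency of running noted above.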
Method 1000, in some instances, involves using additional information to determine the activity type of the user. For example, the pattern for walking may be similar to the pattern for running. The reference activity of running may be associated with higher user velocities and the reference activity of walking with lower user velocities. In this way, the velocity measured may be used to distinguish two reference activity types having similar patterns. - In other embodiments,
method 1000 involves monitoring the impact the user has with the ground and determining that, because the impact is larger, the activity type, for example, is running rather than walking. If there is no impact, the activity type may be determined to be cycling (or other activity where there is no impact). In some cases, the humidity is measured to determine whether the activity is a water sport (i.e., whether the activity is being performed in the water). The reference activity types may be narrowed to those that are performed in the water, from which narrowed set of reference activity types the user activity type may be determined. In other cases, the temperature measured is used to determine the activity type. -
Method 1000 may entail instructing the user to confirm the user activity type. In one embodiment, a user interface is provided such that the user can confirm whether a displayed user activity type is correct, or select the user activity type from a group of activity types. - In further embodiments, a statistical likelihood for each of the choices for user activity type is determined. The possible user activity types are then provided to the user in such a sequence that the most likely user activity type is listed first (and then in descending order of likelihood). For example, it may be determined that, based on the pattern, the pattern frequency, the temperature, and so on, there is an 80% chance the user activity type is running, a 15% chance the user activity type is walking, and a 5% chance the user activity is dancing. Via a user interface, a list of these possible user activities may be provided such that the user may select the activity type the user is performing. In various embodiments, some of the above-described operations are performed by the metabolic loading module.
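The likelihood-ordered presentation in the 80/15/5 example above amounts to sorting candidate activity types by probability. A minimal sketch (function name chosen here for illustration):

```python
# Order candidate activity types from most to least likely so the most
# probable choice is listed first for the user to confirm.

def rank_activity_choices(likelihoods):
    """likelihoods: mapping of activity name -> probability."""
    return [name for name, p in
            sorted(likelihoods.items(), key=lambda kv: kv[1], reverse=True)]
```

For the example given, {"running": 0.80, "walking": 0.15, "dancing": 0.05} produces the list running, walking, dancing.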
-
Method 1000, in some embodiments, also includes determining the user activity intensity from a set of reference activity intensities. The user activity intensity may be determined in a variety of ways. For example, the repetition period (or pattern frequency) and user activity type (UAT) may be associated with a reference activity intensity library to determine the user activity intensity that corresponds to a reference activity intensity. FIG. 10C illustrates one embodiment whereby this aspect of method 1000 is accomplished, including reference activity intensity library 1080. Reference activity intensity library 1080 is organized by rows 1088 of reference activity types 1084 and columns 1086 of pattern frequencies 1082. In FIG. 10C, reference activity library 1080 is implemented in a table. Reference activity library 1080 may, however, be implemented in other ways. - In one embodiment, it is determined that, for
user activity type 1084 UAT_0 performed at pattern frequency 1082 F_0, the reference activity intensity 1090 is RAI_0,0. For example, UAT 1084 may correspond to the reference activity type for running, and a pattern frequency 1082 of 0.5 cycles per second may be determined for the user activity type. Reference activity intensity library 1080 may indicate that the UAT 1084 of running at a pattern frequency 1082 of 0.5 cycles per second corresponds to an RAI 1090 of five on a scale of ten. In another embodiment, the reference activity intensity 1090 is independent of the activity type. For example, the repetition period may be five seconds, and this may correspond to an intensity level of two on a scale of ten. - Reference
activity intensity library 1080, in one embodiment, is included in metabolic table 1050. In some cases, the measured repetition period (or pattern frequency) does not correspond exactly to a repetition period for a reference activity intensity in metabolic table 1050. In such cases, the correspondence may be a best-match fit, or may be a fit within a tolerance. Such a tolerance may be defined by the user or by a system administrator, for example. - In various embodiments,
method 1000 involves supplementing the measurement of pattern frequency to help determine the user activity intensity from the reference activity intensities. For example, if the user activity type is skiing, it may be difficult to determine the user activity intensity because the pattern frequency may be erratic or otherwise immeasurable. In such an example, the user velocity, the user's heart rate, and other indicators (e.g., breathing rate) may be monitored to determine how hard the user is working during the activity. For example, higher heart rate may indicate higher user activity intensity. In a further embodiment, the reference activity intensity is associated with a pattern speed (i.e., the speed or velocity at which the sensor is progressing through the pattern). A higher pattern speed may correspond to a higher user activity intensity. -
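The library lookup and the best-match fit within a tolerance described above might be sketched as follows. The table contents, the key layout, the tolerance value, and the helper name are assumptions for illustration.

```python
# Hypothetical sketch of a reference activity intensity library keyed by
# (activity type, pattern frequency). Table contents and tolerance value
# are illustrative assumptions.

REFERENCE_INTENSITIES = {
    ("running", 0.5): 5,   # running at 0.5 cycles/sec -> intensity 5 of 10
    ("running", 1.0): 8,
    ("walking", 0.5): 2,
}

def lookup_intensity(activity_type, measured_frequency, tolerance=0.1):
    """Best-match fit: pick the stored frequency closest to the measured
    one for this activity type, accepted only if within the tolerance."""
    candidates = [(abs(freq - measured_frequency), rai)
                  for (uat, freq), rai in REFERENCE_INTENSITIES.items()
                  if uat == activity_type]
    if not candidates:
        return None
    delta, rai = min(candidates)
    return rai if delta <= tolerance else None

print(lookup_intensity("running", 0.55))  # 5: nearest entry is 0.5 cycles/sec
print(lookup_intensity("running", 2.0))   # None: no entry within tolerance
```

The tolerance here stands in for the user-defined or administrator-defined tolerance mentioned above.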
Method 1000, in one embodiment, determines the user activity type and the user activity intensity by using sensors configured to be attached to the user's body. Such sensors may include, for example, a gyroscope or accelerometer to detect movement, and a heart-rate sensor, each of which may be mechanically coupled to a pair of earphones that a user can wear in the user's ears, such as earphones 100. Additionally, various sensors and modules that may be used to perform above-described operations of method 1000 may be embedded in or otherwise coupled to biometric earphones 100 or other hardware (e.g., hardware of computing device 200). In various embodiments, the above-described operations are performed by the movement monitoring module. - Referring again to
FIG. 10A, method 1000 includes creating and updating a metabolic activity score based on the movement and the user information. Method 1000 may also include determining a metabolic loading associated with the user and the movement. In one embodiment, a duration of the activity type at a particular activity intensity (e.g., in seconds, minutes, or hours) is determined. The metabolic activity score may be created and updated by, for example, multiplying the metabolic loading by the duration of the user activity type at a particular user activity intensity. If the user activity intensity changes, the new metabolic loading (associated with the new user activity intensity) may be multiplied by the duration of the user activity type at the new user activity intensity. In one embodiment, the activity score is represented as a numerical value. By way of example, the metabolic activity score may be updated by continually supplementing the metabolic activity score as new activities are undertaken by the user. In this way, the metabolic activity score continually increases as the user participates in more and more activities. - In one embodiment, the metabolic activity score is based on score periods. Monitoring the movement may include determining, during a score period, the metabolic loading associated with the movement. Score periods may include segments of time. The user activity type, user activity intensity, and the corresponding metabolic loading, in one embodiment, are measured (or determined) during each score period, and the metabolic activity score may be calculated for that score period. As the movement changes over time, the varying characteristics of the movement are captured by the score periods.
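The score arithmetic described above (metabolic loading multiplied by duration, accumulated across activities) might be sketched as follows. Treating the loading as a per-second rate is an assumption for illustration.

```python
# Minimal sketch, assuming metabolic loading is a per-second rate: each
# activity segment contributes loading x duration, and the score keeps
# accumulating across segments as new activities are undertaken.

def update_metabolic_activity_score(score, loading, duration_seconds):
    """Add one activity segment's contribution to the running score."""
    return score + loading * duration_seconds

score = 0.0
score = update_metabolic_activity_score(score, loading=2.0, duration_seconds=600)
score = update_metabolic_activity_score(score, loading=3.5, duration_seconds=300)
print(score)  # 2250.0 (1200 from the first segment, 1050 from the second)
```

When the user activity intensity changes, the caller simply starts a new segment with the new loading, as the text describes.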
-
Method 1000 includes, in one embodiment, creating and updating a set of period activity scores. Each period activity score is based on the movement monitored during a set of score periods, and each period activity score is associated with a particular score period of the set of score periods. In one example, the metabolic activity score is created and updated as an aggregate of period activity scores, and the metabolic activity score may represent a running total of the period activity scores. - In one embodiment,
method 1000 includes applying a score period multiplier to the score period to create an adjusted period activity score. The metabolic activity score in such an example is an aggregation of adjusted period activity scores. Score period multipliers may be associated with certain score periods, such that the certain score periods contribute more or less to the metabolic activity score than other score periods during which the same movement is monitored. For example, if the user is performing a sustained activity, a score period multiplier may be applied to the score periods that occur during the sustained activity. By contrast, a multiplier may not be applied to score periods that are part of intermittent, rather than sustained, activity. As a result of the score period multiplier, the user's sustained activity may contribute more to the metabolic activity score than the user's intermittent activity. The score period multiplier may allow consideration of the increased demand of sustained, continuous activity relative to intermittent activity. - The score period multiplier, in one instance, is directly proportional to the number of continuous score periods over which a type and intensity of the movement is maintained. The adjusted period activity score may be greater than or less than the period activity score, depending on the score period multiplier. For example, for intermittent activity, the score period multiplier may be less than 1.0, whereas for continuous, sustained activity, the score period multiplier may be greater than 1.0.
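A minimal sketch of the score period multiplier follows, assuming a linear schedule; the base of 0.75, the step of 0.25, and the cap of 2.0 are hypothetical values.

```python
# Hypothetical sketch: a score period multiplier proportional to how many
# consecutive score periods the same movement has been maintained, capped.
# The base, step, and cap values are assumptions for illustration.

def period_multiplier(consecutive_periods, step=0.25, cap=2.0):
    """Intermittent activity (a run of 0) scores below 1.0; sustained
    activity earns a multiplier above 1.0, up to the cap."""
    return min(0.75 + step * consecutive_periods, cap)

def adjusted_period_score(period_score, consecutive_periods):
    return period_score * period_multiplier(consecutive_periods)

print(adjusted_period_score(100, 0))  # 75.0: intermittent activity
print(adjusted_period_score(100, 3))  # 150.0: sustained activity
```

As in the text, the same movement contributes more when it is part of a sustained run of score periods than when it is intermittent.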
- In one embodiment,
method 1000 entails decreasing the metabolic activity score when the user consumes calories. For example, if the user goes running and generates a metabolic activity score of 1,000 as a result, but then the user consumes calories, the metabolic activity score may be decreased by 200 points, or any number of points. The decrease in the number of points may be proportional to the number of calories consumed. In other embodiments, information about specific aspects of the user's diet is obtained, and metabolic activity score points are awarded for healthy eating (e.g., fiber) and subtracted for unhealthy eating (e.g., excessive fat consumption). - The user, in one embodiment, is pushed to work harder, or not as hard, depending on the user lifestyle. This may be done, for example, by adjusting the metabolic loadings based on the user lifestyle. To illustrate, a user with a highly active lifestyle may be associated with metabolic loadings that result in a lower metabolic activity score when compared to a user with a less active lifestyle performing the same movements. This results in requiring the more active user to, for example, work (or perform movement) at a higher activity intensity or for a longer duration to achieve the same metabolic activity score as the less active user participating in the same activity type (or movements).
- In one embodiment, the metabolic activity score is reset every twenty-four hours. The metabolic activity score may be continually incremented and decremented throughout a measuring period, but may be reset to a value (e.g., zero) at the end of twenty-four hours. The metabolic activity score may be reset after any given length of time (or measuring period)—for example, the activity score may be continually updated over the period of one week, or one month.
- In one embodiment, because the metabolic activity score was greater than a certain amount for the measuring period, the metabolic activity score is reset to a number greater than zero. As such, the user effectively receives a credit for a particularly active day, allowing the user to be less active the next day without receiving a lower metabolic activity score for the next day. In a further embodiment, because the metabolic activity score was less than a predetermined value for the measuring period, the metabolic activity score is reset to a value less than zero. The user effectively receives a penalty for that day, and would have to make up for a particularly inactive or overly consumptive day by increasing the user's activity levels the next day. In various embodiments, creating and updating the metabolic activity score is performed by a movement monitoring module or by a metabolic activity score module.
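The reset-with-carryover behavior described above might be sketched as follows; the high/low thresholds and the 10% carry factor are assumed values.

```python
# Hypothetical sketch of the daily reset with credit/penalty carryover.
# The thresholds (2000, 500) and the 10% carry factor are assumptions.

def reset_score(final_score, high=2000, low=500, carry=0.1):
    """Reset to zero, except that a very active day leaves a positive
    starting balance and a very inactive day a negative one."""
    if final_score > high:
        return (final_score - high) * carry   # credit for an active day
    if final_score < low:
        return (final_score - low) * carry    # negative: penalty to make up
    return 0.0

print(reset_score(2500))  # 50.0 credit carried into the next day
print(reset_score(300))   # -20.0 penalty for an inactive day
print(reset_score(1000))  # 0.0 for a typical day
```

The same logic applies whether the measuring period is a day, a week, or a month.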
- Referring again to
FIG. 10A, operation 1006 involves detecting a fatigue level. In one embodiment, the fatigue level is the fatigue level of the user. The fatigue level, in one embodiment, is a function of recovery. The fatigue level may be detected in various ways. In one example, the fatigue level is detected by using heartrate measurements detected by earphones 100 to estimate a heart rate variability (HRV) of a user by using logic circuits of processor 165 (discussed above in reference to FIGS. 2B-3F) and based at least in part on the recovery measured. Further, representations of fatigue level are described above (e.g., numerical, descriptive, etc.). When the HRV is more consistent (i.e., a steady, consistent amount of time between heartbeats), for example, the fatigue level may be higher. In other words, the body is less fresh and well-rested. When HRV is more sporadic (i.e., the amount of time between heartbeats varies widely), the fatigue level may be lower. - At
operation 1006, HRV may be measured in a number of ways (discussed above in reference to FIGS. 2B-3F). Measuring HRV, in one embodiment, involves an estimation of HRV based solely on heartrate data detected by optical heartrate sensor 122 of earphones 100. In other embodiments, HRV may be measured using a combination of data from optical heartrate sensor 122 of earphones 100 and a finger biosensor embedded in either earphones 100 or computing device 200. Optical heartrate sensor 122 may measure the heartrate at the tragus of the user's ear while the finger biosensor measures the heartrate in a finger of the opposite hand. In some embodiments, this combination allows the sensors, which in one embodiment are conductive, to measure an electrical potential through the body. Information about the electrical potential provides cardiac information (e.g., HRV, fatigue level, heart rate information, and so on), and such information is processed at operation 1006. In other embodiments, the HRV is measured using sensors that monitor other parts of the user's body, rather than the tragus, finger, etc. For example, in some embodiments sensors may be employed to monitor the ankle, leg, arm, or torso. In some instances, the HRV is measured by a module that is not attached to the body (e.g., in computing device 200 or 708), but is a standalone module. - In one embodiment, at
operation 1006, the fatigue level is detected based solely on the HRV measured. The fatigue level, however, may be based on other measurements (e.g., measurements monitored by method 1000). For example, the fatigue level may be based on the amount of sleep that is measured for the previous night, the duration and type of user activity, and the intensity of the activity determined for a previous time period (e.g., exercise activity level in the last twenty-four hours). By way of example, these factors may include stress-related activities such as work and driving in traffic, which may generally cause a user to become fatigued. In some cases, the fatigue level is detected by comparing the HRV measured to a reference HRV. This reference HRV may be based on information gathered from a large number of people from the general public. In another embodiment, the reference HRV is based on past measurements of the user's HRV. - At
operation 1006, in one embodiment, the fatigue level is detected once every twenty-four hours. This provides information about the user's fatigue level each day so that the user's activity levels may be directed according to the fatigue level. In various embodiments, the fatigue level is detected more or less often. Using the fatigue level, a user may determine whether or not an activity is necessary (or desirable), the appropriate activity intensity, and the appropriate activity duration. For example, in deciding whether to go on a run, or how long to run, the user may want to use operation 1006 to assess the user's current fatigue level. Then, the user may, for example, run for a shorter time if the user is more fatigued, or for a longer time if the user is less fatigued. In some cases, it may be beneficial to detect the fatigue level in the morning, upon the user's waking up. This may provide the user a reference for how the day's activities should proceed. - Referring again to
FIG. 10A, operation 1008 involves creating and updating a dynamic recovery profile based on an archive. The archive includes historical information about the fatigue level (which is described above with reference to operation 1006). In one embodiment, the archive includes historical information about the movement and the metabolic activity score. The archive may include, for example, information about past user activity types, past user activity intensities, and past fatigue levels, as well as the relationships between each of these (e.g., if fatigue levels are particularly high after a certain user activity type or after a user achieves a particular metabolic activity score). The archive may also include historical information relative to particular score periods and score period multipliers. The archive, in various embodiments, is embedded or stored in apparatus 702 or computing device 708. - In embodiments, the dynamic recovery profile is created and updated based on the archive. In one embodiment, being based on the user's actual (historical) and detected fatigue level, the dynamic recovery profile is specific to the user's personal fatigue characteristics and responses. The dynamic recovery profile, for example, may reflect information indicating that the user typically has a very high fatigue level when the user gets less than six hours of sleep. In another instance, the dynamic recovery profile may indicate that the user typically has a very high fatigue level following a day in which the user achieves a metabolic activity score above a certain amount (or a particular user activity intensity that is sustained over a particular amount of time). In another example, the user's fatigue levels may not follow typical trends, and the archive can account for this. For example, while the average user may present a fatigue level of 4 when well rested, the archive may reflect that the user has recorded a fatigue level of 6 when rested.
The archive provides a means for the fatigue level measurement to be normalized to the user's specific HRV and fatigue levels.
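One common way to summarize HRV from raw beat-to-beat (RR) intervals, as might underlie the fatigue detection of operation 1006, is the standard deviation of the intervals (SDNN). This sketch assumes RR intervals in milliseconds are already available from the optical heartrate sensor; the sample data is invented.

```python
# Sketch of one common HRV summary: SDNN, the standard deviation of
# beat-to-beat (RR) intervals. A steadier rhythm (lower SDNN) would map
# to a higher fatigue level in the scheme described above.

import statistics

def hrv_sdnn(rr_intervals_ms):
    """Standard deviation of the beat-to-beat intervals, in milliseconds."""
    return statistics.pstdev(rr_intervals_ms)

steady = [800, 805, 798, 802, 801]   # consistent rhythm -> low HRV
varied = [700, 900, 750, 880, 820]   # sporadic rhythm -> high HRV
print(hrv_sdnn(steady) < hrv_sdnn(varied))  # True
```

The archive can then normalize each new SDNN reading against the user's own history rather than a population average.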
- The dynamic recovery profile, in other words, learns the fatigue tendencies of the user by compiling, by way of the archive, data about the user. Moreover, the dynamic recovery profile provides a contoured baseline that is continually adjusted as the user's performance, fatigue, and recovery tendencies change over time. In one embodiment, the dynamic recovery profile represents a range of fatigue levels that are normal for the user. For example, based on data in the archive, the dynamic recovery profile may indicate that fatigue levels between 40 and 60 are typical for the user. The dynamic recovery profile, in one embodiment, accounts for changes in the historical information over time by updating the dynamic recovery profile on a periodic basis. In a further embodiment, the user programs the dynamic recovery profile to refresh periodically to capture recent historical information. Updates to the dynamic recovery profile, in one instance, are based on rates or amounts of change that may occur over time to the historical information in the archive.
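A minimal sketch of how such a user-specific "normal" fatigue band might be derived from the archive; the mean plus or minus one standard deviation rule and the sample values are assumptions for illustration.

```python
# Hypothetical sketch: derive a user-specific "normal" fatigue band from
# archived fatigue levels, as the dynamic recovery profile might.

import statistics

def normal_fatigue_range(past_levels, spread=1.0):
    """Return (low, high) bounds of the user's typical fatigue levels."""
    mean = statistics.mean(past_levels)
    sd = statistics.pstdev(past_levels)
    return (mean - spread * sd, mean + spread * sd)

archive = [45, 50, 55, 60, 40, 50]
low, high = normal_fatigue_range(archive)
print(low < 50 < high)  # True: the band brackets this user's mean of 50
```

Recomputing the band periodically over a sliding window of the archive gives the continually adjusted baseline described above.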
- The dynamic recovery profile, in one embodiment, is implemented in conjunction with an archive table that represents data and relationships of parameters relative to that data. In one instance, the archive table uses the parameters of metabolic activity score (MAS), date, fatigue level, sleep time, and average user activity intensity (UAI) to organize the data and extract relational information. This is illustrated in
FIG. 10D, which provides archive table 1020 (which may be embodied in the archive). Archive table 1020 includes the parameters of date 1022, MAS 1024, average UAI 1026, sleep time 1028, and fatigue level 1030. In other instances, archive table 1020 may include only information about the user's measured fatigue levels. - In various embodiments, archive table 1020 includes any other parameters that are monitored, determined, or created by
method 1000. In some embodiments, archive table 1020 includes analytics. Such analytics include statistical relationships of the various parameters in archive table 1020. For example, archive table 1020 may include analytics such as the mean ratio of fatigue level to MAS, the mean ratio of sleep to MAS, the mean fatigue level by day of the week, and so on. These analytics allow the dynamic recovery profile to derive optimal performance regimens specific to the user. - To illustrate, the dynamic recovery profile may determine (from archive table 1020) that the user has a mean fatigue level of 7 following a day when the sleep to MAS ratio is 6 to 2,000, and may determine that the user typically achieves a below average MAS on days when the fatigue level is 7 or higher. In such an example, the dynamic recovery profile may indicate that the user should get more sleep, or should strive for a lower MAS, to avoid becoming overly fatigued. The dynamic recovery profile, in one embodiment, reflects information about the user's optimal fatigue scenarios; that is, fatigue levels at which the user tends historically to achieve a high MAS. The optimal fatigue scenario may be specific to the user (e.g., some users may have greater capacity for activity when more fatigued, etc.).
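One such archive-table analytic might be sketched as follows; the table rows and the sleep-to-fatigue relationship shown are illustrative assumptions, not measured data.

```python
# Hypothetical sketch of one archive-table analytic: mean next-day
# fatigue grouped by sleep duration. Rows and values are invented.

archive_table = [
    {"sleep_hours": 6, "next_day_fatigue": 7},
    {"sleep_hours": 8, "next_day_fatigue": 4},
    {"sleep_hours": 6, "next_day_fatigue": 8},
    {"sleep_hours": 8, "next_day_fatigue": 3},
]

def mean_fatigue_for_sleep(rows, sleep_hours):
    levels = [r["next_day_fatigue"] for r in rows
              if r["sleep_hours"] == sleep_hours]
    return sum(levels) / len(levels)

print(mean_fatigue_for_sleep(archive_table, 6))  # 7.5: short sleep, more fatigue
print(mean_fatigue_for_sleep(archive_table, 8))  # 3.5: more sleep, less fatigue
```

The same grouping pattern extends to the other analytics mentioned above, such as mean fatigue level by day of the week.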
- Referring again to
FIG. 10A, operation 1010 involves creating and updating an interpreted recovery score based on the fatigue level and the dynamic recovery profile. The interpreted recovery score, because it is based on both the fatigue level detected and on actual, historical results (as incorporated into the dynamic recovery profile), provides higher resolution and additional perspective into the user's current performance state. In one embodiment, the interpreted recovery score supplements the fatigue level with information to account for the user's past activities (e.g., from the archive). The interpreted recovery score may be, for example, a number selected from a range of numbers. In one case, the interpreted recovery score may be proportional to the fatigue level (e.g., higher fatigue corresponds to a higher interpreted recovery score). In one embodiment, a typical interpreted recovery score ranges from 40 to 60. - The interpreted recovery score, by way of the dynamic recovery profile (which is based on the archive), in one embodiment, has available information about the user activity type, the user activity intensity, and the duration of the user's recent activities, as well as analytics of historical information pertaining to the user's activities. The interpreted recovery score may use this information, in addition to the current fatigue level, to provide higher resolution into the user's capacity for activity. For example, if the user slept poorly, but for some reason this lack of sleep is not captured in the fatigue level measurement (e.g., if the HRV is consistent rather than sporadic), the interpreted recovery score may be adjusted to account for the user's lack of sleep. In this example, the lack of sleep information would be available via archived activity type detection and movement monitoring. In other embodiments, the interpreted recovery score will be based only on historic fatigue levels specific to the user. In various embodiments,
operation 1010 is performed by interpreted recovery score module 808. -
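A hedged sketch of how operation 1010 might combine the detected fatigue level with the dynamic recovery profile and archived context; the adjustment rules and magnitudes here are assumptions for illustration only.

```python
# Hypothetical sketch: start from the detected fatigue level and adjust
# it using the user's normal band (from the dynamic recovery profile)
# and archived context such as poor sleep. Rules/magnitudes are assumed.

def interpreted_recovery_score(fatigue_level, normal_low, normal_high,
                               slept_poorly=False):
    score = fatigue_level
    if fatigue_level > normal_high:
        score += (fatigue_level - normal_high) * 0.5  # unusually fatigued for this user
    elif fatigue_level < normal_low:
        score -= (normal_low - fatigue_level) * 0.5   # unusually fresh for this user
    if slept_poorly:
        score += 5  # archive shows lost sleep the HRV reading missed
    return score

print(interpreted_recovery_score(50, 40, 60))                     # 50: within the normal band
print(interpreted_recovery_score(50, 40, 60, slept_poorly=True))  # 55: adjusted for lost sleep
```

The key idea is that the same raw fatigue reading can yield different interpreted scores for different users, because the normal band and archived context are user-specific.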
FIG. 11 is an operational flow diagram illustrating an example method 1100 for providing an interpreted recovery score in accordance with an embodiment of the present disclosure. In one embodiment, apparatus 702, earphones 100, computing device 200, and/or computing device 708 perform various operations of method 1100. In addition, method 1100 may include, at operation 1102, various operations from method 1000. - In one embodiment, at
operation 1104, method 1100 involves creating an initial recovery profile. The initial recovery profile is based on a comparison of the user information to normative group information. The normative group may include information collected from a group of people other than the user. The normative group information may be averaged and used as a baseline for the initial recovery profile (an expectation of user activity levels) before any historical information is generated. - The normative group information, in one embodiment, is adjusted according to different possible sets of user information. For example, the normative group information may be collected and averaged (or otherwise statistically analyzed). A user information multiplier may be created based on a comparison of the normative group information and the user information. The user information multiplier may be applied to the normative group information to adjust the normative group information such that the normative group information becomes specific to the user's information and characteristics. For example, an average value of the normative group information may be increased if the user is younger than the average group member, or decreased for a user who is less active than the average group member. This adjustment, in one embodiment, results in an initial recovery profile that is based on the normative group information but is specific to the user information (and the user). The initial recovery profile may represent a user-specific expectation for activity level (e.g., for MAS). The initial recovery profile may also represent a user-specific expectation for fatigue level. In various embodiments,
operation 1104 is performed by initial recovery profile module 902. - In one embodiment, creating and updating the dynamic recovery profile is further based on the initial recovery profile. In such an embodiment, if the historical information about the user's fatigue levels indicates that the user is typically more fatigued than the user's initial recovery profile indicates the user is expected to be, the dynamic recovery profile is updated in a way that reflects this discrepancy. For example, based on actual fatigue levels detected, the dynamic recovery profile may expect a higher fatigue level than indicated by the initial recovery profile.
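The user information multiplier of operation 1104 might be sketched as follows; the age-based rule and the 1%-per-year magnitude are assumptions for illustration.

```python
# Hypothetical sketch of the user information multiplier: scale the
# normative-group baseline toward the user. The age-based rule and its
# magnitude are invented for illustration.

def user_info_multiplier(user_age, group_mean_age):
    """Younger than the group -> expect a somewhat higher baseline."""
    return 1.0 + 0.01 * (group_mean_age - user_age)

def initial_recovery_baseline(group_mean_score, user_age, group_mean_age):
    return group_mean_score * user_info_multiplier(user_age, group_mean_age)

baseline = initial_recovery_baseline(1000, 25, 40)
print(round(baseline))  # 1150: a younger user gets a higher expectation
```

Other user attributes (weight, stated activity level, and so on) could contribute additional factors to the same multiplier.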
- The dynamic recovery profile, in one embodiment, learns over time what fatigue levels or range of fatigue levels is normal for the user. During this learning phase, the dynamic recovery profile may include a blend of information from the archive and the initial recovery profile. The dynamic recovery profile, in such an embodiment, more heavily weights the information from the archive as the archive gathers information that is increasingly complete. For example, before taking any fatigue measurements, the dynamic recovery profile may be based entirely on the initial recovery profile (which is derived from normative data). Then, for example, after detecting and storing in the archive two weeks' worth of fatigue level information from the user, the dynamic recovery profile may weight the information from the archive more heavily (e.g., base the dynamic recovery profile 50% on the archive and 50% on the initial recovery profile). Eventually, once the dynamic recovery profile captures complete information in the archive (e.g., after two months' worth of detecting fatigue level information), the dynamic recovery profile may phase out the initial recovery profile entirely. That is, the dynamic recovery profile may be entirely based on the archive. In other words, the dynamic recovery profile, in such an embodiment, phases out the initial recovery profile as the amount of information in the archive increases.
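The phase-in described above can be sketched as a weighted blend; the linear schedule and the 60-day horizon are assumptions for illustration.

```python
# Minimal sketch: the dynamic profile blends the initial (normative)
# profile with the archive, the archive's weight growing linearly as
# days of data accumulate. Linear schedule and horizon are assumed.

def blended_profile(initial_value, archive_value, days_of_data, horizon=60):
    w = min(days_of_data / horizon, 1.0)  # archive weight grows 0.0 -> 1.0
    return (1 - w) * initial_value + w * archive_value

print(blended_profile(50, 58, 0))   # 50.0: entirely the initial profile
print(blended_profile(50, 58, 30))  # 54.0: an even blend partway through
print(blended_profile(50, 58, 90))  # 58.0: initial profile fully phased out
```

Any monotonically increasing weighting schedule would produce the same qualitative behavior: the normative baseline fades as user-specific data accumulates.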
- In further embodiments, the historical information about the user activity type or user activity intensity (or MAS) may differ from the initial recovery profile in a way that warrants a shift in expected activity levels. For example, the initial recovery profile may expect a higher or lower amount of user activity intensity (or MAS) than is in reality measured. This discrepancy may be resolved by updating the dynamic recovery profile based on the archive. For example, the dynamic recovery profile may be decreased because the user is not performing at the level (e.g., MAS) initially expected (or indicated by the initial recovery profile).
- In addition, the user information may change in a way that causes the initial recovery profile, created at
operation 1104, to lose its accuracy. The dynamic recovery profile may be updated to reflect such changes, such that the dynamic recovery profile is more accurate. For example, the user's weight or age may change. As a result, the normative group data used to generate the initial recovery profile may become stale. This may be resolved by updating the dynamic recovery profile (e.g., with the user's actual weight). The dynamic recovery profile may function as a version of the initial recovery profile adjusted according to the historical information in the archive. - Referring again to
FIG. 11, in one embodiment, method 1100 includes operation 1106, which involves providing a recovery status based on the interpreted recovery score. The recovery status may be based on various thresholds of the interpreted recovery score. For example, the recovery status may be represented on a numerical, descriptive, or color scale, or the like. In one instance, the recovery status is directly proportional to the interpreted recovery score. The recovery status, in such an example, may indicate the user's need to rest from strenuous activity or high levels of activity. In the case that the recovery status is numerical, a negative recovery status may indicate that the user is over-rested, a positive recovery status may indicate that rest is needed, and a small recovery status (i.e., near zero) may indicate an optimal recovery level. - In one embodiment of the descriptive recovery status, the recovery status includes the following: fatigued, recovered, and optimal. If the interpreted recovery score is below a lowest threshold, in the descriptive recovery status example, the recovery status will be "recovered." This indicates that the user is fully rested. In some instances, "recovered" is distinguished from "optimal" because "recovered" indicates that the user is too rested and has less capacity for activity. Further illustrating the descriptive recovery status example, if the interpreted recovery score is above the lowest threshold but below the highest threshold, the recovery status will be "optimal." This indicates that the user has peak capacity for activity. "Optimal" recovery status may be associated with the scenario in which the user is rested, but not overly so. If the interpreted recovery score is above the highest threshold, the recovery status (in this example) will be "fatigued." This indicates that the user has minimal capacity for activity because the user needs to rest.
In various embodiments, the recovery status is based on any number of thresholds and may be further stratified for higher granularity into the user's recovery status.
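The three-level descriptive status above might be sketched as follows, with assumed thresholds of 40 and 60 (matching the typical score range mentioned earlier).

```python
# Hypothetical sketch of the descriptive recovery status. The threshold
# values of 40 and 60 are assumptions for illustration.

def recovery_status(interpreted_score, low=40, high=60):
    if interpreted_score < low:
        return "recovered"  # fully, perhaps overly, rested
    if interpreted_score <= high:
        return "optimal"    # peak capacity for activity
    return "fatigued"       # rest is needed

print(recovery_status(35))  # recovered
print(recovery_status(50))  # optimal
print(recovery_status(75))  # fatigued
```

Adding more thresholds to this mapping yields the finer-grained stratification the text mentions.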
-
Method 1100, in one embodiment, includes operation 1108, as illustrated in FIG. 11. Operation 1108 involves providing an activity recommendation based on the interpreted recovery score. For example, if the interpreted recovery score is high, indicating that the user is more fatigued, lower user activity intensities may be recommended. If the interpreted recovery score is low, indicating that the user is well-rested, higher activity intensities may be recommended. This example applies to recommended activity durations in a similar fashion (e.g., longer durations if less fatigued, etc.). - In a further embodiment,
method 1100 includes operation 1110, which involves comparing the interpreted recovery score to a past interpreted recovery score. In this embodiment, the interpreted recovery score is associated with a measuring period and the past interpreted recovery score is associated with a past measuring period. Interpreted recovery scores may be stored and associated with past measuring periods (i.e., the measuring period during which the interpreted recovery score was created). In this way, past interpreted recovery scores and information associated therewith may be used to inform the user's current activity. - At
operation 1110, comparing the scores may include providing a simple numerical readout of both scores (e.g., side by side). In one embodiment, information about the time of day associated with the past interpreted recovery score is presented. For example, the time of day at which the past interpreted recovery score was created may be presented. This may inform the user of how the user's current interpreted recovery score relates to the past interpreted recovery score, allowing the user to gauge how the interpreted recovery score may correlate to the user's physical state or feeling. - In another embodiment, the past interpreted recovery score is displayed on a graph (e.g., a line or bar graph) as a function of time (e.g., comparing against other past interpreted recovery scores from past measuring periods). The graph may be overlaid with a graph of the current interpreted recovery score. One of ordinary skill in the art will appreciate other ways to compare the interpreted recovery scores. In various embodiments,
operation 1110 is performed by interpreted recovery score module 808. -
FIG. 12 is an operational flow diagram illustrating an example method 1200 for providing an interpreted recovery score in accordance with an embodiment of the present disclosure. In one embodiment, apparatus 702 (e.g., biometric earphones 100 or computing device 200) and computing device 708 (e.g., computing device 200) perform various operations of method 1200. - In one embodiment, at
operation 1204, method 1200 involves performing a comparison of the interpreted recovery score to the fatigue level. Operation 1206, in another embodiment, involves tracking the comparison over time. As described above, the fatigue level may be associated with physical phenomena, including HRV, while the interpreted recovery score is based on actual, historical information (via the dynamic recovery profile), including past fatigue levels for the user. In one embodiment, tracking the comparison over time (operation 1206) provides insight into how lifestyle choices affect performance capacity and fatigue levels. For example, the comparison may provide a normalization for the user's typical fatigue levels as they change over time relative to past fatigue levels. - Referring again to
FIG. 12, in one embodiment, at operation 1208, method 1200 involves receiving an external interpreted recovery score. The external interpreted recovery score may be received in a number of ways (e.g., via communication medium 704). The external interpreted recovery score may be created and updated in a manner similar to the creating and updating of the interpreted recovery score (operation 1010). The external interpreted recovery score may be from a second user, who is any user other than the user. The second user may be a friend or associate of the first user. In various embodiments, operation 1208 is performed by interpreted recovery score module 808. - At
operation 1210, an embodiment of method 1200 involves comparing the external interpreted recovery score to the interpreted recovery score. The external interpreted recovery score may be compared to the interpreted recovery score in a fashion substantially similar to the comparison performed in operation 1110. Operation 1210 allows the user to compare the user's interpreted recovery score (based on the user's fatigue level) to the interpreted recovery score of another user (based on the other user's fatigue level). In various embodiments, operation 1210 is performed by interpreted recovery score module 808. - In one embodiment, the operations of
method 1000, method 1100, and method 1200 are performed using sensors configured to be attached to the body (e.g., the biometric earphones worn in the ears of a user). Such sensors may include a gyroscope or accelerometer to detect movement, and a heart-rate sensor (e.g., optical heartrate sensor 122), each of which may be embedded in a pair of earphones that a user can wear in the user's ears, such as earphones 100, or a device or module such as computing device 200. Such sensors may be used to perform the operations of monitoring the movement, detecting the fatigue level, creating and updating the dynamic recovery profile, and creating and updating the interpreted recovery score, or any other operation disclosed herein. In further embodiments, sensors used to perform these operations may be standalone sensors that do not attach to the body (e.g., coupled to computing device 200, or other computing device). - For exemplary purposes,
FIGS. 13-16 are provided to depict example user interfaces that may be used to display, and allow a user to interact with, the various data detected and computed in accordance with the above-described systems and methods. Although not all data that can be provided by the systems and methods disclosed herein are depicted in FIGS. 13-16, the figures nevertheless provide context for conveying how such data may be provided to a user. -
FIG. 13 illustrates an activity display 1300 that may be associated with an activity display module, such as activity display module 211 of activity tracking application 210. In various embodiments, activity display 1300 may visually present to a user a record of the user's activity. As illustrated, activity display 1300 may comprise a display navigation area 1301, activity icons 1302, activity goal section 1303, live activity chart 1304, and activity timeline 1305. As illustrated in this particular embodiment, display navigation area 1301 allows a user to navigate between the various displays associated with modules 211-214 by selecting "right" and "left" arrows depicted at the top of the display on either side of the display screen title. An identification of the selected display may be displayed at the center of the navigation area 1301. Other selectable displays may be displayed on the left and right sides of navigation area 1301. For example, in this embodiment the activity display 1300 includes the identification "ACTIVITY" at the center of the navigation area. If the user wishes to navigate to a sleep display in this embodiment, the user may select the left arrow. In implementations where computing device
activity icons 1302 may be displayed on activity display 1300 based on the user's predicted or self-reported activity. For example, in this particular embodiment activity icons 1302 are displayed for the activities of walking, running, swimming, sport, and biking, indicating that the user has performed these five activities. In one particular embodiment, one or more modules of application 210 may estimate the activity being performed (e.g., sleeping, walking, running, or swimming) by comparing the data collected by a biometric earphone's sensors to pre-loaded or learned activity profiles. For example, accelerometer data, gyroscope data, heartrate data, or some combination thereof may be compared to pre-loaded activity profiles of what the data should look like for a generic user that is running, walking, or swimming. In implementations of this embodiment, the pre-loaded activity profiles for each particular activity (e.g., sleeping, running, walking, or swimming) may be adjusted over time based on a history of the user's activity, thereby improving the activity predictive capability of the system. In additional implementations, activity display 1300 allows a user to manually select the activity being performed (e.g., via touch gestures), thereby enabling the system to accurately adjust an activity profile associated with the user-selected activity. In this way, the system's activity estimating capabilities will improve over time as the system learns how particular activity profiles match an individual user. Particular methods of implementing this activity estimation and activity profile learning capability are described in U.S. patent application Ser. No. 14/568,835, filed Dec. 12, 2014, titled "System and Method for Creating a Dynamic Activity Profile," which is incorporated herein by reference in its entirety. - In various embodiments, an
activity goal section 1303 may display various activity metrics, such as a percentage activity goal providing an overview of the status of an activity goal for a timeframe (e.g., day or week), an activity score or other smart activity score associated with the goal, and activities for the measured timeframe (e.g., day or week). For example, the display may provide a user with a current activity score for the day versus a target activity score for the day. Particular methods of calculating activity scores are described in U.S. patent application Ser. No. 14/137,734, filed Dec. 20, 2013, titled "System and Method for Providing a Smart Activity Score," which is incorporated herein by reference in its entirety. - In various embodiments, the percentage activity goal may be selected by the user (e.g., by a touch tap) to display to the user an amount of a particular activity (e.g., walking or running) needed to complete the activity goal (e.g., reach 100%). In additional embodiments, activities for the timeframe may be individually selected to display metrics of the selected activity such as points, calories, duration, or some combination thereof. For example, in this particular embodiment
activity goal section 1303 displays that 100% of the activity goal for the day has been accomplished. Further, activity goal section 1303 displays that activities of walking, running, biking, and no activity (sedentary) were performed during the day. This is also displayed as a numerical activity score, 5000/5000. In this embodiment, a breakdown of metrics for each activity (e.g., activity points, calories, and duration) for the day may be displayed by selecting the activity. - A
live activity chart 1304 may also display an activity trend of the aforementioned metrics (or other metrics) as a dynamic graph at the bottom of the display. For example, the graph may be used to show when the user has been most active during the day (e.g., burning the most calories or otherwise engaged in an activity). - An
activity timeline 1305 may be displayed as a collapsed bar at the bottom of display 1300. In various embodiments, when a user selects activity timeline 1305, it may display a more detailed breakdown of daily activity, including, for example, an activity performed at a particular time with associated metrics, total active time for the measuring period, total inactive time for the measuring period, total calories burned for the measuring period, total distance traversed for the measuring period, and other metrics. -
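The profile-matching activity estimation described above for FIG. 13 (comparing accelerometer and heartrate data against pre-loaded profiles, then adjusting those profiles from the user's own history) might be sketched as follows. The profile values, the two-feature representation, and the nearest-profile rule are illustrative assumptions; the disclosure does not specify a particular matching algorithm.

```python
import math

# Hypothetical pre-loaded activity profiles for a generic user:
# (mean accelerometer magnitude in g, mean heart rate in bpm).
PROFILES = {
    "sleeping": (0.05, 55.0),
    "walking":  (1.10, 95.0),
    "running":  (2.50, 150.0),
}

def estimate_activity(accel_mag, heart_rate):
    """Predict the activity whose profile is closest (Euclidean
    distance) to the sensed data."""
    return min(
        PROFILES,
        key=lambda a: math.dist(PROFILES[a], (accel_mag, heart_rate)),
    )

def learn_profile(activity, accel_mag, heart_rate, rate=0.1):
    """Nudge a profile toward the user's own data, so that predictions
    improve as the system learns the individual user."""
    pa, ph = PROFILES[activity]
    PROFILES[activity] = (pa + rate * (accel_mag - pa),
                          ph + rate * (heart_rate - ph))
```

When the user manually selects an activity via the display, `learn_profile` would be called with the user's selection instead of the estimate, adjusting that activity's profile directly.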
FIG. 14 illustrates a sleep display 1400 that may be associated with a sleep display module 212. In various embodiments, sleep display 1400 may visually present to a user a record of the user's sleep history and sleep recommendations for the day. It is worth noting that in various embodiments one or more modules of the activity tracking application 210 may automatically determine or estimate when a user is sleeping (and awake) based on a pre-loaded or learned activity profile for sleep, in accordance with the activity profiles described above. Alternatively, the user may interact with the sleep display 1400 or other display to indicate that the current activity is sleep, enabling the system to better learn the individualized activity profile associated with sleep. The modules may also use data collected from the earphones, including fatigue level and activity score trends, to calculate a recommended amount of sleep. The modules may also use collected and processed data from the earphones, including some or all of the data related to fatigue level, recovery recommendations, dynamic recovery profile, interpreted recovery score, recovery status, initial recovery profile, activity score trends, HRV, etc., to calculate an interpreted recovery score as has been described in detail above. Systems and methods for implementing this functionality are further described in U.S. patent application Ser. No. 14/568,835, filed Dec. 12, 2014, titled "System and Method for Creating a Dynamic Activity Profile," and U.S. patent application Ser. No. 14/137,942, filed Dec. 20, 2013, titled "System and Method for Providing an Interpreted Recovery Score," both of which are incorporated herein by reference in their entirety. - As illustrated,
sleep display 1400 may comprise a display navigation area 1401, a center sleep display area 1402, a textual sleep recommendation 1403, and a sleeping detail or timeline 1404. Display navigation area 1401 allows a user to navigate between the various displays associated with modules 211-214 as described above. In this embodiment the sleep display 1400 includes the identification "SLEEP" at the center of the navigation area 1401. - Center
sleep display area 1402 may display sleep metrics such as the user's recent average level of sleep or sleep trend 1402A, a recommended amount of sleep for the night 1402B, and an ideal average sleep amount 1402C. In various embodiments, these sleep metrics may be displayed in units of time (e.g., hours and minutes) or other suitable units. Accordingly, a user may compare a recommended sleep level for the user (e.g., metric 1402B) against the user's historical sleep level (e.g., metric 1402A). In one embodiment, the sleep metrics 1402A-1402C may be displayed as a pie chart showing the recommended and historical sleep times in different colors. In another embodiment, sleep metrics 1402A-1402C may be displayed as a curvilinear graph showing the recommended and historical sleep times as different colored, concentric lines. This particular embodiment is illustrated in example sleep display 1400, which illustrates an inner concentric line for recommended sleep metric 1402B and an outer concentric line for average sleep metric 1402A. In this example, the lines are concentric about a numerical display of the sleep metrics. - In various embodiments, a
textual sleep recommendation 1403 may be displayed at the bottom or other location of display 1400 based on the user's recent sleep history. A sleeping detail or timeline 1404 may also be displayed as a collapsed bar at the bottom of sleep display 1400. In various embodiments, when a user selects sleeping detail 1404, it may display a more detailed breakdown of daily sleep metrics, including, for example, total time slept, bedtime, and wake time. In particular implementations of these embodiments, the user may edit the calculated bedtime and wake time. In additional embodiments, the selected sleeping detail 1404 may graphically display a timeline of the user's movements during the sleep hours, thereby providing an indication of how restless or restful the user's sleep is during different times, as well as the user's sleep cycles. For example, the user's movements may be displayed as a histogram plot charting the frequency and/or intensity of movement during different sleep times. -
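The three sleep metrics of center sleep display area 1402 lend themselves to a simple computation; the sketch below derives the recent average (1402A) and the shortfall against the night's recommendation (1402B). The eight-hour ideal default and the field names are illustrative assumptions, not values from the disclosure.

```python
def sleep_metrics(recent_nights_hours, recommended_hours, ideal_hours=8.0):
    """Summarize the sleep metrics of display area 1402: recent average
    (1402A), tonight's recommendation (1402B), and the ideal (1402C)."""
    avg = sum(recent_nights_hours) / len(recent_nights_hours)
    return {
        "average_1402A": round(avg, 1),
        "recommended_1402B": recommended_hours,
        "ideal_1402C": ideal_hours,
        # How far short of the recommendation the user's recent
        # history falls; could drive the textual recommendation 1403.
        "deficit": round(max(0.0, recommended_hours - avg), 1),
    }
```

A concentric-line or pie-chart rendering, as described above, would draw `average_1402A` and `recommended_1402B` as the two differently colored series.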
FIG. 15 illustrates an activity recommendation and fatigue level display 1500 that may be associated with an activity recommendation and fatigue level display module 213. In various embodiments, display 1500 may visually present to a user the user's current fatigue level and a recommendation of whether or not to engage in activity. It is worth noting that one or more modules of activity tracking application 210 may track fatigue level based on data received from the earphones 100, and make an activity level recommendation. For example, HRV data tracked at regular intervals may be compared with other biometric or biological data to determine how fatigued the user is. Additionally, the HRV data may be compared to pre-loaded or learned fatigue level profiles, as well as a user's specified activity goals. Particular systems and methods for implementing this functionality are described in greater detail in U.S. patent application Ser. No. 14/140,414, filed Dec. 24, 2013, titled "System and Method for Providing an Intelligent Goal Recommendation for Activity Level," which is incorporated herein by reference in its entirety. - As illustrated,
display 1500 may comprise a display navigation area 1501 (as described above), a textual activity recommendation 1502, and a center fatigue and activity recommendation display 1503. Textual activity recommendation 1502 may, for example, display a recommendation as to whether a user is too fatigued for activity, and thus must rest, or if the user should be active. Center display 1503 may display an indication to a user to be active (or rest) 1503A (e.g., "go"), an overall score 1503B indicating the body's overall readiness for activity, and an activity goal score 1503C indicating an activity goal for the day or other period. In various embodiments, indication 1503A may be displayed as a result of a binary decision—for example, telling the user to be active, or "go"—or on a scaled indicator—for example, a circular dial display showing that a user should be more or less active depending on where a virtual needle is pointing on the dial. - In various embodiments,
display 1500 may be generated by measuring the user's HRV at the beginning of the day (e.g., within 30 minutes of waking up). For example, the user's HRV may be automatically measured using the optical heartrate sensor 122 after the user wears the earphones in a position that generates a good signal, as described in method 400. In embodiments, when the user's HRV is being measured, computing device 200 may display any one of the following: an instruction to remain relaxed while the variability in the user's heart signal (i.e., HRV) is being measured, an amount of time remaining until the HRV has been sufficiently measured, and an indication that the user's HRV is detected. After the user's HRV is measured by earphones 100 for a predetermined amount of time (e.g., two minutes), one or more processing modules of computing device 200 may determine the user's fatigue level for the day and a recommended amount of activity for the day. Activity recommendation and fatigue level display 1500 is generated based on this determination. - In further embodiments, the user's HRV may be automatically measured at predetermined intervals throughout the day using
optical heartrate sensor 122. In such embodiments, activity recommendation and fatigue level display 1500 may be updated based on the updated HRV measurements received throughout the day. In this manner, the activity recommendations presented to the user may be adjusted throughout the day. -
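HRV itself can be quantified in several ways from the beat-to-beat (RR) intervals an optical heartrate sensor yields; the disclosure does not specify which statistic is used. As one common time-domain possibility, the RMSSD measure can be sketched as:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms), a
    standard time-domain HRV statistic. Shown as one possible measure;
    the disclosure does not name a specific HRV computation."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

Over a two-minute morning measurement window, as in the embodiment above, the RR intervals collected by the sensor would be fed to such a function, and the resulting value compared against the user's fatigue level profiles.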
FIG. 16 illustrates a biological data and intensity recommendation display 1600 that may be associated with a biological data and intensity recommendation display module 214. In various embodiments, display 1600 may guide a user of the activity monitoring system through various fitness cycles of high-intensity activity followed by lower-intensity recovery based on the user's body fatigue and recovery level, thereby boosting the user's level of fitness and capacity on each cycle. - As illustrated,
display 1600 may include a textual recommendation 1601, a center display 1602, and a historical plot 1603 indicating the user's transition between various fitness cycles. In various embodiments, textual recommendation 1601 may display a current recommended level of activity or training intensity based on current fatigue levels, current activity levels, user goals, pre-loaded profiles, activity scores, smart activity scores, historical trends, and other biometrics of interest. Center display 1602 may display a fitness cycle target 1602A (e.g., intensity, peak, fatigue, or recovery), an overall score 1602B indicating the body's overall readiness for activity, an activity goal score 1602C indicating an activity goal for the day or other period, and an indication to a user to be active (or rest) 1602D (e.g., "go"). The data of center display 1602 may be displayed, for example, on a virtual dial, as text, or some combination thereof. In one particular embodiment implementing a dial display, recommended transitions between various fitness cycles (e.g., intensity and recovery) may be indicated by the dial transitioning between predetermined markers. - In various embodiments,
display 1600 may display a historical plot 1603 that indicates the user's historical and current transitions between various fitness cycles over a predetermined period of time (e.g., 30 days). The fitness cycles may include, for example, a fatigue cycle, a performance cycle, and a recovery cycle. Each of these cycles may be associated with a predetermined score range (e.g., overall score 1602B). For example, in one particular implementation a fatigue cycle may be associated with an overall score range of 0 to 33, a performance cycle may be associated with an overall score range of 34 to 66, and a recovery cycle may be associated with an overall score range of 67 to 100. The transitions between the fitness cycles may be demarcated by horizontal lines intersecting the historical plot 1603 at the overall score range boundaries. For example, the illustrated historical plot 1603 includes two horizontal lines intersecting the historical plot. In this example, measurements below the lowest horizontal line indicate a first fitness cycle (e.g., fatigue cycle), measurements between the two horizontal lines indicate a second fitness cycle (e.g., performance cycle), and measurements above the highest horizontal line indicate a third fitness cycle (e.g., recovery cycle). -
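The mapping from overall score to fitness cycle in the example ranges above (0 to 33 fatigue, 34 to 66 performance, 67 to 100 recovery) reduces to a pair of threshold tests, which might be sketched as:

```python
def fitness_cycle(overall_score):
    """Map an overall score (0-100) to a fitness cycle using the example
    ranges from the disclosure: 0-33 fatigue, 34-66 performance,
    67-100 recovery."""
    if overall_score <= 33:
        return "fatigue"
    if overall_score <= 66:
        return "performance"
    return "recovery"
```

The two horizontal demarcation lines on historical plot 1603 correspond to the thresholds at 33 and 66 in this sketch.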
FIG. 17 illustrates an example computing module that may be used to implement various features of the systems and methods disclosed herein. In one embodiment, the computing module includes a processor and a set of computer programs residing on the processor. The set of computer programs is stored on a non-transitory computer readable medium having computer executable program code embodied thereon. The computer executable code is configured to detect a fatigue level. The computer executable code is also configured to create and update a dynamic recovery profile based on an archive. The archive includes historical information about the fatigue level. The computer executable code is further configured to create and update an interpreted recovery score based on the fatigue level and the dynamic recovery profile. - The example computing module may be used to implement these various features in a variety of ways, as described above with reference to the methods and tables illustrated in
FIGS. 10A, 10B, 10C, 10D, 11, and 12, and as will be appreciated by one of ordinary skill in the art upon reading this disclosure. - As used herein, the term module might describe a given unit of functionality that can be performed in accordance with one or more embodiments of the present application. As used herein, a module might be implemented utilizing any form of hardware, software, or a combination thereof. For example, one or more processors, controllers, ASICs, PLAs, PALs, CPLDs, FPGAs, logical components, software routines or other mechanisms might be implemented to make up a module. In implementation, the various modules described herein might be implemented as discrete modules or the functions and features described can be shared in part or in total among one or more modules. In other words, as would be apparent to one of ordinary skill in the art after reading this description, the various features and functionality described herein may be implemented in any given application and can be implemented in one or more separate or shared modules in various combinations and permutations. Even though various features or elements of functionality may be individually described or claimed as separate modules, one of ordinary skill in the art will understand that these features and functionality can be shared among one or more common software and hardware elements, and such description shall not require or imply that separate hardware or software components are used to implement such features or functionality.
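The three operations executed by the computing module of FIG. 17 (detect a fatigue level, archive it into the dynamic recovery profile, and derive an interpreted recovery score) might be sketched as below. The scoring formula, which rates the latest fatigue level against the user's own archived history, is an assumption for illustration; the disclosure does not give a specific formula.

```python
class RecoveryTracker:
    """Minimal sketch of the FIG. 17 pipeline: archive fatigue levels
    (the dynamic recovery profile) and derive an interpreted recovery
    score from the fatigue level plus that history."""

    def __init__(self):
        # Archive of historical fatigue levels backing the
        # dynamic recovery profile.
        self.archive = []

    def record_fatigue(self, fatigue_level):
        """Detect/record a fatigue level and update the archive."""
        self.archive.append(fatigue_level)

    def interpreted_recovery_score(self):
        """Score the latest fatigue level against the user's own history:
        100 = least fatigued ever observed, 0 = most fatigued.
        (Hypothetical formula, not from the disclosure.)"""
        lo, hi = min(self.archive), max(self.archive)
        if hi == lo:
            return 50  # no spread yet; treat as typical
        return round(100 * (hi - self.archive[-1]) / (hi - lo))
```

Because the score is normalized against the archive rather than a fixed scale, it is interpreted relative to the individual user, which is the role the dynamic recovery profile plays in the methods above.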
- Where components or modules of the application are implemented in whole or in part using software, in one embodiment, these software elements can be implemented to operate with a computing or processing module capable of carrying out the functionality described with respect thereto. One such example computing module is shown in
FIG. 17. Various embodiments are described in terms of this example computing module 1700. After reading this description, it will become apparent to a person skilled in the relevant art how to implement the application using other computing modules or architectures. - Referring now to
FIG. 17, computing module 1700 may represent, for example, computing or processing capabilities found within desktop, laptop, notebook, and tablet computers; hand-held computing devices (tablets, PDAs, smart phones, cell phones, palmtops, smart-watches, smart-glasses, etc.); mainframes, supercomputers, workstations or servers; or any other type of special-purpose or general-purpose computing devices as may be desirable or appropriate for a given application or environment. Computing module 1700 might also represent computing capabilities embedded within or otherwise available to a given device. For example, a computing module might be found in other electronic devices such as, for example, digital cameras, navigation systems, cellular telephones, portable computing devices, modems, routers, WAPs, terminals and other electronic devices that might include some form of processing capability.
processor 1704. Processor 1704 might be implemented using a general-purpose or special-purpose processing engine such as, for example, a microprocessor, controller, or other control logic. In the illustrated example, processor 1704 is connected to a bus 1702, although any communication medium can be used to facilitate interaction with other components of computing module 1700 or to communicate externally. - Computing module 1700 might also include one or more memory modules, simply referred to herein as
main memory 1708. For example, random access memory (RAM) or other dynamic memory might be used for storing information and instructions to be executed by processor 1704. Main memory 1708 might also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1704. Computing module 1700 might likewise include a read only memory ("ROM") or other static storage device coupled to bus 1702 for storing static information and instructions for processor 1704. - The computing module 1700 might also include one or more various forms of
information storage mechanism 1710, which might include, for example, a media drive 1712 and a storage unit interface 1720. The media drive 1712 might include a drive or other mechanism to support fixed or removable storage media 1714. For example, a hard disk drive, a solid state drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive might be provided. Accordingly, storage media 1714 might include, for example, a hard disk, a solid state drive, magnetic tape, cartridge, optical disk, a CD or DVD, or other fixed or removable medium that is read by, written to or accessed by media drive 1712. As these examples illustrate, the storage media 1714 can include a computer usable storage medium having stored therein computer software or data. - In alternative embodiments,
information storage mechanism 1710 might include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 1700. Such instrumentalities might include, for example, a fixed or removable storage unit 1722 and a storage interface 1720. Examples of such storage units 1722 and storage interfaces 1720 can include a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 1722 and storage interfaces 1720 that allow software and data to be transferred from the storage unit 1722 to computing module 1700. - Computing module 1700 might also include a
communications interface 1724. Communications interface 1724 might be used to allow software and data to be transferred between computing module 1700 and external devices. Examples of communications interface 1724 might include a modem or softmodem, a network interface (such as an Ethernet, network interface card, WiMedia, IEEE 802.XX or other interface), a communications port (such as, for example, a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface. Software and data transferred via communications interface 1724 might typically be carried on signals, which can be electronic, electromagnetic (which includes optical) or other signals capable of being exchanged by a given communications interface 1724. These signals might be provided to communications interface 1724 via a channel 1728. This channel 1728 might carry signals and might be implemented using a wired or wireless communication medium. Some examples of a channel might include a phone line, a cellular link, an RF link, an optical link, a network interface, a local or wide area network, and other wired or wireless communications channels. - In this document, the terms "computer program medium" and "computer usable medium" are used to generally refer to transitory or non-transitory media such as, for example, main memory 1708,
storage unit 1722, media 1714, and channel 1728. These and other various forms of computer program media or computer usable media may be involved in carrying one or more sequences of one or more instructions to a processing device for execution. Such instructions embodied on the medium are generally referred to as "computer program code" or a "computer program product" (which may be grouped in the form of computer programs or other groupings). When executed, such instructions might enable the computing module 1700 to perform features or functions of the present application as discussed herein. - The presence of broadening words and phrases such as "one or more," "at least," "but not limited to" or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent. The use of the term "module" does not imply that the components or functionality described or claimed as part of the module are all configured in a common package. Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed in multiple groupings or packages or across multiple locations.
- Additionally, the various embodiments set forth herein are described in terms of exemplary block diagrams, flow charts and other illustrations. As will become apparent to one of ordinary skill in the art after reading this document, the illustrated embodiments and their various alternatives can be implemented without confinement to the illustrated examples. For example, block diagrams and their accompanying description should not be construed as mandating a particular architecture or configuration.
- While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not of limitation. Likewise, the various diagrams may depict an example architectural or other configuration for the disclosure, which is done to aid in understanding the features and functionality that can be included in the disclosure. The disclosure is not restricted to the illustrated example architectures or configurations, but the desired features can be implemented using a variety of alternative architectures and configurations. Indeed, it will be apparent to one of skill in the art how alternative functional, logical or physical partitioning and configurations can be implemented to implement the desired features of the present disclosure. Also, a multitude of different constituent module names other than those depicted herein can be applied to the various partitions. Additionally, with regard to flow diagrams, operational descriptions and method claims, the order in which the steps are presented herein shall not mandate that various embodiments be implemented to perform the recited functionality in the same order unless the context dictates otherwise.
- Although the disclosure is described above in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the other embodiments of the disclosure, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments.
Claims (28)
1. An apparatus for providing an interpreted recovery score, comprising:
an earphone containing a biometric sensor;
a fatigue level module that detects a fatigue level;
a dynamic recovery profile module that creates and updates a dynamic recovery profile based on an archive, the archive comprising historical information about the fatigue level; and
an interpreted recovery score module that creates and updates an interpreted recovery score based on the fatigue level and the dynamic recovery profile;
wherein at least one of the fatigue level module, the dynamic recovery profile module, and the interpreted recovery score module is embodied in the earphone.
2. The apparatus of claim 1 , further comprising an initial recovery profile module that creates an initial recovery profile, the initial recovery profile based on a comparison of the user information to normative group information.
3. The apparatus of claim 2 , wherein the dynamic recovery profile module creates and updates the dynamic recovery profile further based on the initial recovery profile.
4. The apparatus of claim 1 , further comprising a recovery status module that provides a recovery status based on the interpreted recovery score.
5. The apparatus of claim 4 , wherein the recovery status is selected from the group consisting of fatigued, recovered, and optimal.
6. The apparatus of claim 1 ,
wherein the interpreted recovery score module performs a comparison of the interpreted recovery score to the fatigue level; and
wherein the interpreted recovery score module tracks the comparison over time.
7. The apparatus of claim 1 , further comprising a recovery recommendation module that provides an activity recommendation based on the interpreted recovery score.
8. The apparatus of claim 1 , wherein the interpreted recovery score is specific to a measuring period.
9. The apparatus of claim 1 , wherein the biometric sensor comprises an optical heartrate sensor.
10. The apparatus of claim 1 , wherein the biometric sensor comprises an optical heartrate sensor protruding from a side of the earphone proximal to an interior side of a user's ear when the earphone is worn.
11. The apparatus of claim 10 , wherein the optical heartrate sensor protrudes from a side of the earphone proximal to a user's tragus when the earphone is worn.
12. A method for providing an interpreted recovery score, comprising:
detecting a fatigue level;
creating and updating a dynamic recovery profile based on an archive, the archive comprising historical information about the fatigue level; and
creating and updating an interpreted recovery score based on the fatigue level and the dynamic recovery profile;
wherein at least one of the operations of detecting the fatigue level, creating and updating the dynamic recovery profile, and creating and updating the interpreted recovery score comprises using data collected by an earphone containing a biometric sensor.
13. The method of claim 12 , further comprising creating an initial recovery profile based on a comparison of user information to normative group information.
14. The method of claim 13 , wherein creating and updating the dynamic recovery profile is further based on the initial recovery profile.
15. The method of claim 13 , further comprising providing a recovery status based on the interpreted recovery score.
16. The method of claim 15 , wherein the recovery status is selected from the group consisting of fatigued, recovered, and optimal.
17. The method of claim 14 , wherein the dynamic recovery profile phases out the initial recovery profile as an amount of the historical information in the archive increases.
18. The method of claim 12 , further comprising providing an activity recommendation based on the interpreted recovery score.
19. The method of claim 12 , further comprising:
receiving an external interpreted recovery score; and
comparing the external interpreted recovery score to the interpreted recovery score.
20. The method of claim 12 , further comprising comparing the interpreted recovery score to a past interpreted recovery score, the interpreted recovery score associated with a measuring period, the past interpreted recovery score associated with a past measuring period.
21. The method of claim 12 , wherein the biometric sensor is an optical heartrate sensor.
22. The method of claim 12 , wherein the biometric sensor is an optical heartrate sensor protruding from a side of an earphone proximal to an interior side of a user's ear when the earphone is worn.
23. The method of claim 22 , wherein the optical heartrate sensor protrudes from a side of the earphone proximal to a user's tragus when the earphone is worn.
24. A system for providing an interpreted recovery score, comprising:
an earphone, comprising:
a biometric sensor;
a first processor; and
a computing device communicatively coupled to the earphone, the computing device comprising:
a second processor; and
wherein at least one computer program resides on at least one of the first or second processor;
wherein the at least one computer program is stored on a non-transitory computer readable medium having computer executable program code embodied thereon, the computer executable program code configured to:
detect a fatigue level;
create and update a dynamic recovery profile based on an archive comprising historical information about the fatigue level; and
create and update an interpreted recovery score based on the fatigue level and the dynamic recovery profile.
25. The system of claim 24 , wherein the biometric sensor is an optical heartrate sensor.
26. The system of claim 24 , wherein the biometric sensor is a motion sensor.
27. The system of claim 24 , wherein the biometric sensor is an optical heartrate sensor protruding from a side of the earphone proximal to an interior side of a user's ear when the earphone is worn.
28. The system of claim 27 , wherein the optical heartrate sensor protrudes from a side of the earphone proximal to a user's tragus when the earphone is worn.
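The claims above describe a pipeline: a detected fatigue level is archived; the archive drives a dynamic recovery profile that phases out an initial, normative-group profile as history accumulates (claim 17); and the fatigue level plus the dynamic profile yield an interpreted recovery score, which maps to a status of fatigued, recovered, or optimal (claims 5 and 16). The following sketch illustrates that flow only; it is not the patented implementation, and every weight, threshold, and formula here is an assumption chosen for illustration.

```python
STATUSES = ["fatigued", "recovered", "optimal"]  # statuses named in claims 5 and 16


class RecoveryScorer:
    """Illustrative sketch of the claimed pipeline; all numbers are assumed."""

    def __init__(self, initial_profile: float):
        # Initial recovery profile, e.g. a normative-group baseline (claims 2, 13).
        self.initial_profile = initial_profile
        self.archive: list[float] = []  # historical fatigue levels (claim 12)

    def dynamic_profile(self) -> float:
        # Blend the initial profile with the user's own history, phasing the
        # initial profile out as the archive grows (claim 17). The 30-sample
        # ramp is an assumption, not taken from the patent.
        if not self.archive:
            return self.initial_profile
        n = len(self.archive)
        weight_initial = max(0.0, 1.0 - n / 30.0)
        personal = sum(self.archive) / n
        return weight_initial * self.initial_profile + (1.0 - weight_initial) * personal

    def interpreted_score(self, fatigue_level: float) -> float:
        # Interpreted recovery score: current fatigue relative to the user's
        # own dynamic baseline, clamped to 0..100 (formula assumed).
        self.archive.append(fatigue_level)
        baseline = self.dynamic_profile()
        raw = 100.0 * (1.0 - fatigue_level / max(baseline, 1e-9))
        return max(0.0, min(100.0, raw))

    def status(self, score: float) -> str:
        # Map the score to a claimed recovery status; thresholds are assumed.
        if score < 40.0:
            return "fatigued"
        if score < 75.0:
            return "recovered"
        return "optimal"


scorer = RecoveryScorer(initial_profile=50.0)
score = scorer.interpreted_score(fatigue_level=30.0)
print(scorer.status(score))  # prints "fatigued" under these assumed numbers
```

A real embodiment would derive the fatigue level from the earphone's biometric sensor (e.g. the optical heartrate sensor of claims 9 and 21) rather than take it as a bare number; this sketch only shows how the profile and score modules could compose.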
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/934,084 US20160058378A1 (en) | 2013-10-24 | 2015-11-05 | System and method for providing an interpreted recovery score |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/062,815 US20150116125A1 (en) | 2013-10-24 | 2013-10-24 | Wristband with removable activity monitoring device |
US14/137,942 US20150119732A1 (en) | 2013-10-24 | 2013-12-20 | System and method for providing an interpreted recovery score |
US14/137,734 US20150119760A1 (en) | 2013-10-24 | 2013-12-20 | System and method for providing a smart activity score |
US14/934,084 US20160058378A1 (en) | 2013-10-24 | 2015-11-05 | System and method for providing an interpreted recovery score |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/137,942 Continuation-In-Part US20150119732A1 (en) | 2013-10-24 | 2013-12-20 | System and method for providing an interpreted recovery score |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160058378A1 true US20160058378A1 (en) | 2016-03-03 |
Family
ID=55401141
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/934,084 Abandoned US20160058378A1 (en) | 2013-10-24 | 2015-11-05 | System and method for providing an interpreted recovery score |
Country Status (1)
Country | Link |
---|---|
US (1) | US20160058378A1 (en) |
Citations (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5890128A (en) * | 1996-03-04 | 1999-03-30 | Diaz; H. Benjamin | Personalized hand held calorie computer (ECC) |
US6104947A (en) * | 1994-12-29 | 2000-08-15 | Polar Electro Oy | Method and apparatus for determining exertion levels in fitness or athletic training and for determining the stress caused by training |
US6269339B1 (en) * | 1997-04-04 | 2001-07-31 | Real Age, Inc. | System and method for developing and selecting a customized wellness plan |
US20030013995A1 (en) * | 2000-01-18 | 2003-01-16 | Yoshitake Oshima | Fat combustion value calculating method, fat combustion value calculating device, and exercise machine |
US6516222B2 (en) * | 2000-01-05 | 2003-02-04 | Tanita Corporation | Apparatus for determining degree of fatigue of human body |
US6554776B1 (en) * | 2001-11-21 | 2003-04-29 | Medical Graphics Corporation | Method for detecting anaerobic threshold and prescribing a training zone to maximize fat utilization or improved cardiovascular fitness |
US20040019290A1 (en) * | 2002-07-24 | 2004-01-29 | Tanita Corporation | Muscle fatigue measuring equipment |
US6783501B2 (en) * | 2001-07-19 | 2004-08-31 | Nihon Seimitsu Sokki Co., Ltd. | Heart rate monitor and heart rate measuring method |
US20050228239A1 (en) * | 2004-04-12 | 2005-10-13 | Frank Shallenberger | Method for analyzing the biological age of a subject |
US20060079800A1 (en) * | 2004-07-01 | 2006-04-13 | Mega Elektroniikka Oy | Method and device for measuring exercise level during exercise and for measuring fatigue |
US20070197881A1 (en) * | 2006-02-22 | 2007-08-23 | Wolf James L | Wireless Health Monitor Device and System with Cognition |
US20070276282A1 (en) * | 2006-05-24 | 2007-11-29 | Casio Computer Co., Ltd. | Living body information measuring apparatus and system |
US20100079291A1 (en) * | 2008-09-26 | 2010-04-01 | Muve, Inc. | Personalized Activity Monitor and Weight Management System |
US20100137748A1 (en) * | 2006-05-29 | 2010-06-03 | Motoki Sone | Fatigue estimation device and electronic apparatus having the fatigue estimation device mounted thereon |
US20110077472A1 (en) * | 2009-09-11 | 2011-03-31 | Hyperion Biotechnology | Methods and compositions for biomarkers of fatigue, fitness and physical performance capacity |
US20110081037A1 (en) * | 2009-10-07 | 2011-04-07 | Samsung Electronics Co., Ltd. | Earphone device having biological information measuring apparatus |
US20110190645A1 (en) * | 2010-02-02 | 2011-08-04 | Jeff Hunt | Recovery Determination Methods And Recovery Determination Apparatuses |
US20110288424A1 (en) * | 2009-10-29 | 2011-11-24 | Etsuko Kanai | Human fatigue assessment device and human fatigue assessment method |
US20130018592A1 (en) * | 2011-07-15 | 2013-01-17 | Pulsar Informatics, Inc. | Systems and Methods for Inter-Population Neurobehavioral Status Assessment Using Profiles Adjustable to Testing Conditions |
US20130158423A1 (en) * | 2011-12-14 | 2013-06-20 | Rijuven Corporation | Mobile wellness device |
US20130325396A1 (en) * | 2010-09-30 | 2013-12-05 | Fitbit, Inc. | Methods and Systems for Metrics Analysis and Interactive Rendering, Including Events Having Combined Activity and Location Information |
US20140228649A1 (en) * | 2012-07-30 | 2014-08-14 | Treefrog Developments, Inc. | Activity monitoring |
US8812428B2 (en) * | 2010-09-20 | 2014-08-19 | Pulsar Informatics, Inc. | Systems and methods for assessment of fatigue-related contextual performance using historical incident data |
US20140270375A1 (en) * | 2013-03-15 | 2014-09-18 | Focus Ventures, Inc. | System and Method for Identifying and Interpreting Repetitive Motions |
US20150045693A1 (en) * | 2012-03-23 | 2015-02-12 | Juno Medical Llc | Measuring Device and Method for Indicating Level of Fatigue |
US9087446B2 (en) * | 2012-12-28 | 2015-07-21 | Mitsubishi Electric Corporation | Life management apparatus and life management method |
US20150208933A1 (en) * | 2012-09-28 | 2015-07-30 | Rohm Co., Ltd. | Pulse wave sensor |
US20150342480A1 (en) * | 2014-05-30 | 2015-12-03 | Microsoft Corporation | Optical pulse-rate sensing |
US20160007933A1 (en) * | 2013-10-24 | 2016-01-14 | JayBird LLC | System and method for providing a smart activity score using earphones with biometric sensors |
US20160027324A1 (en) * | 2013-10-24 | 2016-01-28 | JayBird LLC | System and method for providing lifestyle recommendations using earphones with biometric sensors |
US20160022200A1 (en) * | 2013-10-24 | 2016-01-28 | JayBird LLC | System and method for providing an intelligent goal recommendation for activity level using earphones with biometric sensors |
US20160029125A1 (en) * | 2013-10-24 | 2016-01-28 | JayBird LLC | System and method for anticipating activity using earphones with biometric sensors |
US20160026856A1 (en) * | 2013-10-24 | 2016-01-28 | JayBird LLC | System and method for identifying performance days using earphones with biometric sensors |
US20160023047A1 (en) * | 2013-10-24 | 2016-01-28 | JayBird LLC | System and method for providing a training load schedule for peak performance positioning using earphones with biometric sensors |
US20160030809A1 (en) * | 2013-10-24 | 2016-02-04 | JayBird LLC | System and method for identifying fitness cycles using earphones with biometric sensors |
US20160029974A1 (en) * | 2013-10-24 | 2016-02-04 | JayBird LLC | System and method for tracking biological age over time based upon heart rate variability using earphones with biometric sensors |
US20160199001A1 (en) * | 2015-01-10 | 2016-07-14 | Cheng Uei Precision Industry Co., Ltd. | Heart rate detection earphone |
US20160249133A1 (en) * | 2013-10-07 | 2016-08-25 | Gn Netcom A/S | Earphone device with optical sensor |
US20160287108A1 (en) * | 2015-03-30 | 2016-10-06 | Bose Corporation | Light guide system for physiological sensor |
2015-11-05: US 14/934,084 filed in the United States; published as US20160058378A1; status: Abandoned
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11818552B2 (en) | 2006-06-14 | 2023-11-14 | Staton Techiya Llc | Earguard monitoring system |
US12047731B2 (en) | 2007-03-07 | 2024-07-23 | Staton Techiya Llc | Acoustic device and methods |
US12249326B2 (en) | 2007-04-13 | 2025-03-11 | St Case1Tech, Llc | Method and device for voice operated control |
US11683643B2 (en) | 2007-05-04 | 2023-06-20 | Staton Techiya Llc | Method and device for in ear canal echo suppression |
US11856375B2 (en) | 2007-05-04 | 2023-12-26 | Staton Techiya Llc | Method and device for in-ear echo suppression |
US11889275B2 (en) | 2008-09-19 | 2024-01-30 | Staton Techiya Llc | Acoustic sealing analysis system |
US12183341B2 (en) | 2008-09-22 | 2024-12-31 | St Casestech, Llc | Personalized sound management and method |
US11693617B2 (en) | 2014-10-24 | 2023-07-04 | Staton Techiya Llc | Method and device for acute sound detection and reproduction |
USD777186S1 (en) * | 2014-12-24 | 2017-01-24 | Logitech Europe, S.A. | Display screen or portion thereof with a graphical user interface |
US11917367B2 (en) | 2016-01-22 | 2024-02-27 | Staton Techiya Llc | System and method for efficiency among devices |
US11595762B2 (en) * | 2016-01-22 | 2023-02-28 | Staton Techiya Llc | System and method for efficiency among devices |
US10420474B2 (en) * | 2016-02-01 | 2019-09-24 | Logitech Europe, S.A. | Systems and methods for gathering and interpreting heart rate data from an activity monitoring device |
US20170215742A1 (en) * | 2016-02-01 | 2017-08-03 | Logitech Europe, S.A. | Systems and methods for gathering and interpreting heart rate data from an activity monitoring device |
US12239427B2 (en) | 2017-06-04 | 2025-03-04 | Apple Inc. | Heartrate tracking techniques |
US10874313B2 (en) | 2017-06-04 | 2020-12-29 | Apple Inc. | Heartrate tracking techniques |
WO2018226305A1 (en) * | 2017-06-04 | 2018-12-13 | Apple Inc. | Heartrate tracking techniques |
US11690522B2 (en) | 2017-06-04 | 2023-07-04 | Apple Inc. | Heartrate tracking techniques |
US11083396B2 (en) * | 2017-07-14 | 2021-08-10 | Seiko Epson Corporation | Portable electronic apparatus |
US20190015017A1 (en) * | 2017-07-14 | 2019-01-17 | Seiko Epson Corporation | Portable electronic apparatus |
US11331019B2 (en) | 2017-08-07 | 2022-05-17 | The Research Foundation For The State University Of New York | Nanoparticle sensor having a nanofibrous membrane scaffold |
US11140486B2 (en) | 2017-11-28 | 2021-10-05 | Samsung Electronics Co., Ltd. | Electronic device operating in associated state with external audio device based on biometric information and method therefor |
US12248730B2 (en) | 2018-03-10 | 2025-03-11 | The Diablo Canyon Collective Llc | Earphone software and hardware |
US20210290103A1 (en) * | 2018-07-16 | 2021-09-23 | Medical Device Development Group B.V. | User device for registering disease related states of a user |
WO2021197911A1 (en) | 2020-04-02 | 2021-10-07 | Koninklijke Philips N.V. | Device, system and method for generating information on musculoskeletal recovery of a subject |
EP3888547A1 (en) * | 2020-04-02 | 2021-10-06 | Koninklijke Philips N.V. | Device, system and method for generating information on musculoskeletal recovery of a subject |
WO2022129879A1 (en) * | 2020-12-15 | 2022-06-23 | Prevayl Innovations Limited | Method and system for generating a recovery score for a user |
CN112914536A (en) * | 2021-03-24 | 2021-06-08 | 平安科技(深圳)有限公司 | Motion state detection method and device, computer equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20160058378A1 (en) | System and method for providing an interpreted recovery score | |
US20160007933A1 (en) | System and method for providing a smart activity score using earphones with biometric sensors | |
US9622685B2 (en) | System and method for providing a training load schedule for peak performance positioning using earphones with biometric sensors | |
US20160030809A1 (en) | System and method for identifying fitness cycles using earphones with biometric sensors | |
US10078734B2 (en) | System and method for identifying performance days using earphones with biometric sensors | |
US20160027324A1 (en) | System and method for providing lifestyle recommendations using earphones with biometric sensors | |
US20160051184A1 (en) | System and method for providing sleep recommendations using earbuds with biometric sensors | |
US12070297B2 (en) | Photoplethysmography-based pulse wave analysis using a wearable device | |
US20170049335A1 (en) | Earphones with biometric sensors | |
US10559220B2 (en) | Systems and methods for creating a neural network to provide personalized recommendations using activity monitoring devices with biometric sensors | |
US10292606B2 (en) | System and method for determining performance capacity | |
US9526947B2 (en) | Method for providing a training load schedule for peak performance positioning | |
US10327674B2 (en) | Biometric monitoring device with immersion sensor and swim stroke detection and related methods | |
US10112075B2 (en) | Systems, methods and devices for providing a personalized exercise program recommendation | |
US20160029974A1 (en) | System and method for tracking biological age over time based upon heart rate variability using earphones with biometric sensors | |
US10178973B2 (en) | Wearable heart rate monitor | |
US20190082985A1 (en) | Optical device for determining pulse rate | |
CN108742559B (en) | Wearable heart rate monitor | |
US20160051185A1 (en) | System and method for creating a dynamic activity profile using earphones with biometric sensors | |
US20160029125A1 (en) | System and method for anticipating activity using earphones with biometric sensors | |
US10129628B2 (en) | Systems, methods and devices for providing an exertion recommendation based on performance capacity | |
US20150190072A1 (en) | Systems and methods for displaying and interacting with data from an activity monitoring device | |
US9864843B2 (en) | System and method for identifying performance days | |
US20160022200A1 (en) | System and method for providing an intelligent goal recommendation for activity level using earphones with biometric sensors | |
US20150118669A1 (en) | System and method for providing an intelligent goal recommendation for activity level |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: JAYBIRD LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WISBEY, BEN;SHEPHERD, DAVID;DUDDY, STEPHEN;SIGNING DATES FROM 20151107 TO 20151110;REEL/FRAME:037002/0453 |
|
AS | Assignment |
Owner name: LOGITECH EUROPE, S.A., SWITZERLAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JAYBIRD, LLC;REEL/FRAME:039414/0683 Effective date: 20160719 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |