WO2022146424A1 - Device tracking with angle-of-arrival data - Google Patents


Info

Publication number
WO2022146424A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
hwd
aoa
pose
inertial
Prior art date
Application number
PCT/US2020/067438
Other languages
French (fr)
Inventor
Steven Benjamin GOLDBERG
Qiyue John Zhang
Dongeek Shin
Clayton KIMBER
Original Assignee
Google Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google Llc filed Critical Google Llc
Priority to CN202080105193.0A priority Critical patent/CN116113911A/en
Priority to US18/035,970 priority patent/US20230418369A1/en
Priority to PCT/US2020/067438 priority patent/WO2022146424A1/en
Priority to EP20848822.1A priority patent/EP4185938A1/en
Publication of WO2022146424A1 publication Critical patent/WO2022146424A1/en


Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G01S5/02585 Hybrid positioning by combining or switching between measurements derived from different systems, at least one of the measurements being a non-radio measurement
    • G01S5/0278 Position-fixing using radio waves involving statistical or probabilistic considerations
    • G01S5/0294 Trajectory determination or predictive filtering, e.g. target tracking or Kalman filtering
    • G02B27/0093 Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/0172 Head-up displays, head mounted, characterised by optical features
    • G06F1/163 Wearable computers, e.g. on a belt
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F3/012 Head tracking input arrangements
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • some computer systems employ wearable devices as part of the user interface.
  • some computer systems employ a head-wearable or head-mounted device to display information, and one or more wearable devices (e.g., a ring, bracelet, or watch) to provide input information from the user.
  • Many of these systems rely on pose tracking to determine the input information, the information to display, or a combination thereof. Accordingly, accurate and reliable pose tracking can lead to a more immersive and enjoyable user experience.
  • conventional pose tracking systems can require a relatively high amount of computing resources, relatively bulky or inflexible tracking equipment, or a combination thereof. These conventional systems consume a high amount of power and can require special equipment setups, such as for optical tracking systems employing one or more cameras external to the computer system.
  • the proposed solution in particular relates to a method comprising determining, or generating, angle of arrival (AOA) data based on the angle of arrival of a received signal; and identifying a relative pose between a head-wearable display (HWD) and a wearable device based on the AOA data. Identifying a relative pose between a head-wearable display (HWD) and a wearable device may relate to identifying a position and/or orientation of the HWD relative to the wearable device.
  • the HWD and the wearable device both comprise at least one signal transmitter and at least one signal receiver.
  • the HWD and/or the wearable device comprise at least one transceiver for transmitting and receiving signals.
  • a receiver or transceiver of the HWD includes a plurality of antennas, wherein each of the plurality of antennas receives a signal from the wearable device, such as a tag signal.
  • At least one processor of the HWD which, for example, is part of a signal module at the HWD, may then identify phase differences between the received signals and identify the AOA of the signal based on the identified phase differences.
  • the AOA data may include at least one of a first angle representing a horizontal angle of arrival of the signal and a second angle representing a vertical angle of arrival of the signal.
  • the first and second angles may respectively relate to an angle between a) a vector being calculated based on the phase differences and indicating the direction along which the received signal travelled from a transmitter to the receiver and b) a horizontal or vertical axis.
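The phase-difference computation described above can be sketched as follows for a single two-antenna baseline. This is a minimal illustration, assuming a half-wavelength antenna spacing and a nominal UWB carrier frequency; the function name, parameters, and carrier value are hypothetical, not taken from this document.

```python
import math

# Assumed carrier: a nominal UWB centre frequency (the document does not
# specify the radio band); wavelength = c / f, roughly 4.6 cm here.
SPEED_OF_LIGHT = 299_792_458.0          # m/s
CARRIER_FREQ_HZ = 6.5e9                 # illustrative UWB channel
WAVELENGTH = SPEED_OF_LIGHT / CARRIER_FREQ_HZ

def aoa_from_phase_difference(phase_diff_rad, spacing_m=WAVELENGTH / 2):
    """Estimate the angle of arrival (radians from broadside) from the
    phase difference between two antennas spaced spacing_m apart.

    A plane wave arriving at angle theta travels an extra
    spacing_m * sin(theta) to the far antenna, producing a phase
    difference of 2*pi*spacing_m*sin(theta)/wavelength; invert that.
    """
    sin_theta = phase_diff_rad * WAVELENGTH / (2 * math.pi * spacing_m)
    sin_theta = max(-1.0, min(1.0, sin_theta))  # tolerate noise at the edges
    return math.asin(sin_theta)
```

With half-wavelength spacing, a phase difference of π/2 corresponds to an arrival angle of 30 degrees from broadside; a horizontal and a vertical angle of arrival follow by applying the same computation to antenna pairs on perpendicular baselines.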
  • the first signal may be an anchor signal and the response signal may be a tag signal.
  • the response signal may also be used for determining the AOA.
  • a round trip time (RTT) is determined based on the exchanged signals.
  • the HWD transmits a first signal, such as a UWB signal, to a transceiver of the wearable device, and records a time of transmission for the first signal.
  • the wearable device waits for a specified amount of time, and then transmits a response signal to the HWD.
  • the HWD determines the signal receive time and, based on the difference between the first signal transmit time and the response signal receive time (accounting for the wearable device's specified wait time), determines the distance between the wearable device and the HWD.
  • Generating the pose data, referred to as AOA data or AOA pose data, for the HWD may then, for example, also include using the AOA of the response signal and the determined distance between the HWD and the wearable device.
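The round-trip exchange above reduces to a few lines of arithmetic: the one-way time of flight is the round-trip time minus the wearable device's fixed reply delay, halved. A sketch, with hypothetical parameter names (this document does not name these quantities):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def rtt_distance(t_transmit, t_receive, reply_delay):
    """Distance between the HWD and the wearable device from a two-way
    signal exchange.

    t_transmit:  when the HWD sent the first (anchor) signal, seconds.
    t_receive:   when the response (tag) signal arrived back, seconds.
    reply_delay: the specified time the wearable waited before replying.
    """
    time_of_flight = (t_receive - t_transmit - reply_delay) / 2.0
    return SPEED_OF_LIGHT * time_of_flight
```

For example, a wearable 3 m away that waits 1 ms before replying produces a receive time of 1 ms plus about 20 ns of flight time.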
  • the method may additionally comprise receiving inertial data from an inertial measurement unit (IMU) of the HWD, wherein identifying the relative pose comprises identifying the relative pose based on the inertial data.
  • An IMU may comprise one or more accelerometers, gyroscopes, and magnetometers, or any combination thereof that generate electronic signals indicating one or more of the specific force, angular rate, and orientation of the HWD, or any combination thereof. Based on these electronic signals, the IMU may indicate an inertial pose of the HWD.
  • the IMU indicates the inertial pose in a 3-dimensional (3-D) rotational frame of reference (e.g., pitch, roll, and yaw) associated with the HWD.
  • the IMU indicates the inertial pose in a 3D translational frame of reference (e.g., along x, y, and z axes of a Cartesian framework) associated with the HWD.
  • the IMU indicates the inertial pose in both the rotational and translational frame of reference, thus indicating a six degree of freedom (6 DoF) pose for the HWD.
  • fused pose data may be generated by fusing the AOA data with the inertial data, wherein identifying the relative pose comprises identifying the relative pose based on the fused pose data.
  • Fusing the AOA data with the inertial data in this context may relate to using both the AOA data and the inertial data as an input for a stochastic estimation process which is performed by one or more processors of the HWD or another device connected to the HWD and based on which a pose of the HWD relative to the wearable device is estimated.
  • the estimated pose of the HWD relative to the wearable device is indicated by the fused pose data.
  • the fused pose data may be further processed by at least one processor, for example, for modifying augmented reality (AR) or virtual reality (VR) content to be displayed to a user by the HWD.
  • the stochastic estimation process may for example comprise a Kalman filter, a particle filter, a weighted least square bundle adjustment and/or a combination thereof.
  • a weighted least square bundle adjustment in this context may relate to minimizing the mean squared distances between the observed pose data and projected pose data.
  • fusing the AOA data with the inertial data may comprise fusing the AOA data with the inertial data based on a machine learning model.
  • a convolutional neural engine may be used that exploits the temporal coherence of each data stream (i.e., the AOA data and the IMU data) observed in each spatial dimension.
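As one concrete, deliberately simplified illustration of such a stochastic estimation process, the sketch below fuses the two data streams along a single axis with a constant-velocity Kalman filter: IMU acceleration drives the predict step, and the AOA-derived position is the noisy measurement. The class name, noise values, and 1-D reduction are illustrative assumptions, not the design this document claims.

```python
class AoaImuKalman1D:
    """Minimal 1-D Kalman filter fusing inertial and AOA data.

    State: [position, velocity]. An IMU acceleration sample propagates
    the state; an AOA-derived position corrects it.
    """

    def __init__(self, q=1e-3, r=0.05):
        self.x = [0.0, 0.0]                 # position (m), velocity (m/s)
        self.p = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.q = q                          # process noise (IMU drift)
        self.r = r                          # measurement noise (AOA jitter)

    def predict(self, accel, dt):
        """Propagate the state with an IMU acceleration sample."""
        pos, vel = self.x
        self.x = [pos + vel * dt + 0.5 * accel * dt * dt, vel + accel * dt]
        p = self.p
        # P = F P F^T + Q for F = [[1, dt], [0, 1]], Q = diag(q, q)
        p00 = p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + self.q
        self.p = [[p00, p[0][1] + dt * p[1][1]],
                  [p[1][0] + dt * p[1][1], p[1][1] + self.q]]

    def update(self, aoa_position):
        """Correct the state with an AOA-derived position measurement."""
        innovation = aoa_position - self.x[0]
        s = self.p[0][0] + self.r                      # innovation variance
        k0, k1 = self.p[0][0] / s, self.p[1][0] / s    # Kalman gain
        self.x = [self.x[0] + k0 * innovation, self.x[1] + k1 * innovation]
        p = self.p
        self.p = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
```

Run per axis: call `predict` at the IMU rate and `update` whenever a fresh AOA fix arrives; the estimate tracks the AOA position while the IMU smooths over AOA noise and line-of-sight dropouts.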
  • the signal based on which the AOA is determined may be an ultra-wideband (UWB) signal. Accordingly, generating the AOA data may be based on a transmitted UWB signal (e.g., from the wearable device) which is received by the HWD.
  • the proposed solution further relates to a computer system, comprising a head wearable display (HWD) configured to determine angle of arrival (AOA) data based on an angle of arrival of a received signal; and a processor configured to identify a relative pose between the HWD and a wearable device based on the AOA data.
  • the proposed solution further relates to a head-wearable display (HWD) which identifies a pose relative to a wearable device by determining an angle of arrival (AOA) of a signal received at the HWD.
  • the HWD may comprise a receiver or transceiver which includes a plurality of antennas, wherein each of the plurality of antennas receives a signal from a wearable device, such as a tag signal.
  • At least one processor of the HWD which, for example, is part of a signal module at the HWD, may then identify phase differences between the received signals and identify the AOA of the signal based on the identified phase differences.
  • the HWD identifies the pose based on a combination of AOA data generated based on the AOA and inertial data generated by an inertial measurement unit (IMU).
  • the HWD may fuse the AOA data with the inertial data using data integration techniques such as one or more of stochastic estimation (e.g., a Kalman filter), a machine learning model, and the like, or any combination thereof.
  • a computer device associated with the HWD can employ the fused data to identify a pose of the HWD (e.g., a six degree of freedom (6 DoF) pose) and can use the identified pose to modify virtual reality or augmented reality content implemented by the computer device, thereby providing an immersive and enjoyable experience for a user using the HWD.
  • a proposed computer system and a proposed HWD may be configured to implement an embodiment of a proposed method. Accordingly, features discussed herein in the context with an embodiment of proposed method shall also apply to an embodiment of a proposed computer system and a proposed HWD and vice versa.
  • FIG. 1 is a diagram of a computer system that employs pose tracking for wearable devices using integrated angle-of-arrival (AOA) and inertial data in accordance with some embodiments.
  • FIG. 2 is a block diagram of a head-wearable display of the computer system of FIG. 1 in accordance with some embodiments.
  • FIG. 3 is a block diagram of a data fuse module of FIG. 2 that fuses AOA and inertial data for pose tracking in accordance with some embodiments.
  • FIG. 4 is a diagram of a computer system employing at least two AOA tracking modules for pose tracking in accordance with some embodiments.
  • FIG. 5 is a diagram of a computer system employing at least two AOA tracking modules for pose tracking in accordance with some embodiments.
  • FIG. 6 is a block diagram of the computer system of FIG. 1 in accordance with some embodiments.
  • a computer device associated with the HWD can employ the fused data to identify a pose of the HWD (e.g., a six degree of freedom (6 DoF) pose) and can use the identified pose to modify virtual reality or augmented reality content implemented by the computer device, thereby providing an immersive and enjoyable experience for a user. Further, the HWD can generate pose data based on the fused data while consuming relatively little power, while still ensuring relatively accurate pose data, thereby improving the overall user experience with the HWD.
  • AOA data supplied by a radio interface can be used to determine a pose of an HWD or other device in three dimensions, and typically experiences low drift, so that the data is reliable over time.
  • AOA data is typically reliable only when there is line-of-sight (LOS) between the radio transmitter and receiver, and the accuracy of the data is impacted by signal noise.
  • the AOA data can have reduced reliability in cases of high dynamic motion.
  • inertial data can be used to determine an HWD pose in six dimensions, is not impacted by LOS or signal noise issues, and can provide accurate pose data even in cases of high dynamic motion.
  • a computer system employs both a display device, such as an HWD, and a wearable input device, such as a ring, bracelet, or watch, and determines a relative pose between the display device and the input device using fused AOA and inertial pose data.
  • the computer system can employ the relative pose data to identify user input information and, based on the input information, modify one or more aspects of the system, such as initiating or terminating execution of a software program, changing one or more aspects of a virtual or augmented reality environment, and the like.
  • FIG. 1 illustrates a computer system 100 including an HWD 102 and a wearable input device 104.
  • the HWD 102 is a binocular device having a form factor substantially similar to eyeglasses.
  • the HWD 102 includes optical and electronic components that together support display of information to a user on behalf of the computer system 100.
  • the HWD 102 includes a microdisplay or other display module that generates display light based on supplied image information.
  • the image information is generated by a graphics processing unit (GPU) (not shown) of the computer system 100 based on image information generated by one or more computer programs executing at, for example, one or more central processing units (CPUs, not shown).
  • the display module provides the display light to one or more optical components, such as one or more lightguides, mirrors, beamsplitters, polarizers, combiners, lenses, and the like, that together direct the image light to a display area 105 of the HWD 102.
  • the HWD includes two lenses, each corresponding to a different eye of the user, with each lens having a corresponding display area 105 where the display light is directed.
  • the computer system 100 provides different display light to each display area 105, so that different information is provided to each eye of the user. This allows the computer system 100 to support different visual effects, such as stereoscopic effects.
  • the wearable input device 104 is a ring, bracelet, watch, or other electronic device that has a wearable form factor. For purposes of description, it is assumed that the wearable input device 104 is a ring.
  • the wearable input device 104 includes electronic components that together are configured to provide input information to the computer system 100 based on a user’s interaction with the wearable input device 104.
  • the wearable input device 104 includes one or more buttons, touchscreens, switches, joysticks, motion detectors, or other input components that can be manipulated by a user to provide input information.
  • the wearable input device 104 further includes one or more wireless interfaces, such as a WiFi interface, a Bluetooth® interface, and the like, to communicate the input information to a processor (not shown) or other component of the computer system 100.
  • the computer system 100 is generally configured to identify relative poses between the HWD 102 and the wearable input device 104.
  • the computer system 100 can employ the relative poses to identify user input information from a user.
  • the computer system 100 can identify user input information based on the distance between the HWD 102 and the wearable input device 104, an angle of the wearable input device 104 relative to a plane associated with the HWD 102, a direction of a vector between a center of the wearable input device and the HWD 102, and the like, or any combination thereof.
  • To determine the relative pose between the HWD 102 and the wearable input device 104, the computer system 100 employs a fused combination of inertial data and AOA data. To generate the inertial data, the computer system 100 includes an inertial measurement unit (IMU) 108 mounted in a frame of the HWD 102.
  • the IMU 108 is a module including one or more accelerometers, gyroscopes, and magnetometers, or any combination thereof that generate electronic signals indicating one or more of the specific force, angular rate, and orientation of the HWD 102, or any combination thereof. Based on these electronic signals, the IMU 108 indicates an inertial pose of the HWD 102.
  • the IMU 108 indicates the inertial pose in a 3-dimensional (3-D) rotational frame of reference (e.g., pitch, roll, and yaw) associated with the HWD 102.
  • the IMU 108 indicates the inertial pose in a 3D translational frame of reference (e.g., along x, y, and z axes of a Cartesian framework) associated with the HWD 102.
  • the IMU 108 indicates the inertial pose in both the rotational and translational frame of reference, thus indicating a 6 DoF pose for the HWD 102.
  • the computer system 100 includes an ultra-wideband (UWB) module 106 mounted, at least partially, at or on a frame of the HWD 102.
  • the UWB module 106 is generally configured to employ UWB signals to determine a range, or distance between the HWD 102 and the wearable device 104, as well as an angle of arrival for signals communicated by the wearable device 104 and received at the UWB module 106.
  • the UWB module 106 and the wearable device 104 each include a UWB transceiver configured to send and receive UWB signals, wherein each UWB transceiver includes a plurality of antennas.
  • the UWB transceivers employ a handshake process by exchanging specified signals, and the UWB module determines a round trip time (RTT) based on the exchanged signals.
  • the UWB module 106 transmits a UWB signal, referred to as an anchor signal, to the transceiver of the wearable device 104, and records a time of transmission for the anchor signal.
  • the wearable device 104 waits for a specified amount of time, and then transmits a response UWB signal, referred to as a tag signal, to the UWB module 106.
  • the UWB module determines the signal receive time and, based on the difference between the anchor signal transmit time and the tag signal receive time, determines the distance between the wearable device 104 and the HWD 102.
  • the UWB module 106 determines an AOA for the received tag signal.
  • the UWB transceiver of the UWB module 106 includes a plurality of antennas, and each of the plurality of antennas receives the tag signal.
  • the UWB module 106 identifies the phase differences between the received signals and identifies the AOA of the tag signal based on the identified phase differences.
  • the UWB module 106 then identifies pose data, referred to as AOA data or AOA pose data, for the HWD 102 based on the AOA of the tag signal and the distance between the HWD 102 and the wearable device 104.
  • the computer system 100 can fuse the data together to generate fused pose data.
  • FIG. 2 is a block diagram illustrating aspects of the computer system 100 in accordance with some embodiments.
  • the computer system 100 includes the UWB module 106, the IMU 108, and a data fuse module 218.
  • the data fuse module 218 is a module generally configured to fuse IMU data and AOA data to determine fused pose information. Accordingly, the data fuse module 218 can be implemented in dedicated hardware of the computer system 100, by software executing at a processor (not shown at FIG. 2) of the computer system 100, and the like.
  • In operation, the UWB module 106 generates AOA data 215, representing 3-DoF poses of the HWD 102 in a translational frame of reference, while the IMU 108 generates IMU data 216, representing 6-DoF poses of the HWD in translational and rotational frames of reference.
  • the data fuse module 218 is generally configured to fuse the AOA data 215 and the IMU data 216 to generate fused pose data 220, wherein the fused pose is a 6-DoF pose in the translational and rotational frames of reference.
  • in different embodiments, the data fuse module 218 fuses the data in different ways.
  • the data fuse module 218 employs one or more stochastic estimation or approximation techniques, using the AOA data 215 and the IMU data 216 as inputs, to determine properties of a path or curve that represents the changing pose of the HWD 102 over time, relative to the wearable device 104.
  • the data fuse module 218 implements a Kalman filter, a particle filter, a weighted least square bundle adjustment, or other estimation technique to estimate the pose and thus fused pose data 220 of the HWD 102 based on the AOA data 215 and the IMU data 216.
  • the data fuse module 218 employs a machine learning model that has been trained to generate the fused pose data 220 based on the AOA data 215 and the IMU data 216.
  • the data fuse module 218 implements a translational 3-DoF tracker using a convolutional neural engine that exploits the temporal coherence of each data stream (i.e., the AOA data 215 and the IMU data 216), observed in each spatial dimension, with a 3-DoF output layer.
  • the neural engine can be trained using pose data generated in a test environment to determine an initial set of weights, layers, and other factors that govern the behavior of the neural engine. Further, the neural engine can update the weights and other factors over time to further refine the estimation process for generating the fused pose data 220.
  • FIG. 3 is a block diagram illustrating an example of the data fuse module 218 in accordance with some embodiments.
  • the data fuse module 218 includes a coordinate transform module 322, a translational tracker module 324, a rotational tracker module 326, and a merger module 328.
  • the coordinate transform module 322 is generally configured to transform the AOA data 215 into a set of x, y, and z, coordinates in a translational frame of reference.
  • the AOA data 215 is generated as 1) a range r, representing the distance between the wearable device 104 and the HWD 102; 2) an angle θ, representing a horizontal angle of arrival of the tag signal; and 3) an angle φ, representing a vertical angle of arrival of the tag signal.
  • the coordinate transform module transforms the AOA data to x, y, and z coordinates using the following formulas:
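The formulas themselves do not survive in this text. Under one common spherical convention (z pointing forward from the HWD, x right, y up; θ measured in the horizontal plane from the forward axis, φ measured up from the horizontal plane; the document does not specify the axes), the conversion would be:

```python
import math

def aoa_to_cartesian(r, theta, phi):
    """Convert AOA data (range r, horizontal angle theta, vertical
    angle phi, both in radians) into x, y, z translational coordinates.

    Axis convention assumed for illustration only; the published text
    omits the actual formulas.
    """
    x = r * math.cos(phi) * math.sin(theta)  # lateral offset
    y = r * math.sin(phi)                    # vertical offset
    z = r * math.cos(phi) * math.cos(theta)  # forward distance
    return x, y, z
```

For example, a tag 2 m directly ahead (θ = φ = 0) maps to (0, 0, 2); any convention of this family preserves the range, since x² + y² + z² = r².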
  • the translational tracker 324 is a module configured to determine a translational pose of the HWD 102 in the translational frame of reference based on the AOA data 215, as transformed by the coordinate transform module 322, and the translational portion of the IMU data 216.
  • the translational tracker 324 generates the translational pose using one or more stochastic estimation techniques, such as by using a Kalman filter, a particle filter, a weighted least square bundle adjustment, and the like, or a combination thereof.
  • the translational tracker 324 employs a machine learning model, such as a convolutional neural engine that exploits temporal coherence of the input data observed in each spatial dimension, with a 3-DoF output layer, including one or more of 3DoF translational coordinates or 3DoF rotational coordinates.
  • a machine learning model such as a convolutional neural engine that exploits temporal coherence of the input data observed in each spatial dimension
  • a 3-DoF output layer including one or more of 3DoF translational coordinates or 3DoF rotational coordinates.
  • the rotational portion of the IMU data 216 can be employed to determine rotation dynamics, and these rotational dynamics are employed to determine an error model for the translational pose identified by the coordinate transform module.
  • the rotational tracker 326 is a module configured to determine a translational pose of the HWD 102 in the rotational frame of reference based on the IMU data 216.
  • the rotational tracker 326 generates the rotational pose using one or more stochastic estimation techniques, such as by using a Kalman filter, a particle filter, a weighted least square bundle adjustment, and the like, or a combination thereof.
  • the stochastic estimation techniques employed by the rotational tracker 326 are different than stochastic estimation techniques employed by the translational tracker 324.
  • the merger module 328 is configured to merge the translational pose generated by the translational tracker 324 and the rotational pose generated by the rotational tracker 326. In some embodiments, the merger module 328 merges the poses by placing the poses in a data structure configured to store 6-DoF pose data, including translational (x, y, z) pose data and rotational (pitch, roll, and yaw) data.
  • the AOA data 215 represents a pose in a translational frame of reference having multiple dimensions.
  • Examples of computer systems supporting generation of multi-dimensional pose data are illustrated at FIGs. 4 and 5 in accordance with some embodiments.
  • FIG. 4 illustrates a computer system 400 that includes an HWD 402 and the wearable device 104 in accordance with some embodiments.
  • the HWD 402 is configured similarly to the HWD 102 of FIG. 1, including the UWB module 106 and an IMU 408.
  • the HWD 402 includes an additional UWB module 430, located at or near the center of the HWD 402, and between the two lenses of the HWD 402 (e.g., at or near a bridge section of the HWD 402 configured to be placed over the nose of the user).
  • computer system 400 generates AOA data by transmitting a UWB anchor signal from the UWB module 106 to the wearable device 104.
  • the wearable device 104 transmits a tag signal, as described above.
  • the tag signal is received via antennas at each of the UWB modules 106 and 430.
  • the UWB module 106 determines a phase difference between the received tag signals, and based on the phase difference, determines the horizontal angle of arrival for the tag signal, θ.
  • FIG. 5 illustrates a computer system 500 that includes an HWD 502 and the wearable device 104 in accordance with some embodiments.
  • the HWD 502 is configured similarly to the HWD 102 of FIG. 1, including the UWB module 106 and an IMU 508.
  • the HWD 502 includes an additional UWB module 530, similar to the UWB module 430 of FIG. 4, and located at or near the center of the HWD 502, and between the two lenses of the HWD 502.
  • the HWD 502 includes another UWB module 532, located below the UWB module 106, along a side of a lens of the HWD 502, opposite the UWB module 530.
  • the computer system 100, in different embodiments, is implemented at any of a variety of devices, such as a desktop or laptop computer, a server, a tablet, a smartphone, a game console, and the like.
  • the computer system 100 includes additional modules and components, not illustrated at FIG. 6, to support execution of instructions and in particular execution of the AR/VR content 642.
  • the computer system 100 includes one or more additional processing units, such as one or more graphics processing units (GPUs), memory controllers, input/output controllers, network interfaces, memory modules, and the like.
  • the computer system 100 implements the AR/VR content 642 by executing a corresponding set of instructions that, when executed at the processor 640, generates image frames for display at the HWD 102.
  • the computer system 100 is configured to modify the AR/VR content 642, and the corresponding image frames, based on the fused pose data 220.
  • the processor 640 modifies the AR/VR content 642, based on a corresponding set of instructions executing at the processor 640.
  • the AR/VR content 642 can be updated to allow the user to see different portions of a virtual or augmented environment, to interact with virtual objects in the virtual or augmented environment, to initiate or terminate execution of computer programs or applications, and the like.
  • the computer readable storage medium may be embedded in the computing system (e.g., system RAM or ROM), fixedly attached to the computing system (e.g., a magnetic hard drive), removably attached to the computing system (e.g., an optical disc or Universal Serial Bus (USB)-based Flash memory) or coupled to the computer system via a wired or wireless network (e.g., network accessible storage (NAS)).


Abstract

A wearable device, such as a head-wearable display (HWD), identifies a device pose based on a combination of angle-of-arrival (AOA) data generated by, for example, an ultra-wideband (UWB) positioning module, and inertial data generated by an inertial measurement unit (IMU). The HWD fuses the AOA data with the inertial data using data integration techniques such as one or more of stochastic estimation (e.g., a Kalman filter), a machine learning model, and the like, or any combination thereof. A computer device associated with the HWD can employ the fused data to identify a pose of the HWD (e.g., a six degree of freedom (6 DoF) pose) and can use the identified pose to modify virtual reality or augmented reality content implemented by the computer device, thereby providing an immersive and enjoyable experience for a user.

Description

DEVICE TRACKING WITH ANGLE-OF-ARRIVAL DATA
BACKGROUND
To provide a more immersive and enjoyable user experience, some computer systems employ wearable devices as part of the user interface. For example, some computer systems employ a head-wearable or head-mounted device to display information, and one or more wearable devices (e.g., a ring, bracelet, or watch) to provide input information from the user. Many of these systems rely on pose tracking to determine the input information, the information to display, or a combination thereof. Accordingly, accurate and reliable pose tracking can lead to a more immersive and enjoyable user experience. However, conventional pose tracking systems can require a relatively high amount of computing resources, relatively bulky or inflexible tracking equipment, or a combination thereof. These conventional systems consume a high amount of power, and can require special equipment setups, such as optical tracking systems employing one or more cameras external to the computer system.
SUMMARY
The proposed solution in particular relates to a method comprising determining - or generating - angle of arrival (AOA) data based on the angle of arrival of a received signal; and identifying a relative pose between a head-wearable display (HWD) and a wearable device based on the AOA data. Identifying a relative pose between a head-wearable display (HWD) and a wearable device may relate to identifying a position and/or orientation of the HWD relative to the wearable device.
In an exemplary embodiment, the HWD and the wearable device both comprise at least one signal transmitter and at least one signal receiver. For example, the HWD and/or the wearable device comprise at least one transceiver for transmitting and receiving signals. For example, in some embodiments, a receiver or transceiver of the HWD includes a plurality of antennas, wherein each of the plurality of antennas receives a signal from the wearable device, such as a tag signal. At least one processor of the HWD, which, for example, is part of a signal module at the HWD, may then identify phase differences between the received signals and identify the AOA of the signal based on the identified phase differences.
The AOA data may include at least one of a first angle representing a horizontal angle of arrival of the signal and a second angle representing a vertical angle of arrival of the signal. The first and second angles may respectively relate to an angle between a) a vector being calculated based on the phase differences and indicating the direction along which the received signal travelled from a transmitter to the receiver and b) a horizontal or vertical axis.
In some embodiments, determining/generating the AOA data further comprises determining a distance, or range, between the HWD and the wearable device. Determining the distance between the HWD and the wearable device may include measuring a time difference between transmitting a first signal from the HWD to the wearable device and receiving a second signal (response signal) from the wearable device at the HWD. For example, the first signal may be an anchor signal and the response signal may be a tag signal. The response signal may also be used for determining the AOA. In an exemplary embodiment, a round trip time (RTT) is determined based on the exchanged signals. For example, the HWD transmits a first signal, such as a UWB signal, to a transceiver of the wearable device, and records a time of transmission for the first signal. In response to receiving the first signal, the wearable device waits for a specified amount of time, and then transmits a response signal to the HWD. In response to receiving the response signal, the HWD determines the signal receive time and, based on the difference between the first signal transmit time and the response signal receive time, determines the distance between the wearable device and the HWD. Generating the pose data, referred to as AOA data or AOA pose data, for the HWD may then also include, for example, using the AOA of the response signal and the determined distance between the HWD and the wearable device.
In an exemplary embodiment, the method may additionally comprise receiving inertial data from an inertial measurement unit (IMU) of the HWD, wherein identifying the relative pose comprises identifying the relative pose based on the inertial data. Accordingly, for identifying the relative pose between the HWD and the wearable device measurements from an IMU of the HWD are taken into account. An IMU may comprise one or more accelerometers, gyroscopes, and magnetometers, or any combination thereof that generate electronic signals indicating one or more of the specific force, angular rate, and orientation of the HWD, or any combination thereof. Based on these electronic signals, the IMU may indicate an inertial pose of the HWD. In some embodiments, the IMU indicates the inertial pose in a 3-dimensional (3-D) rotational frame of reference (e.g., pitch, roll, and yaw) associated with the HWD. In other embodiments, the IMU indicates the inertial pose in a 3D translational frame of reference (e.g., along x, y, and z axes of a cartesian framework) associated with the HWD. In still other embodiments, the IMU indicates the inertial pose in both the rotational and translational frame of reference, thus indicating a six degree of freedom (6 DoF) pose for the HWD.
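To make the role of the inertial data concrete, the following toy sketch (my own illustration, not taken from the disclosure) integrates angular rate and acceleration into a pose, and shows the drift behavior that motivates combining the IMU with another data source: an uncorrected accelerometer bias grows quadratically in the position estimate.

```python
import numpy as np

def integrate_imu(accels, gyros, dt):
    """Naive IMU dead reckoning: integrate angular rate into orientation
    (yaw only, for brevity) and acceleration into position. Illustrates
    why an IMU-only pose drifts: any constant bias in `accels` grows
    quadratically in the position estimate."""
    pos = np.zeros(3)
    vel = np.zeros(3)
    yaw = 0.0
    for a, w in zip(accels, gyros):
        yaw += w * dt                  # rotational pose (one of three axes)
        vel += np.asarray(a) * dt      # translational velocity
        pos += vel * dt                # translational pose
    return pos, yaw

# A small accelerometer bias (0.01 m/s^2) left uncorrected for 10 s,
# while the device is actually at rest:
n, dt = 1000, 0.01
biased = [(0.01, 0.0, 0.0)] * n
pos, yaw = integrate_imu(biased, [0.0] * n, dt)
# position error grows roughly as 0.5 * bias * t^2, i.e. ~0.5 m after 10 s
```

The bias model here is deliberately simplistic; real IMUs exhibit time-varying biases and noise on all axes, which is why the disclosure fuses the inertial estimate with AOA data.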
In an exemplary embodiment, fused pose data may be generated by fusing the AOA data with the inertial data, wherein identifying the relative pose comprises identifying the relative pose based on the fused pose data. Fusing the AOA data with the inertial data in this context may relate to using both the AOA data and the inertial data as an input for a stochastic estimation process which is performed by one or more processors of the HWD or another device connected to the HWD and based on which a pose of the HWD relative to the wearable device is estimated. The estimated pose of the HWD relative to the wearable device is indicated by the fused pose data. The fused pose data may be further processed by at least one processor, for example, for modifying augmented reality (AR) or virtual reality (VR) content to be displayed to a user by the HWD.
The stochastic estimation process may for example comprise a Kalman filter, a particle filter, a weighted least square bundle adjustment and/or a combination thereof. A weighted least square bundle adjustment in this context may relate to minimizing the mean squared distances between the observed pose data and projected pose data. Additionally or alternatively, fusing the AOA data with the inertial data may comprise fusing the AOA data with the inertial data based on a machine learning model. For example, a convolutional neural engine may be used that exploits temporal coherence of each data (i.e., the AOA data and the IMU data), observed in each spatial dimension.
The signal based on which the AOA is determined may be an ultra-wideband (UWB) signal. Accordingly, generating the AOA data may be based on a transmitted UWB signal (e.g., from the wearable device) which is received by the HWD.
The proposed solution further relates to a computer system, comprising a head wearable display (HWD) configured to determine angle of arrival (AOA) data based on an angle of arrival of a received signal; and a processor configured to identify a relative pose between the HWD and a wearable device based on the AOA data.
The proposed solution further relates to a head-wearable display (HWD) which identifies a pose relative to a wearable device by determining an angle of arrival (AOA) of a signal received at the HWD. For example, as outlined above, the HWD may comprise a receiver or transceiver which includes a plurality of antennas, wherein each of the plurality of antennas receives a signal from a wearable device, such as a tag signal. At least one processor of the HWD, which, for example, is part of a signal module at the HWD, may then identify phase differences between the received signals and identify the AOA of the signal based on the identified phase differences.
In an exemplary embodiment, the HWD identifies the pose based on a combination of AOA data generated based on the AOA and inertial data generated by an inertial measurement unit (IMU). The HWD may fuse the AOA data with the inertial data using data integration techniques such as one or more of stochastic estimation (e.g., a Kalman filter), a machine learning model, and the like, or any combination thereof. A computer device associated with the HWD can employ the fused data to identify a pose of the HWD (e.g., a six degree of freedom (6 DoF) pose) and can use the identified pose to modify virtual reality or augmented reality content implemented by the computer device, thereby providing an immersive and enjoyable experience for a user using the HWD.
A proposed computer system and a proposed HWD may be configured to implement an embodiment of a proposed method. Accordingly, features discussed herein in the context with an embodiment of proposed method shall also apply to an embodiment of a proposed computer system and a proposed HWD and vice versa.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure may be better understood, and its numerous features and advantages made apparent to those skilled in the art by referencing the accompanying drawings. The use of the same reference symbols in different drawings indicates similar or identical items.
FIG. 1 is a diagram of a computer system that employs pose tracking for wearable devices using integrated angle-of-arrival (AOA) and inertial data in accordance with some embodiments.
FIG. 2 is a block diagram of a head-wearable display of the computer system of FIG. 1 that integrates AOA and inertial data for pose tracking in accordance with some embodiments.
FIG. 3 is a block diagram of a data fuse module of FIG. 2 that fuses AOA and inertial data for pose tracking in accordance with some embodiments.
FIG. 4 is a diagram of a computer system employing at least two AOA tracking modules for pose tracking in accordance with some embodiments.
FIG. 5 is a diagram of a computer system employing at least two AOA tracking modules for pose tracking in accordance with some embodiments.
FIG. 6 is a block diagram of the computer system of FIG. 1 in accordance with some embodiments.
DETAILED DESCRIPTION
FIGs. 1-6 illustrate techniques for identifying a pose of a wearable device, such as a head-wearable display (HWD), based on a combination of angle-of-arrival (AOA) data generated by, for example, an ultra-wideband (UWB) positioning module, and inertial data generated by an inertial measurement unit (IMU). The HWD fuses the AOA data with the inertial data using data integration techniques such as one or more of stochastic estimation (e.g., a Kalman filter), a machine learning model, and the like, or any combination thereof. A computer device associated with the HWD can employ the fused data to identify a pose of the HWD (e.g., a six degree of freedom (6 DoF) pose) and can use the identified pose to modify virtual reality or augmented reality content implemented by the computer device, thereby providing an immersive and enjoyable experience for a user. Further, the HWD can generate pose data based on the fused data while consuming relatively little power, while still ensuring relatively accurate pose data, thereby improving the overall user experience with the HWD.
To illustrate, AOA data supplied by a radio interface, such as a UWB or WIFI interface, can be used to determine a pose of an HWD or other device in three dimensions, and typically experiences low drift, so that the data is reliable over time. However, AOA data is typically reliable only when there is line-of-sight (LOS) between the radio transmitter and receiver, and the accuracy of the data is impacted by signal noise. In addition, the AOA data can have reduced reliability in cases of high dynamic motion. In contrast, inertial data can be used to determine an HWD pose in six dimensions, is not impacted by LOS or signal noise issues, and can provide accurate pose data even in cases of high dynamic motion. However, the accuracy of the inertial data can decay, or drift, with time due to IMU biases and other intrinsic errors. By fusing inertial data and AOA data, the HWD or other device can accurately identify the device pose under a wider variety of conditions, improving device flexibility. Furthermore, the inertial and AOA data can be fused using stochastic estimation, machine learning models, or other techniques that require relatively low computation overhead, reducing power consumption at the HWD or other device. In some embodiments, a computer system employs both a display device, such as an HWD, and a wearable input device, such as a ring, bracelet, or watch, and determines a relative pose between the display device and the input device using fused AOA and inertial pose data. The computer system can employ the relative pose data to identify user input information and, based on the input information, modify one or more aspects of the system, such as initiating or terminating execution of a software program, changing one or more aspects of a virtual or augmented reality environment, and the like.
FIG. 1 illustrates a computer system 100 including an HWD 102 and a wearable input device 104. In the depicted example, the HWD 102 is a binocular device having a form factor substantially similar to eyeglasses. In some embodiments, the HWD 102 includes optical and electronic components that together support display of information to a user on behalf of the computer system 100. For example, in some embodiments the HWD 102 includes a microdisplay or other display module that generates display light based on supplied image information. In various embodiments, the image information is generated by a graphics processing unit (GPU) (not shown) of the computer system 100 based on image information generated by one or more computer programs executing at, for example, one or more central processing units (CPUs, not shown). The display module provides the display light to one or more optical components, such as one or more lightguides, mirrors, beamsplitters, polarizers, combiners, lenses, and the like, that together direct the image light to a display area 105 of the HWD 102. In the depicted example, the HWD includes two lenses, each corresponding to a different eye of the user, with each lens having a corresponding display area 105 where the display light is directed. Further, in some embodiments the computer system 100 provides different display light to each display area 105, so that different information is provided to each eye of the user. This allows the computer system 100 to support different visual effects, such as stereoscopic effects.
As noted above, the wearable input device 104 is a ring, bracelet, watch, or other electronic device that has a wearable form factor. For purposes of description, it is assumed that the wearable input device 104 is a ring. The wearable input device 104 includes electronic components that together are configured to provide input information to the computer system 100 based on a user’s interaction with the wearable input device 104. For example, in some embodiments the wearable input device 104 includes one or more buttons, touchscreens, switches, joysticks, motion detectors, or other input components that can be manipulated by a user to provide input information. The wearable input device 104 further includes one or more wireless interfaces, such as a WiFi interface, a Bluetooth® interface, and the like, to communicate the input information to a processor (not shown) or other component of the computer system 100.
In some embodiments, the computer system 100 is generally configured to identify relative poses between the HWD 102 and the wearable input device 104. The computer system 100 can employ the relative poses to identify user input information from a user. For example, the computer system 100 can identify user input information based on the distance between the HWD 102 and the wearable input device 104, an angle of the wearable input device 104 relative to a plane associated with the HWD 102, a direction of a vector between a center of the wearable input device and the HWD 102, and the like, or any combination thereof. For example, in some embodiments, if the user holds the wearable input device 104 at a specified proximity to the HWD 102 and on a specified side (e.g., on a left side) of the HWD 102, the computer system 100 determines that the user is requesting initiation of a specified program (e.g., an email program). Alternatively, if the user holds the wearable input device 104 at the specified proximity to the HWD 102 and on a different specified side (e.g., on a right side) of the HWD 102, the computer system 100 determines that the user is requesting initiation of a different specified program (e.g., a chat program).
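The left/right mapping described above can be sketched as follows; the program names, the 0.3 m proximity threshold, and the sign convention (negative x meaning the left side of the HWD) are illustrative assumptions, not values from the disclosure.

```python
def input_from_relative_pose(x, y, z, near_threshold=0.3):
    """Map the wearable's position relative to the HWD (metres, in the
    HWD's frame) to a command. Returns None when the device is not held
    close enough to count as a gesture."""
    distance = (x * x + y * y + z * z) ** 0.5
    if distance > near_threshold:
        return None
    # Held close on the left vs. right side of the HWD:
    return "open_email" if x < 0 else "open_chat"

cmd = input_from_relative_pose(-0.1, 0.0, 0.2)
```

A real system would add hysteresis and dwell-time checks so that transient pose noise does not trigger spurious commands.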
To determine the relative pose between the HWD 102 and the wearable input device 104, the computer system 100 employs a fused combination of inertial data and AOA data. To generate the inertial data, the computer system 100 includes an inertial measurement unit (IMU) 108 mounted in a frame of the HWD 102. The IMU 108 is a module including one or more accelerometers, gyroscopes, and magnetometers, or any combination thereof that generate electronic signals indicating one or more of the specific force, angular rate, and orientation of the HWD 102, or any combination thereof. Based on these electronic signals, the IMU 108 indicates an inertial pose of the HWD 102. In some embodiments, the IMU 108 indicates the inertial pose in a 3- dimensional (3-D) rotational frame of reference (e.g., pitch, roll, and yaw) associated with the HWD 102. In other embodiments, the IMU 108 indicates the inertial pose in a 3D translational frame of reference (e.g., along x, y, and z axes of a cartesian framework) associated with the HWD 102. In still other embodiments, the IMU 108 indicates the inertial pose in both the rotational and translational frame of reference, thus indicating a 6 DoF pose for the HWD 102.
To generate AOA data, the computer system 100 includes an ultra-wideband (UWB) module 106 mounted, at least partially, at or on a frame of the HWD 102. The UWB module 106 is generally configured to employ UWB signals to determine a range, or distance between the HWD 102 and the wearable device 104, as well as an angle of arrival for signals communicated by the wearable device 104 and received at the UWB module 106. To illustrate, in some embodiments, the UWB module 106 and the wearable device 104 each include a UWB transceiver configured to send and receive UWB signals, wherein each UWB transceiver includes a plurality of antennas.
To determine the distance, or range, between the UWB module 106 and the wearable device 104, the UWB transceivers employ a handshake process by exchanging specified signals, and the UWB module determines a round trip time (RTT) based on the exchanged signals. For example, the UWB module 106 transmits a UWB signal, referred to as an anchor signal, to the transceiver of the wearable device 104, and records a time of transmission for the anchor signal. In response to receiving the anchor signal, the wearable device 104 waits for a specified amount of time, and then transmits a response UWB signal, referred to as a tag signal, to the UWB module 106. In response to receiving the tag signal, the UWB module determines the signal receive time and, based on the difference between the anchor signal transmit time and the tag signal receive time, determines the distance between the wearable device 104 and the HWD 102.
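The ranging arithmetic in the handshake above can be sketched as follows; the function and variable names are mine, since the patent does not specify an API.

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_rtt(anchor_tx_time, tag_rx_time, reply_delay):
    """Distance from a UWB anchor/tag exchange: the round-trip time,
    minus the tag's fixed reply delay, is twice the one-way time of
    flight, which converts to distance at the speed of light."""
    round_trip = tag_rx_time - anchor_tx_time
    time_of_flight = (round_trip - reply_delay) / 2.0
    return SPEED_OF_LIGHT * time_of_flight

# Example: a tag 3 m away replying after a fixed 1 ms delay.
tof = 3.0 / SPEED_OF_LIGHT
r = range_from_rtt(0.0, 2 * tof + 1e-3, 1e-3)
```

In practice the reply delay must be known very precisely: at UWB timescales, a 1 ns timing error already corresponds to roughly 15 cm of one-way range error.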
In addition, the UWB module 106 determines an AOA for the received tag signal. For example, in some embodiments, the UWB transceiver of the UWB module 106 includes a plurality of antennas, and each of the plurality of antennas receives the tag signal. The UWB module 106 identifies the phase differences between the received signals and identifies the AOA of the tag signal based on the identified phase differences. The UWB module 106 then identifies pose data, referred to as AOA data or AOA pose data, for the HWD 102 based on the AOA of the tag signal and the distance between the HWD 102 and the wearable device 104.
For example, in the depicted embodiment of FIG. 1 , the UWB module 106 includes a single antenna located at a temple location of the HWD 102, and therefore determines only the distance between the HWD 102 and the wearable device 104. The UWB module 106 uses the distance, r, to generate a relative pose for the HWD 102 in a translational frame of reference along a single axis. In other embodiments, described below with respect to FIGs. 4 and 5, the HWD 102 includes multiple HWD antennas at different locations, and the UWB module 106 therefore can determine an angle of arrival, or multiple angles of arrival relative to different planes, to determine the relative pose for the HWD in a translational frame along two or three different axes.
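As a hedged sketch of the phase-difference computation described above, the following shows how a two-antenna receiver can map a carrier phase difference to an angle of arrival; the antenna spacing d and wavelength lam are illustrative assumptions, not values from the patent.

```python
import math

def aoa_from_phase(delta_phase, d, lam):
    """The path-length difference between two antennas spaced d apart is
    d*sin(theta); expressed in carrier phase that is
    2*pi*d*sin(theta)/lam. Invert to recover theta (radians)."""
    s = delta_phase * lam / (2.0 * math.pi * d)
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)

# With half-wavelength antenna spacing (d = lam/2), a 90-degree phase
# difference corresponds to a 30-degree angle of arrival.
lam = 0.046  # ~6.5 GHz UWB carrier wavelength, metres (assumed)
theta = aoa_from_phase(math.pi / 2, lam / 2, lam)
```

Half-wavelength spacing is the common design choice because it keeps the phase difference unambiguous over the full ±90 degree field of view.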
In response to generating inertial data and AOA data, the computer system 100 can fuse the data together to generate fused pose data. An example is illustrated at FIG. 2 in accordance with some embodiments. In particular, FIG. 2 is a block diagram illustrating aspects of the computer system 100 in accordance with some embodiments. In the depicted example, the computer system 100 includes the UWB module 106, the IMU 108, and a data fuse module 218. The data fuse module 218 is a module generally configured to fuse IMU data and AOA data to determine fused pose information. Accordingly, the data fuse module 218 can be implemented in dedicated hardware of the computer system 100, by software executing at a processor (not shown at FIG. 2) of the computer system 100, and the like.
In operation, the UWB module 106 generates AOA data 215, representing 3-DoF poses of the HWD 102 in a translational frame of reference, while the IMU 108 generates IMU data 216, representing 6-DoF poses of the HWD in translational and rotational frames of reference. The data fuse module 218 is generally configured to fuse the AOA data 215 and the IMU data 216 to generate fused pose data 220, wherein the fused pose is a 6-DoF pose in the translational and rotational frames of reference.
In different embodiments, the data fuse module 218 fuses the data in different ways. For example, in some embodiments, the data fuse module 218 employs one or more stochastic estimation or approximation techniques, using the AOA data 215 and the IMU data 216 as inputs, to determine properties of a path or curve that represents the changing pose of the HWD 102 over time, relative to the wearable device 104. Thus, for example, in different embodiments the data fuse module 218 implements a Kalman filter, a particle filter, a weighted least square bundle adjustment, or other estimation technique to estimate the pose and thus fused pose data 220 of the HWD 102 based on the AOA data 215 and the IMU data 216.
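As a concrete illustration of the stochastic estimation options listed above, here is a minimal one-axis Kalman filter that predicts from IMU acceleration and corrects with an AOA-derived position. This is a sketch of the general technique, not the patent's implementation; the constant-velocity model and all noise parameters are made-up illustration values.

```python
import numpy as np

def kalman_fuse(aoa_positions, imu_accels, dt, q=1e-3, r_noise=0.05):
    """Fuse AOA position fixes with IMU acceleration on one axis.
    State is [position, velocity]; IMU acceleration drives the predict
    step, each AOA fix drives the update step."""
    x = np.zeros(2)                               # [position, velocity]
    P = np.eye(2)                                 # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])         # constant-velocity model
    B = np.array([0.5 * dt * dt, dt])             # acceleration input
    H = np.array([[1.0, 0.0]])                    # AOA measures position
    Q = q * np.eye(2)                             # process noise
    R = np.array([[r_noise ** 2]])                # AOA measurement noise
    estimates = []
    for z, a in zip(aoa_positions, imu_accels):
        x = F @ x + B * a                         # predict from IMU
        P = F @ P @ F.T + Q
        y = z - H @ x                             # AOA innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return estimates

# A stationary device observed through noisy AOA fixes around 1.0 m:
rng = np.random.default_rng(0)
z = 1.0 + 0.05 * rng.standard_normal(200)
est = kalman_fuse(z, np.zeros(200), dt=0.01)
```

The same structure extends to three axes (and, with a quaternion state, to rotation), which is how a 6-DoF variant of this filter would be organized.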
In other embodiments, the data fuse module 218 employs a machine learning model that has been trained to generate the fused pose data 220 based on the AOA data 215 and the IMU data 216. For example, in some embodiments, the data fuse module 218 implements a translational 3-DoF tracker using a convolutional neural engine that exploits temporal coherence of each data stream (i.e., the AOA data 215 and the IMU data 216), observed in each spatial dimension, with a 3-DoF output layer. The neural engine can be trained using pose data generated in a test environment to determine an initial set of weights, layers, and other factors that govern the behavior of the neural engine. Further, the neural engine can update the weights and other factors over time to further refine the estimation process for generating the fused pose data 220.
FIG. 3 is a block diagram illustrating an example of the data fuse module 218 in accordance with some embodiments. In the depicted example, the data fuse module 218 includes a coordinate transform module 322, a translational tracker module 324, a rotational tracker module 326, and a merger module 328. The coordinate transform module 322 is generally configured to transform the AOA data 215 into a set of x, y, and z coordinates in a translational frame of reference. For example, in some embodiments the AOA data 215 is generated as 1) a range r, representing the distance between the wearable device 104 and the HWD 102; 2) an angle θ, representing a horizontal angle of arrival of the tag signal; and 3) an angle φ, representing a vertical angle of arrival of the tag signal. The coordinate transform module transforms the AOA data to x, y, and z coordinates using the following formulas:
x = r cos(φ) cos(θ)
y = r cos(φ) sin(θ)
z = r sin(φ)
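One conventional way to realize this range-and-angle transform in code, assuming a standard spherical-coordinate mapping (an assumption; the disclosure's exact axis and sign conventions are not specified here):

```python
import math

def aoa_to_cartesian(r, theta, phi):
    """Convert range r, horizontal AOA theta, and vertical AOA phi
    (both in radians) to x, y, z coordinates in a translational
    frame of reference, using a standard spherical mapping."""
    x = r * math.cos(phi) * math.cos(theta)
    y = r * math.cos(phi) * math.sin(theta)
    z = r * math.sin(phi)
    return x, y, z
```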
The translational tracker 324 is a module configured to determine a translational pose of the HWD 102 in the translational frame of reference based on the AOA data 215, as transformed by the coordinate transform module 322, and the translational portion of the IMU data 216. In some embodiments, the translational tracker 324 generates the translational pose using one or more stochastic estimation techniques, such as a Kalman filter, a particle filter, a weighted least square bundle adjustment, and the like, or a combination thereof. In other embodiments, the translational tracker 324 employs a machine learning model, such as a convolutional neural engine that exploits the temporal coherence of the input data observed in each spatial dimension, with an output layer including one or more of 3-DoF translational coordinates or 3-DoF rotational coordinates. In some embodiments, the rotational portion of the IMU data 216 can be employed to determine rotation dynamics, and these rotation dynamics are employed to determine an error model for the translational pose identified by the coordinate transform module 322.
The rotational tracker 326 is a module configured to determine a rotational pose of the HWD 102 in a rotational frame of reference based on the IMU data 216. In some embodiments, the rotational tracker 326 generates the rotational pose using one or more stochastic estimation techniques, such as a Kalman filter, a particle filter, a weighted least square bundle adjustment, and the like, or a combination thereof. In some embodiments, the stochastic estimation techniques employed by the rotational tracker 326 are different from the stochastic estimation techniques employed by the translational tracker 324.
The merger module 328 is configured to merge the translational pose generated by the translational tracker 324 and the rotational pose generated by the rotational tracker 326. In some embodiments, the merger module 328 merges the poses by placing the poses in a data structure configured to store 6-DoF pose data, including translational (x, y, z) pose data and rotational (pitch, roll, and yaw) data.
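The 6-DoF data structure the merger module populates can be sketched as follows; the class and field names are illustrative, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """Illustrative container for fused 6-DoF pose data:
    translational (x, y, z) plus rotational (pitch, roll, yaw)."""
    x: float
    y: float
    z: float
    pitch: float
    roll: float
    yaw: float

def merge_poses(translational, rotational):
    """Merge an (x, y, z) tuple from the translational tracker with a
    (pitch, roll, yaw) tuple from the rotational tracker into one record."""
    return Pose6DoF(*translational, *rotational)
```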
As noted above, in some embodiments the AOA data 215 represents a pose in a translational frame of reference having multiple dimensions. To generate this pose data, it is useful to have UWB antennas at multiple disparate locations of the HWD, allowing generation of pose data based on the phase difference between the received tag signal at each antenna. Examples of computer systems supporting generation of multi-dimensional pose data are illustrated at FIGs. 4 and 5 in accordance with some embodiments. In particular, FIG. 4 illustrates a computer system 400 that includes an HWD 402 and the wearable device 104 in accordance with some embodiments. The HWD 402 is configured similarly to the HWD 102 of FIG. 1 , including the UWB module 106 and an IMU 408. However, the HWD 402 includes an additional UWB module 430, located at or near the center of the HWD 402, and between the two lenses of the HWD 402 (e.g., at or near a bridge section of the HWD 402 configured to be placed over the nose of the user).
In operation, computer system 400 generates AOA data by transmitting a UWB anchor signal from the UWB module 106 to the wearable device 104. In response, the wearable device 104 transmits a tag signal, as described above. The tag signal is received via antennas at each of the UWB modules 106 and 430. The UWB module 106 determines a phase difference between the received tag signals, and based on the phase difference, determines the horizontal angle of arrival for the tag signal, θ.
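The phase-difference relationship just described can be sketched with the standard far-field two-element model, θ = arcsin(Δφ · λ / (2π · d)). The function below is an illustrative assumption about the antenna geometry, not the UWB module's actual computation.

```python
import math

def phase_diff_to_aoa(delta_phi, antenna_spacing, wavelength):
    """Estimate the angle of arrival (radians) from the phase difference
    delta_phi (radians) between two antennas separated by antenna_spacing
    meters, for a carrier with the given wavelength in meters.
    Standard far-field two-element model; names are illustrative."""
    s = (delta_phi * wavelength) / (2.0 * math.pi * antenna_spacing)
    s = max(-1.0, min(1.0, s))  # clamp against noise before arcsin
    return math.asin(s)
```

With half-wavelength spacing, a phase difference of ±π maps to the ±90° limits of the unambiguous field of view, which is why closely spaced antenna pairs are typical.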
FIG. 5 illustrates a computer system 500 that includes an HWD 502 and the wearable device 104 in accordance with some embodiments. The HWD 502 is configured similarly to the HWD 102 of FIG. 1, including the UWB module 106 and an IMU 508. However, the HWD 502 includes an additional UWB module 530, similar to the UWB module 430 of FIG. 4, and located at or near the center of the HWD 502, between the two lenses of the HWD 502. In addition, the HWD 502 includes another UWB module 532, located below the UWB module 106, along a side of a lens of the HWD 502, opposite the UWB module 530. In operation, computer system 500 generates AOA data by transmitting a UWB anchor signal from the UWB module 106 to the wearable device 104. In response, the wearable device 104 transmits a tag signal, as described above. The tag signal is received via antennas at each of the UWB modules 106, 530, and 532. The UWB module 106 determines phase differences between the received tag signals, and based on the phase differences, determines the horizontal angle of arrival for the tag signal, θ, and the vertical angle of arrival for the tag signal, φ.
In some embodiments, the computer system 100 supports modification of augmented reality (AR) or virtual reality (VR) content based on the fused pose data 220. An example is illustrated at FIG. 6, which depicts a block diagram illustrating aspects of the computer system 100 in accordance with some embodiments. In the illustrated embodiment, the computer system 100 includes the data fuse module 218 and a processor 640 configured to execute AR/VR content 642. The processor 640 is a general purpose or application specific processor configured to execute sets of instructions (e.g., computer programs or applications) in order to carry out operations on behalf of the computer system 100. Accordingly, the computer system 100, in different embodiments, is implemented at any of a variety of devices, such as a desktop or laptop computer, a server, a tablet, a smartphone, a game console, and the like. In some embodiments, the computer system 100 includes additional modules and components, not illustrated at FIG. 6, to support execution of instructions and in particular execution of the AR/VR content 642. For example, in various embodiments the computer system 100 includes one or more additional processing units, such as one or more graphics processing units (GPUs), memory controllers, input/output controllers, network interfaces, memory modules, and the like.
In some embodiments, the computer system 100 implements the AR/VR content 642 by executing a corresponding set of instructions that, when executed at the processor 640, generates image frames for display at the HWD 102. In addition, the computer system 100 is configured to modify the AR/VR content 642, and the corresponding image frames, based on the fused pose data 220. In operation, as the user changes the relative pose between the HWD 102 and the wearable device 104, the AOA data 215 and the IMU data 216 change, causing the data fuse module 218 to generate new fused pose data 220. As the fused pose data 220 changes, the processor 640 modifies the AR/VR content 642, based on a corresponding set of instructions executing at the processor 640. The user thereby interacts with the AR/VR content 642. Thus, for example, as the user changes the relative pose between the HWD 102 and the wearable device 104, the AR/VR content 642 can be updated to allow the user to see different portions of a virtual or augmented environment, to interact with virtual objects in the virtual or augmented environment, to initiate or terminate execution of computer programs or applications, and the like.
In some embodiments, certain aspects of the techniques described above may be implemented by one or more processors of a processing system executing software. The software comprises one or more sets of executable instructions stored or otherwise tangibly embodied on a non-transitory computer readable storage medium. The software can include the instructions and certain data that, when executed by the one or more processors, manipulate the one or more processors to perform one or more aspects of the techniques described above. The non-transitory computer readable storage medium can include, for example, a magnetic or optical disk storage device, solid state storage devices such as Flash memory, a cache, random access memory (RAM) or other non-volatile memory device or devices, and the like. The executable instructions stored on the non-transitory computer readable storage medium may be in source code, assembly language code, object code, or other instruction format that is interpreted or otherwise executable by one or more processors.
A computer readable storage medium may include any storage medium, or combination of storage media, accessible by a computer system during use to provide instructions and/or data to the computer system. Such storage media can include, but is not limited to, optical media (e.g., compact disc (CD), digital versatile disc (DVD), Blu-Ray disc), magnetic media (e.g., floppy disc, magnetic tape, or magnetic hard drive), volatile memory (e.g., random access memory (RAM) or cache), non-volatile memory (e.g., read-only memory (ROM) or Flash memory), or microelectromechanical systems (MEMS)-based storage media. The computer readable storage medium may be embedded in the computing system (e.g., system RAM or ROM), fixedly attached to the computing system (e.g., a magnetic hard drive), removably attached to the computing system (e.g., an optical disc or Universal Serial Bus (USB)-based Flash memory) or coupled to the computer system via a wired or wireless network (e.g., network accessible storage (NAS)).
Note that not all of the activities or elements described above in the general description are required, that a portion of a specific activity or device may not be required, and that one or more further activities may be performed, or elements included, in addition to those described. Still further, the order in which activities are listed are not necessarily the order in which they are performed. Also, the concepts have been described with reference to specific embodiments. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present disclosure.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any feature(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature of any or all the claims. Moreover, the particular embodiments disclosed above are illustrative only, as the disclosed subject matter may be modified and practiced in different but equivalent manners apparent to those skilled in the art having the benefit of the teachings herein. No limitations are intended to the details of construction or design herein shown, other than as described in the claims below. It is therefore evident that the particular embodiments disclosed above may be altered or modified and all such variations are considered within the scope of the disclosed subject matter. Accordingly, the protection sought herein is as set forth in the claims below.

Claims

WHAT IS CLAIMED IS:
1. A method comprising: determining angle of arrival, AOA, data based on the angle of arrival of a received signal; and identifying a relative pose between a head-wearable display, HWD, and a wearable device based on the AOA data.
2. The method of claim 1, further comprising: receiving inertial data from an inertial measurement unit, IMU, of the HWD; and wherein identifying the relative pose comprises identifying the relative pose based on the inertial data.
3. The method of claim 2, further comprising: fusing the AOA data with the inertial data to generate fused pose data; and wherein identifying the relative pose comprises identifying the relative pose based on the fused pose data.
4. The method of claim 3, wherein fusing the AOA data with the inertial data comprises fusing the AOA data with the inertial data based on a stochastic estimation process.
5. The method of claim 4, wherein the stochastic estimation process comprises a Kalman filter.
6. The method of claim 4 or 5, wherein the stochastic estimation process comprises a particle filter.
7. The method of any one of claims 4 to 6, wherein the stochastic estimation process comprises a weighted least square bundle adjustment.

8. The method of any one of claims 3 to 7, wherein fusing the AOA data with the inertial data comprises fusing the AOA data with the inertial data based on a machine learning model.

9. The method of any one of the preceding claims, wherein determining the AOA data comprises determining the AOA data based on an ultra-wideband, UWB, signal.

10. The method of any one of the preceding claims, wherein the HWD includes a plurality of antennas and each of the plurality of antennas receives a signal from the wearable device, and wherein the AOA of the signal is identified based on phase differences between the received signals.

11. The method of any one of the preceding claims, wherein determining the AOA data further comprises determining a distance between the HWD and the wearable device.

12. The method of claim 11, wherein determining the distance between the HWD and the wearable device includes measuring a time difference between transmitting a first signal from the HWD to the wearable device and receiving, in response to the first signal, a second signal from the wearable device at the HWD.

13. The method of any one of the preceding claims, wherein an augmented reality (AR) or virtual reality (VR) content to be displayed to a user by the HWD is modified based on the identified pose.

14. A method, comprising: receiving angle of arrival, AOA, data and inertial data at a head-wearable display, HWD; and fusing the AOA data and the inertial data to determine a relative pose of the HWD and a wearable device.
15. The method of claim 14, further comprising: generating AOA data at the HWD by transmitting an ultra-wideband, UWB, signal from the HWD to the wearable device.
16. A computer system, comprising: a head wearable display, HWD, configured to determine angle of arrival, AOA, data based on an angle of arrival of a received signal; and at least one processor configured to identify a relative pose between the HWD and a wearable device based on the AOA data.
17. The computer system of claim 16, further comprising: an inertial measurement unit (IMU) to generate inertial data based on a pose of the HWD; and wherein the processor is to identify the relative pose based on the inertial data.
18. The computer system of claim 17, further comprising: a data fuse module configured to fuse the AOA data with the inertial data to generate fused pose data; and wherein the processor is to identify the relative pose based on the fused pose data.
19. The computer system of claim 18, wherein the data fuse module is configured to fuse the AOA data with the inertial data based on a stochastic estimation process.
20. The computer system of claim 19, wherein the stochastic estimation process comprises a Kalman filter.
21. The computer system of claim 19 or 20, wherein the stochastic estimation process comprises a particle filter.
22. The computer system of any one of claims 19 to 21, wherein the stochastic estimation process comprises a weighted least square bundle adjustment.

23. The computer system of any one of claims 18 to 22, wherein the data fuse module is configured to fuse the AOA data with the inertial data based on a machine learning model.

24. The computer system of any one of claims 16 to 23, wherein the HWD is configured to determine the AOA data based on transmission of an ultra-wideband, UWB, signal.

25. A head wearable display, HWD, comprising: at least one receiver configured to receive a signal from a wearable device; and at least one processor configured to determine angle of arrival, AOA, data based on an angle of arrival of a received signal and to identify a relative pose between the HWD and the wearable device based on the AOA data.
PCT/US2020/067438 2020-12-30 2020-12-30 Device tracking with angle-of-arrival data WO2022146424A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202080105193.0A CN116113911A (en) 2020-12-30 2020-12-30 Equipment Tracking Using Angle of Arrival Data
US18/035,970 US20230418369A1 (en) 2020-12-30 2020-12-30 Device tracking with integrated angle-of-arrival data
PCT/US2020/067438 WO2022146424A1 (en) 2020-12-30 2020-12-30 Device tracking with angle-of-arrival data
EP20848822.1A EP4185938A1 (en) 2020-12-30 2020-12-30 Device tracking with angle-of-arrival data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2020/067438 WO2022146424A1 (en) 2020-12-30 2020-12-30 Device tracking with angle-of-arrival data

Publications (1)

Publication Number Publication Date
WO2022146424A1 true WO2022146424A1 (en) 2022-07-07

Family

ID=74418524

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/067438 WO2022146424A1 (en) 2020-12-30 2020-12-30 Device tracking with angle-of-arrival data

Country Status (4)

Country Link
US (1) US20230418369A1 (en)
EP (1) EP4185938A1 (en)
CN (1) CN116113911A (en)
WO (1) WO2022146424A1 (en)


Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016017997A1 (en) * 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US10419655B2 (en) * 2015-04-27 2019-09-17 Snap-Aid Patents Ltd. Estimating and using relative head pose and camera field-of-view
US10078377B2 (en) * 2016-06-09 2018-09-18 Microsoft Technology Licensing, Llc Six DOF mixed reality input by fusing inertial handheld controller with hand tracking
US10295651B2 (en) * 2016-09-21 2019-05-21 Pinhas Ben-Tzvi Linear optical sensor arrays (LOSA) tracking system for active marker based 3D motion tracking
DK3545385T3 (en) * 2016-11-25 2021-10-04 Sensoryx AG PORTABLE MOTION TRACKING SYSTEM
US20210208232A1 (en) * 2018-05-18 2021-07-08 Ensco, Inc. Position and orientation tracking system, apparatus and method
CN112102498A (en) * 2019-06-18 2020-12-18 明日基金知识产权控股有限公司 System and method for virtually attaching applications to dynamic objects and enabling interaction with dynamic objects
US11948401B2 (en) * 2019-08-17 2024-04-02 Nightingale.ai Corp. AI-based physical function assessment system
CN115053204A (en) * 2020-01-31 2022-09-13 七哈格斯实验室公司 Low profile pointing device sensor fusion
JP7658375B2 (en) * 2020-02-06 2025-04-08 バルブ コーポレーション Position Tracking System for Head Mounted Display Systems
JP7012806B1 (en) * 2020-11-26 2022-01-28 三菱電機株式会社 Driver posture measuring device and vehicle control device
US11892550B2 (en) * 2020-12-22 2024-02-06 Samsung Electronics Co., Ltd. Three-dimensional angle of arrival capability in electronic devices

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
US20200334393A1 (en) * 2017-02-22 2020-10-22 Middle Chart, LLC Method and Apparatus for Automated Site Augmentation
US20200364381A1 (en) * 2017-02-22 2020-11-19 Middle Chart, LLC Cold storage environmental control and product tracking
WO2020214708A1 (en) * 2019-04-17 2020-10-22 Prestacom Services Llc Finding a target device using augmented reality

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN116321418A (en) * 2023-03-02 2023-06-23 中国人民解放军国防科技大学 A Fusion Estimation and Positioning Method for Swarm UAV Based on Node Configuration Optimization
CN116321418B (en) * 2023-03-02 2024-01-02 中国人民解放军国防科技大学 Cluster unmanned aerial vehicle fusion estimation positioning method based on node configuration optimization
CN118732854A (en) * 2024-07-12 2024-10-01 方田医创(成都)科技有限公司 Mixed reality display system, display method and storage medium

Also Published As

Publication number Publication date
EP4185938A1 (en) 2023-05-31
CN116113911A (en) 2023-05-12
US20230418369A1 (en) 2023-12-28

Similar Documents

Publication Publication Date Title
EP3000011B1 (en) Body-locked placement of augmented reality objects
US12141341B2 (en) Systems and methods for tracking a controller
EP3486707B1 (en) Perception based predictive tracking for head mounted displays
KR102374251B1 (en) Environmental interrupt in a head-mounted display and utilization of non field of view real estate
EP3469458B1 (en) Six dof mixed reality input by fusing inertial handheld controller with hand tracking
EP3859495B1 (en) Systems and methods for tracking motion and gesture of heads and eyes
WO2021118745A1 (en) Content stabilization for head-mounted displays
US20180335834A1 (en) Tracking torso orientation to generate inputs for computer systems
CN109117684A (en) System and method for the selective scanning in binocular augmented reality equipment
WO2018106390A1 (en) Ocular video stabilization
CN116324580A (en) Geometric modeling of eye-wear devices with flexible frames
KR20160021126A (en) Shared and private holographic objects
US11762623B2 (en) Registration of local content between first and second augmented reality viewers
JP6986003B2 (en) Tracking system and its tracking method
JP2022516220A (en) User interaction on a head-mounted display with optotype tracking
CN109478096A (en) Display communication
US20230418369A1 (en) Device tracking with integrated angle-of-arrival data
KR102745328B1 (en) Map-aided inertial odometry with neural network for augmented reality devices
EP4222707A1 (en) Pose tracking for rolling shutter camera
JP2017161645A (en) Display control method, communication device, display control program, and display control device
CN118318219A (en) Augmented reality display with eye image stabilization
US20210201011A1 (en) Data processing method for multi-sensor fusion, positioning apparatus and virtual reality device
US20180182124A1 (en) Estimation system, estimation method, and estimation program
WO2019221763A1 (en) Position and orientation tracking system, apparatus and method
US12154219B2 (en) Method and system for video transformation for video see-through augmented reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20848822

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202347012359

Country of ref document: IN

ENP Entry into the national phase

Ref document number: 2020848822

Country of ref document: EP

Effective date: 20230224

WWE Wipo information: entry into national phase

Ref document number: 18035970

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE
