US10409057B2 - Systems, devices, and methods for laser eye tracking in wearable heads-up displays - Google Patents
- Publication number
- US10409057B2 (U.S. application Ser. No. 15/827,667)
- Authority
- US
- United States
- Prior art keywords
- eye
- user
- scan mirror
- processor
- laser light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Links
- 238000000034 method Methods 0.000 title claims abstract description 68
- 230000003595 spectral effect Effects 0.000 claims abstract description 7
- 230000003287 optical effect Effects 0.000 claims description 61
- 230000009466 transformation Effects 0.000 claims description 26
- 210000001747 pupil Anatomy 0.000 claims description 25
- 238000013507 mapping Methods 0.000 claims description 11
- 238000010408 sweeping Methods 0.000 claims description 10
- 238000013528 artificial neural network Methods 0.000 claims description 5
- 238000005070 sampling Methods 0.000 claims description 3
- 239000000463 material Substances 0.000 description 14
- 239000010410 layer Substances 0.000 description 11
- 238000010586 diagram Methods 0.000 description 9
- 238000001514 detection method Methods 0.000 description 8
- 210000004087 cornea Anatomy 0.000 description 6
- 230000008878 coupling Effects 0.000 description 6
- 238000010168 coupling process Methods 0.000 description 6
- 238000005859 coupling reaction Methods 0.000 description 6
- 230000000694 effects Effects 0.000 description 6
- 210000003128 head Anatomy 0.000 description 6
- 230000033001 locomotion Effects 0.000 description 6
- 238000013461 design Methods 0.000 description 5
- 230000037361 pathway Effects 0.000 description 5
- 230000008569 process Effects 0.000 description 5
- 238000012545 processing Methods 0.000 description 5
- 238000005286 illumination Methods 0.000 description 3
- 210000001210 retinal vessel Anatomy 0.000 description 3
- 229910052709 silver Inorganic materials 0.000 description 3
- 239000004332 silver Substances 0.000 description 3
- silver halide compound Chemical class 0.000 description 3
- 230000001960 triggered effect Effects 0.000 description 3
- 230000006870 function Effects 0.000 description 2
- 230000010354 integration Effects 0.000 description 2
- 210000000554 iris Anatomy 0.000 description 2
- 239000002356 single layer Substances 0.000 description 2
- 230000006978 adaptation Effects 0.000 description 1
- 238000007792 addition Methods 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 238000007796 conventional method Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 238000002329 infrared spectrum Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 239000013307 optical fiber Substances 0.000 description 1
- 230000010076 replication Effects 0.000 description 1
- 230000000630 rising effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 238000001228 spectrum Methods 0.000 description 1
- 238000001429 visible spectrum Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/08—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
- G02B26/10—Scanning systems
- G02B26/101—Scanning systems with both horizontal and vertical deflecting means, e.g. raster or XY scanners
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B27/0103—Head-up displays characterised by optical features comprising holographic elements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0176—Head mounted characterised by mechanical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/32—Holograms used as optical elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/383—Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3102—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] using two-dimensional electronic spatial light modulators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3129—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/315—Modulator illumination systems
- H04N9/3161—Modulator illumination systems using laser light sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0123—Head-up displays characterised by optical features comprising devices increasing the field of view
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0149—Head-up displays characterised by mechanical features
- G02B2027/015—Head-up displays characterised by mechanical features involving arrangement aiming to get less bulky devices
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
- G02B2027/0174—Head mounted characterised by optical features holographic
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/08—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
- G02B26/0816—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
- G02B26/0833—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements the reflecting element being a micromechanical device, e.g. a MEMS mirror, DMD
Definitions
- the present systems, devices, and methods generally relate to scanning laser-based eye tracking technologies and particularly relate to integrating eye tracking functionality into a scanning laser projector-based wearable heads-up display.
- a head-mounted display is an electronic device that is worn on a user's head and, when so worn, secures at least one electronic display within a viewable field of at least one of the user's eyes, regardless of the position or orientation of the user's head.
- a wearable heads-up display is a head-mounted display that enables the user to see displayed content but also does not prevent the user from being able to see their external environment.
- the “display” component of a wearable heads-up display is either transparent or at a periphery of the user's field of view so that it does not completely block the user from being able to see their external environment. Examples of wearable heads-up displays include: the Google Glass®, the Optinvent Ora®, the Epson Moverio®, and the Sony Glasstron®, just to name a few.
- a challenge in the design of wearable heads-up displays is to minimize the bulk of the face-worn apparatus while still providing displayed content with sufficient visual quality.
- Eye tracking is a process by which the position, orientation, and/or motion of the eye may be measured, detected, sensed, determined (collectively, “measured”), and/or monitored. In many applications, this is done with a view towards determining the gaze direction of a user.
- the position, orientation, and/or motion of the eye may be measured in a variety of different ways, the least invasive of which typically employ one or more optical sensor(s) (e.g., cameras) to optically track the eye.
- Common techniques involve illuminating or flooding the entire eye, all at once, with infrared light and measuring reflections with at least one optical sensor that is tuned to be sensitive to the infrared light. Information about how the infrared light is reflected from the eye is analyzed to determine the position(s), orientation(s), and/or motion(s) of one or more eye feature(s) such as the cornea, pupil, iris, and/or retinal blood vessels.
- Eye tracking functionality is highly advantageous in applications of wearable heads-up displays.
- Some examples of the utility of eye tracking in wearable heads-up displays include: influencing where content is displayed in the user's field of view, conserving power by not displaying content that is outside of the user's field of view, influencing what content is displayed to the user, determining where the user is looking or gazing, determining whether the user is looking at displayed content on the display or through the display at their external environment, and providing a means through which the user may control/interact with displayed content.
- incorporating eye tracking functionality in a wearable heads-up display conventionally adds unwanted bulk to the system.
- Eye tracking systems available today generally implement multiple dedicated components with very stringent positioning requirements which undesirably increase the overall size and form factor of the system when incorporated into a wearable heads-up display. There is a need in the art for systems, devices, and methods of eye tracking that can integrate into wearable heads-up displays with minimal effect on the size and form factor of the system.
- a method of determining a gaze direction of an eye of a user may be summarized as including: generating an infrared laser light by an infrared laser diode; scanning the infrared laser light over the eye of the user by at least one scan mirror, wherein scanning the infrared laser light over the eye of the user by the at least one scan mirror includes sweeping the at least one scan mirror through a range of orientations and, for a plurality of orientations of the at least one scan mirror, reflecting the infrared laser light to a respective region of the eye of the user, for example along an optical path that extends between the scan mirror and the eye of the user; detecting reflections of the infrared laser light from the eye of the user by at least one infrared photodetector; determining a respective intensity of a plurality of detected reflections of the infrared laser light by at least one processor communicatively coupled to the at least one infrared photodetector; identifying, by the processor, at least one detected reflection for which the intensity exceeds a threshold value; determining, by the processor, an orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value; and determining, by the processor, a region in a field of view of the eye of the user at which a gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value.
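The sequence of acts summarized above can be sketched in code. This is a hypothetical reconstruction for illustration only; the function names (`read_photodetector`, `mirror_to_gaze`) and the threshold value are assumptions, not part of the patent:

```python
def determine_gaze(orientations, read_photodetector, mirror_to_gaze, threshold):
    """Sweep the scan mirror through its range of orientations, sample the
    infrared photodetector at each orientation, and map the first orientation
    whose detected reflection intensity exceeds the threshold (the corneal
    glint) to a region in the user's field of view."""
    for orientation in orientations:
        intensity = read_photodetector(orientation)  # detected reflection
        if intensity > threshold:                    # glint candidate
            return mirror_to_gaze(orientation)       # mirror-to-gaze mapping
    return None  # no glint detected during this sweep


# Illustrative use with synthetic photodetector readings keyed by orientation:
readings = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.9, (1, 1): 0.3}
gaze = determine_gaze(sorted(readings), readings.get, lambda o: o, 0.5)
```

The sweep-and-threshold structure mirrors acts 101 through 107 of method 100 described below; the mapping function stands in for the transformations discussed in the later claims.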
- Scanning the infrared laser light over the eye of the user by at least one scan mirror may include scanning, by the at least one scan mirror, the infrared laser light over an area of a holographic optical element positioned in the field of view of the eye of the user and redirecting the infrared laser light towards the eye of the user by the holographic optical element.
- Redirecting the infrared laser light towards the eye of the user by the holographic optical element may include converging the infrared laser light to an exit pupil at the eye of the user by the holographic optical element, where the exit pupil encompasses at least the cornea of the eye of the user.
- Scanning the infrared laser light over the eye of the user by at least one scan mirror may include scanning the infrared laser light across a first dimension of the eye of the user by a first scan mirror and scanning the infrared laser light across a second dimension of the eye of the user by a second scan mirror.
- determining, by the processor, the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value may include determining, by the processor, the combination of the first orientation of the first scan mirror and the second orientation of the second scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value; and determining, by the processor, a region in a field of view of the eye of the user
- Identifying, by the processor, at least one detected reflection for which the intensity exceeds a threshold value may include detecting, by the infrared photodetector, a spectral reflection of the infrared laser light from the eye of the user.
- Identifying, by the processor, at least one detected reflection for which the intensity exceeds a threshold value may include sampling, by the processor, a signal from the infrared photodetector and identifying, by the processor, a first sample for which the magnitude exceeds a threshold magnitude.
- identifying, by the processor, at least one detected reflection for which the intensity exceeds a threshold value may further include identifying, by the processor, a second sample for which the magnitude does not exceed the threshold magnitude.
- Determining, by the processor, a region in a field of view of the eye of the user at which a gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value may include effecting, by the processor, a mapping between the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value and the field of view of the eye of the user.
- Effecting, by the processor, a mapping between the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value and the field of view of the eye of the user may include performing, by the processor, at least one transformation selected from a group consisting of: a linear transformation between a set of scan mirror orientations and a set of gaze directions of the eye of the user, a geometric transformation between a set of scan mirror orientations and a set of gaze directions of the eye of the user, an affine transformation between a set of the scan mirror orientations and a set of gaze directions of the eye of the user, and a neural network transformation between a set of scan mirror orientations and a set of gaze directions of the eye of the user.
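Of the transformations listed above, an affine mapping between scan-mirror orientations and gaze directions can be fitted from calibration pairs by least squares. This is a minimal sketch under assumed conventions (2-D orientations and gaze coordinates; the calibration data are invented for illustration):

```python
import numpy as np

def fit_affine(orientations, gazes):
    """Fit an affine transformation from scan-mirror orientations (N, 2)
    to gaze directions (N, 2) and return a function that applies it."""
    A = np.hstack([orientations, np.ones((len(orientations), 1))])  # homogeneous
    M, *_ = np.linalg.lstsq(A, gazes, rcond=None)  # (3, 2) affine matrix
    return lambda o: np.asarray(o) @ M[:2] + M[2]

# Invented calibration data: four mirror orientations and the gaze
# directions observed (or assumed) for each during calibration.
orientations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
gazes = orientations @ np.array([[2.0, 0.0], [0.0, 3.0]]) + np.array([1.0, -1.0])
mirror_to_gaze = fit_affine(orientations, gazes)
```

A linear transformation is the special case with no offset term; geometric or neural-network transformations would replace the least-squares fit with a more expressive model.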
- the infrared laser diode and the at least one scan mirror may be components of a scanning laser projector, the scanning laser projector may further include at least one additional laser diode to generate visible laser light, and the method may further include projecting visible display content in the field of view of the eye of the user by the scanning laser projector.
- determining, by the processor, a region in a field of view of the eye of the user at which a gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value may include determining, by the processor, a region of the visible display content at which the gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value.
- Determining, by the processor, a region of the visible display content at which the gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value may include performing, by the processor, at least one transformation selected from a group consisting of: a linear transformation between a set of scan mirror orientations and a set of regions of the visible display content, a geometric transformation between a set of scan mirror orientations and a set of regions of the visible display content, an affine transformation between a set of the scan mirror orientations and a set of regions of the visible display content, and a neural network transformation between a set of scan mirror orientations and a set of regions of the visible display content.
- the at least one infrared photodetector may be positioned at a first position at a periphery of the field of view of the eye of the user when the eye is gazing straight ahead, and projecting visible display content in the field of view of the eye of the user by the scanning laser projector may include positioning, by the scanning laser projector, the visible display content away-from-center in the field of view of the eye of the user and towards the position of the at least one infrared photodetector at the periphery of the field of view of the eye of the user.
- a wearable heads-up display may be summarized as including: a support frame that in use is worn on a head of a user; a scanning laser projector carried by the support frame, the scanning laser projector including: an infrared laser diode; at least one visible light laser diode; and at least one scan mirror; an infrared photodetector carried by the support frame; a processor carried by the support frame, the processor communicatively coupled to the scanning laser projector and the at least one infrared photodetector; and a non-transitory processor-readable storage medium communicatively coupled to the processor, wherein the non-transitory processor-readable storage medium stores data and/or processor-executable instructions that, when executed by the processor, cause the wearable heads-up display to: generate an infrared laser light by the infrared laser diode; scan the infrared laser light over the eye of the user by the at least one scan mirror, wherein scanning the infrared laser light over the eye of the user by the at least one scan mirror includes sweeping the at least one scan mirror through a range of orientations and, for a plurality of orientations of the at least one scan mirror, reflecting the infrared laser light to a respective region of the eye of the user; detect reflections of the infrared laser light from the eye of the user by the at least one infrared photodetector; determine a respective intensity of a plurality of detected reflections of the infrared laser light by the processor; identify, by the processor, at least one detected reflection for which the intensity exceeds a threshold value; determine, by the processor, an orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value; and determine, by the processor, a region in a field of view of the eye of the user at which a gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value.
- the wearable heads-up display may further include: a wavelength-multiplexed holographic optical element carried by the support frame and positioned within a field of view of an eye of the user when the support frame is worn on the head of the user, the wavelength-multiplexed holographic optical element aligned to receive both the infrared light and the visible light from the scanning laser projector and to redirect both the infrared light and the visible light towards the eye of the user when the support frame is worn on the head of the user, wherein the wavelength-multiplexed holographic optical element includes a first hologram that is responsive to the visible light and unresponsive to the infrared light and a second hologram that is responsive to the infrared light and unresponsive to the visible light, and wherein the wavelength-multiplexed holographic optical element is substantially transparent to environmental light.
- the first hologram that is responsive to the visible light may converge the visible light to a first exit pupil at the eye of the user and the second hologram that is responsive to the infrared light may converge the infrared light to a second exit pupil at the eye of the user, the first exit pupil contained within the second exit pupil at the eye of the user.
- the non-transitory processor-readable storage medium may further store data and/or processor-executable instructions that, when executed by the processor, cause the wearable heads-up display to project visible display content in the field of view of the eye of the user by the scanning laser projector.
- the data and/or processor-executable instructions that, when executed by the processor, cause the wearable heads-up display to determine, by the processor, a region in a field of view of the eye of the user at which a gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value may cause the wearable heads-up display to determine, by the processor, a region of the visible display content at which the gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value.
- the at least one infrared photodetector may be positioned on the support frame at a periphery of the field of view of the eye of the user when the eye is gazing straight ahead, and the data and/or processor-executable instructions that, when executed by the processor, cause the WHUD to project visible display content in the field of view of the eye of the user by the scanning laser projector, may cause the scanning laser projector to position the visible display content away-from-center in the field of view of the eye of the user and towards the position of the at least one infrared photodetector at the periphery of the field of view of the eye of the user.
- the support frame may have a general shape and appearance of a pair of eyeglasses.
- the at least one visible light laser diode in the scanning laser projector may include at least one visible light laser diode selected from a group consisting of: a red laser diode, a green laser diode, a blue laser diode, and any combination of a red laser diode, a green laser diode, and/or a blue laser diode.
- FIG. 1 is a flow-diagram showing a method of determining a gaze direction of an eye of a user in accordance with the present systems, devices, and methods.
- FIG. 2 is an illustrative diagram showing a side view of a wearable heads-up display that is operative to perform the method of FIG. 1 in accordance with the present systems, devices, and methods.
- FIG. 3 is an illustrative diagram showing a side view of a wearable heads-up display that includes a multiplexed holographic optical element and is operative to perform the method of FIG. 1 in accordance with the present systems, devices, and methods.
- FIG. 4 is a perspective view of a wearable heads-up display that is operative to perform the method of FIG. 1 in accordance with the present systems, devices, and methods.
- the various embodiments described herein provide systems, devices, and methods for laser eye tracking in wearable heads-up displays. More specifically, the various embodiments described herein provide methods of determining the gaze direction of an eye of a user and are particularly well-suited for use in wearable heads-up displays (“WHUDs”) that employ scanning laser projectors (“SLPs”). Examples of WHUD systems, devices, and methods that are particularly well-suited for use in conjunction with the present systems, devices, and methods for laser eye tracking are described in, for example, U.S. Non-Provisional patent application Ser. No. 15/167,458, U.S. Non-Provisional patent application Ser. No. 15/167,472, and U.S. Non-Provisional patent application Ser. No. 15/167,484.
- FIG. 1 is a flow-diagram showing a scanning laser-based method 100 of determining a gaze direction of an eye of a user in accordance with the present systems, devices, and methods.
- Method 100 includes seven acts 101, 102, 103, 104, 105, 106, and 107, though those of skill in the art will appreciate that in alternative embodiments certain acts may be omitted and/or additional acts may be added. Those of skill in the art will also appreciate that the illustrated order of the acts is shown for exemplary purposes only and may change in alternative embodiments.
- the term “user” refers to a person that is operating and/or wearing the hardware elements described in acts 101 - 107 (e.g., a person that is wearing a wearable heads-up display, as described in more detail later on).
- an infrared laser diode generates infrared laser light.
- the infrared laser diode may activate and remain active in order to continuously generate a continuous beam of infrared laser light, or the infrared laser diode may be modulated to generate a sequence or pattern of infrared laser light.
- the term “infrared” includes “near infrared” and generally refers to a wavelength of light that is larger than the largest wavelength of light that is typically visible to the average human eye.
- Light that is visible to the average human eye (i.e., "visible light" herein) is generally in the range of 400 nm-700 nm, so as used herein the term "infrared" refers to a wavelength that is greater than 700 nm, up to 1 mm. As used herein and in the claims, visible means that the light includes wavelengths within the human visible portion of the electromagnetic spectrum, typically from approximately 400 nm (violet) to approximately 700 nm (red).
- At 102 at least one scan mirror scans the infrared laser light over the eye of the user.
- the at least one scan mirror may scan the infrared laser light over (e.g., completely illuminate) a substantially continuous surface of the eye or the at least one scan mirror may scan the infrared laser light to form an illumination pattern on the surface of the eye (such as a grid pattern, a crosshairs pattern, and so on).
- the at least one scan mirror may sweep through a range of orientations and, for a plurality of orientations of the at least one scan mirror (i.e., for each respective orientation of the at least one scan mirror if the infrared laser diode is continuously active in order to completely illuminate the corresponding surface of the eye, or for a subset of orientations of the at least one scan mirror if the infrared laser diode is modulated such that the combination of subsets of orientations of the at least one scan mirror and the modulation pattern of the infrared laser diode produces an illumination pattern on the corresponding surface of the eye), the at least one scan mirror may receive the infrared laser light from the infrared laser diode and reflect the infrared laser light to a respective region of the eye of the user.
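The sweep described above, in which a modulated infrared laser diode fires at only a subset of mirror orientations to produce an illumination pattern such as a grid, can be sketched with a simple generator. The function name, the index-based orientations, and the grid predicate are all illustrative assumptions:

```python
def sweep_orientations(nx, ny, modulation=None):
    """Generate (theta_x, theta_y) index pairs for a raster sweep of the
    scan mirror(s). If `modulation` is given, it is a predicate deciding
    whether the infrared diode fires at that orientation, so the emitted
    subset traces an illumination pattern on the corresponding surface of
    the eye rather than completely illuminating it."""
    for iy in range(ny):
        for ix in range(nx):
            if modulation is None or modulation(ix, iy):
                yield ix, iy

# Example modulation pattern: fire only on every fourth row or column of
# the sweep, yielding a grid-like illumination pattern.
grid = lambda ix, iy: ix % 4 == 0 or iy % 4 == 0
```

With the diode continuously active (no modulation) every orientation illuminates the eye; with the grid predicate only the grid lines are lit, reducing the number of emitted spots per sweep.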
- the at least one scan mirror may include one or multiple (e.g., in a DLP configuration) digital microelectromechanical systems (“MEMS”) mirror(s) or one or multiple piezoelectric mirrors.
- the at least one scan mirror may scan infrared laser light directly over at least a portion of the eye of the user. That is, infrared light may travel directly from the at least one scan mirror to the eye of the user without being redirected along the way by any intervening optics.
- the at least one scan mirror may indirectly scan infrared laser light over at least a portion of the eye of the user by scanning the infrared laser light over an area, or through a volume, of a light-redirection element (such as a holographic optical element (“HOE”) comprising at least one hologram, a diffraction grating, a mirror, a partial mirror, and/or a waveguide structure) positioned in the field of view of the eye of the user and the light-redirection element may redirect the infrared laser light towards the eye of the user.
- infrared light may travel from the at least one scan mirror to any number of intervening optics (e.g., HOEs, waveguides, etc.) and ultimately arrive at the eye of the user after any number of further redirections by the intervening optics.
- the light-redirection element (e.g., the HOE or waveguide) may converge the redirected infrared laser light to an exit pupil at the eye of the user. The exit pupil may encompass, for example, at least the cornea of the eye of the user (when the user is looking in a specific direction, such as straight ahead or straight towards display content displayed by a WHUD).
- the exit pupil may encompass only the pupil of the eye of the user, or only a region of the eye of the user where the “glint” is expected to occur (i.e., an area less than the cornea of the eye of the user).
- reflections of the infrared laser light from the eye of the user are detected by at least one infrared sensor, such as an infrared detector or, more specifically, an infrared photodetector.
- the at least one infrared sensor may be communicatively coupled to a processor (e.g., a digital processor, or an application-specific integrated circuit) and provide an output signal having a magnitude that depends on an intensity of the infrared laser light detected by the infrared sensor.
- At 104, at least one processor communicatively coupled to the at least one infrared sensor determines a respective intensity of a plurality of the reflections of the infrared laser light detected by the infrared sensor (i.e., "detected reflections") at 103.
- the percentage of detected reflections for which the processor determines an intensity may depend on, for example, the sampling rate of the processor.
- the “intensity” of a detected reflection may be a measure of, for example, the brightness of the detected reflection, the luminance of the detected reflection, and/or the power of the detected reflection.
- the processor identifies at least one detected reflection for which the intensity exceeds a threshold value.
- the at least one infrared sensor may be oriented to detect both specular and diffuse reflections of the infrared laser light from the eye of the user; however, in some implementations the processor may specifically identify, at 105, a detected reflection for which the intensity exceeds a threshold value only when the infrared sensor detects, at 103, a specular reflection of the infrared laser light from the eye of the user.
- a specular reflection may, for example, correspond to the corneal reflection, first Purkinje image, or "glint."
- the processor may sample the signal output by the at least one infrared sensor, where the magnitude of the signal (and therefore the magnitude of each sample) depends on the intensity of the infrared laser light detected by the at least one infrared sensor.
- the processor may identify at least one detected reflection for which the intensity exceeds a threshold value by identifying a first sample (in a series of samples) for which the magnitude exceeds a threshold magnitude.
- identifying, by the processor, at least one detected reflection for which the intensity exceeds a threshold value may be an edge-triggered (e.g., rising edge-triggered) process.
- the processor may then continue to identify that subsequent detected reflections each have intensities that do exceed the threshold until the processor identifies a second sample in the series for which the magnitude does not exceed the threshold magnitude (e.g., a falling edge-triggered process).
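The rising/falling edge-triggered identification described in the preceding passages can be sketched as follows; the function name, sample values, and threshold are illustrative assumptions:

```python
def find_glints(samples, threshold):
    """Identify glint windows in a series of photodetector samples.

    Rising-edge triggered: a glint starts at the first sample whose magnitude
    exceeds `threshold` and ends (falling edge) at the first subsequent sample
    that does not. Returns (start, end) index pairs, end exclusive.
    """
    glints = []
    start = None
    for i, s in enumerate(samples):
        if s > threshold and start is None:
            start = i                      # rising edge: glint begins
        elif s <= threshold and start is not None:
            glints.append((start, i))      # falling edge: glint ends
            start = None
    if start is not None:                  # glint still active at end of sweep
        glints.append((start, len(samples)))
    return glints
```

Each returned window corresponds to a run of consecutive mirror orientations whose detected reflections all exceed the threshold.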
- the processor determines the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value. In other words, the processor determines which orientation of the at least one scan mirror (from 102 ) caused the infrared laser light to reflect from the eye of the user (as detected at 103 ) with an intensity that exceeds the threshold value (as determined at 104 and 105 ).
- the processor determines a region in a field of view of the eye of the user at which a gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value (as determined at 106 ). Generally, this may include effecting, by the processor, a mapping between the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value and the field of view of the eye of the user.
- the processor may essentially effect a mapping between “detected reflection space” and “mirror orientation space” which, since only detected reflections that exceed the threshold value are of interest and since detected reflections that exceed the threshold value may generally be “glints,” may be interpreted as a mapping between “glint space” and “mirror orientation space.”
- the processor may essentially effect a mapping between “mirror orientation space” and gaze direction of the eye based on established correlations between various mirror orientations and where the corresponding infrared laser light would appear in the user's field of view (e.g., if redirected by a light-redirection element such as an HOE positioned in the user's field of view) if the infrared laser light was visible to the user.
- acts 103 - 107 may essentially effect a mapping between “glint space” and “gaze direction space.”
- the processor may, at 107 , effect a mapping between the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value (e.g., “glint space”) and the field of view of the eye of the user (e.g., “field of view space”) by performing at least one transformation between a set of scan mirror orientations and a set of gaze directions of the eye of the user, such as a linear transformation, a geometric transformation, an affine transformation, and/or a neural network-based transformation.
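Of the transformation types listed, an affine transformation is the simplest to illustrate. The sketch below is illustrative only (the function names and the three-point calibration procedure are assumptions): it fits gaze = A·orientation + b from three hypothetical calibration pairs, such as might be gathered while the user fixates three known targets.

```python
def fit_affine(orients, gazes):
    """Fit gaze = A @ orient + b from exactly three non-collinear calibration
    pairs, solving the 3x3 linear system per output coordinate (Cramer's rule).
    """
    (x1, y1), (x2, y2), (x3, y3) = orients
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)

    def solve(u1, u2, u3):
        # Coefficients (a, b, c) such that a*x + b*y + c = u at all three points.
        a = (u1 * (y2 - y3) - u2 * (y1 - y3) + u3 * (y1 - y2)) / det
        b = (x1 * (u2 - u3) - x2 * (u1 - u3) + x3 * (u1 - u2)) / det
        c = (x1 * (y2 * u3 - u2 * y3) - y1 * (x2 * u3 - u2 * x3)
             + u1 * (x2 * y3 - y2 * x3)) / det
        return (a, b, c)

    gx = solve(*(g[0] for g in gazes))
    gy = solve(*(g[1] for g in gazes))
    return (gx, gy)

def apply_affine(coeffs, orient):
    """Map a mirror orientation (h, v) to a gaze direction via the fitted map."""
    (gx, gy), (h, v) = coeffs, orient
    return (gx[0] * h + gx[1] * v + gx[2], gy[0] * h + gy[1] * v + gy[2])
```

A neural-network-based transformation, also contemplated above, would replace the closed-form fit with a learned regressor over many calibration samples.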
- the at least one scan mirror may include a single scan mirror that is controllably orientable about two orthogonal axes or two scan mirrors that are each respectively controllable about a respective axis, with the respective axes about which the two scan mirrors are controllably orientable being orthogonal to one another.
- a single scan mirror may scan the infrared laser light over two dimensions of the user's eye, or a first scan mirror may scan the infrared laser light across a first dimension of the eye and a second scan mirror may scan the infrared laser light across a second dimension of the eye.
- the “at least one scan mirror” was said to “sweep through a range of orientations.” In the case of two orthogonal scan mirrors, this may mean that a first scan mirror sweeps through a first range of orientations and, for each respective orientation of the first scan mirror, a second scan mirror sweeps through a second range of orientations.
- Where method 100 states that "the at least one scan mirror receives the infrared laser light from the infrared laser diode and reflects the infrared laser light to (either directly or indirectly via, e.g., an HOE or waveguide) a respective region of the eye of the user," with two orthogonal scan mirrors the infrared laser light is reflected to a respective region of the eye of the user for each respective combination of a first orientation of the first scan mirror and a second orientation of the second scan mirror.
- the processor may determine, at 106 , the combination of the first orientation of the first scan mirror and the second orientation of the second scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value and the processor may, at 107 , determine the region in the field of view of the eye of the user at which the gaze of the eye is directed based on the combination of the first orientation of the first scan mirror and the second orientation of the second scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value.
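The two-mirror determination described above amounts to a nested sweep, which can be sketched as follows (illustrative only; `read_intensity` is a hypothetical stand-in for sampling the infrared photodetector at a given orientation pair):

```python
def find_glint_orientation(first_range, second_range, read_intensity, threshold):
    """Nested sweep of two orthogonal scan mirrors: for each orientation of the
    first mirror, the second mirror sweeps its full range. Returns the
    (first, second) orientation pair at which the detected reflection first
    exceeds `threshold` (the glint), or None if no reflection exceeds it.
    """
    for a in first_range:            # first scan mirror sweeps
        for b in second_range:       # second mirror sweeps per step of first
            if read_intensity(a, b) > threshold:
                return (a, b)        # orientation combination of acts 105/106
    return None
```

The returned pair is what act 107 would then map to a region in the user's field of view.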
- method 100 may be particularly advantageous when implemented in a WHUD that employs a SLP because in such an implementation the eye tracking (i.e., gaze direction detection) functionality of method 100 may be achieved with minimal hardware additions (and correspondingly minimal bulk and impact on aesthetic design) to the WHUD.
- method 100 may be extended to include a projection of display content to the user and a determination of where in the display content the user's gaze is directed.
- the infrared laser diode and the at least one scan mirror of method 100 may be components of a SLP, and the SLP may further include at least one additional laser diode to generate visible laser light.
- method 100 may be extended to include projecting visible display content in the field of view of the eye of the user by the SLP and, at 107 , the processor may determine a region of the visible display content at which the gaze of the eye is directed based on the orientation of the at least one scan mirror that corresponds to the at least one detected reflection for which the intensity exceeds the threshold value.
- the processor may determine a region of the visible display content at which the gaze of the eye is directed by performing a transformation between a set of scan mirror orientations and a set of regions of the visible display content. In other words, the processor may effect a mapping between “mirror orientation space” (or “glint space,” as previously described) and “display space.”
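Because the same scan mirror raster-scans the display content, a mirror orientation corresponds approximately linearly to a display pixel, so the mapping between "mirror orientation space" and "display space" can be sketched as a linear rescaling (the orientation ranges and display resolution below are illustrative assumptions):

```python
def orientation_to_pixel(orient, h_range, v_range, width, height):
    """Map a scan-mirror orientation pair (degrees) to a display pixel.

    Linearly rescales each orientation axis onto the display resolution and
    clamps to the display bounds. A real system would calibrate this mapping
    rather than assume perfect linearity.
    """
    (h, v) = orient
    (h0, h1), (v0, v1) = h_range, v_range
    col = round((h - h0) / (h1 - h0) * (width - 1))
    row = round((v - v0) / (v1 - v0) * (height - 1))
    return (max(0, min(width - 1, col)), max(0, min(height - 1, row)))
```

With this, the orientation identified at the glint yields the region of the visible display content at which the gaze is directed.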
- a position of the at least one infrared sensor (e.g., the at least one infrared photodetector) relative to the eye of the user is an important design parameter that may influence the overall performance of method 100 and against which various acts of method 100 (e.g., acts 104 and 105 ) must be calibrated.
- the at least one infrared sensor may be positioned at a first position that corresponds to a periphery of the field of view of the eye of the user when the eye is gazing straight ahead (such that the at least one infrared photodetector does not obstruct the user's field of view when the user is gazing straight ahead).
- In order to maximize the resolution and overall performance of the gaze detection achieved by method 100, it can be advantageous for the at least one infrared photodetector to be positioned and oriented such that it has maximal "visibility" of the gaze directions of interest. In some exemplary implementations, this may generally be achieved by positioning the at least one infrared photodetector as close as possible to the center of the range of gaze directions of interest without obscuring the user's field of view.
- FIG. 2 is an illustrative diagram showing a WHUD 200 that includes a SLP 210 with an integrated eye tracking functionality in accordance with the present systems, devices, and methods.
- SLP 210 comprises a laser module 211 that includes a red laser diode (labelled "R" in FIG. 2), a green laser diode (labelled "G" in FIG. 2), and a blue laser diode (labelled "B" in FIG. 2), and a scan mirror 212 (a single mirror is illustrated for simplicity, though as previously described at least two orthogonally-orientable mirrors may be used).
- laser module 211 also includes an infrared laser diode (labelled “IR” in FIG. 2 ) for use in eye tracking/gaze detection.
- Scan mirror 212 simultaneously serves as both the scan mirror for laser projection and a scan mirror for eye tracking, whereby scan mirror 212 scans infrared laser light (represented by dashed lines 222 in FIG. 2 ) over the area of eye 290 to sequentially illuminate an area of eye 290 (e.g., via a raster scan of IR light).
- an infrared laser diode is integrated into laser module 211 of SLP 210 and scan mirror 212 serves to scan both visible (R, G, and/or B) and infrared (IR) laser light over eye 290 .
- Scan mirror 212 may advantageously include one or multiple (e.g., in a DLP configuration) digital microelectromechanical systems (“MEMS”) mirror(s).
- scan mirror 212 of SLP 210 repeatedly scans over its entire range of orientations and effectively scans over the entire field of view of the display. Whether or not an image/pixel is projected at each scan orientation depends on controlled modulation of laser module 211 and its synchronization with scan mirror 212 .
- the fact that scan mirror 212 generally scans over its entire range during operation as a laser projector makes scan mirror 212 of SLP 210 compatible with use for eye tracking purposes.
- SLP 210 is adapted to provide eye tracking functionality without having to compromise or modify its operation as a SLP.
- scan mirror 212 repeatedly scans over its entire range of orientations while the RGB laser diodes are modulated to provide the visible light 221 corresponding to pixels of a scanned image or, generally, “display content.”
- the infrared laser diode may be activated to illuminate the user's eye 290 (one spot or pixel at a time, each corresponding to a respective scan mirror orientation) with infrared laser light 222 for eye tracking purposes.
- the infrared laser diode may simply be on at all times to completely illuminate (i.e., scan over the entire area of) eye 290 with infrared laser light 222 or the infrared laser diode may be modulated to provide an illumination pattern (e.g., a grid, a set of parallel lines, a crosshair, or any other shape/pattern) on eye 290 . Because infrared laser light 222 is invisible to eye 290 of the user, infrared laser light 222 does not interfere with the scanned image being projected by SLP 210 .
- WHUD 200 includes at least one infrared photodetector 250. While only one photodetector 250 is depicted in FIG. 2, in alternative embodiments any number of photodetectors 250 may be used (e.g., an array of photodetectors 250, or a charge-coupled-device-based camera that is responsive to light in the infrared wavelength range), positioned in any arrangement and at any desired location(s) depending on the implementation.
- scan mirror 212 scans modulated R, G, and/or B light 221 over eye 290 to produce display content based on modulation of the R, G, and/or B laser diodes
- scan mirror 212 also scans infrared laser light 222 over eye 290 based on modulation of the IR laser diode.
- Photodetector 250 detects an intensity pattern or map of reflected infrared laser light 222 that depends on the position/orientation of eye 290 .
- each distinct orientation of scan mirror 212 may result in a respective intensity of infrared laser light 222 being detected by photodetector 250 that depends on the position/orientation of eye 290 (or the position/orientation of feature(s) of eye 290 , such as the cornea, iris, pupil, and so on).
- the intensity pattern/map detected by photodetector 250 depends on where eye 290 is looking. In this way, the same SLP 210 in WHUD 200 enables both i) image projection, and ii) the gaze direction and movements of eye 290 to be measured and tracked.
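The intensity pattern/map described above can be modeled as a dictionary keyed by mirror orientation, with the glint taken as the strongest detected reflection. A minimal sketch, not the patent's implementation:

```python
def glint_orientation(intensity_map):
    """Given a map {mirror_orientation: detected_intensity} built during one
    sweep of scan mirror 212, return the orientation with the strongest
    reflection, which is taken here as the glint."""
    return max(intensity_map, key=intensity_map.get)
```

As the eye moves, the orientation of the maximum shifts, which is what allows gaze direction and movements to be tracked from sweep to sweep.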
- WHUD 200 includes a HOE 230 that redirects laser light output from the laser module 211 of SLP 210 towards eye 290; however, in WHUD 200, HOE 230 includes at least two wavelength-multiplexed holograms: at least a first hologram 231 that is responsive to (i.e., redirects at least a portion of, the magnitude of the portion depending on the playback efficiency of the first hologram) the visible light 221 output by laser module 211 and unresponsive to (i.e., transmits) the infrared light 222 output by laser module 211, and a second hologram 232 that is responsive to (i.e., redirects at least a portion of, the magnitude of the portion depending on the playback efficiency of the second hologram) the infrared light 222 output by laser module 211 and unresponsive to (i.e., transmits) the visible light 221 output by laser module 211.
- FIG. 2 depicts first hologram 231 as a single hologram
- the aspect(s) of HOE 230 that is/are responsive to the visible light 221 output by laser module 211 may include any number of holograms that may be multiplexed in a variety of different ways, including without limitation: wavelength multiplexed (i.e., a “red” hologram that is responsive to only red light from the red laser diode of laser module 211 , a “green” hologram that is responsive to only green light from the green laser diode of laser module 211 , and a “blue” hologram that is responsive to only blue light from the blue laser diode of laser module 211 ), angle multiplexed (e.g., for the purpose of eye box expansion/replication), phase multiplexed, spatially multiplexed, temporally multiplexed, and so on.
- first hologram 231 may apply a first optical power to visible light 221 .
- the first optical power applied by first hologram 231 may be a positive optical power that focuses or converges the visible light 221 to, for example, an exit pupil having a diameter less than one centimeter (e.g., 6 mm, 5 mm, 4 mm, 3 mm) at the eye 290 of the user for the purpose of providing a clear and focused image with a wide field of view.
- second hologram 232 may apply a second optical power to infrared light 222 , where the second optical power applied by second hologram 232 is different from the first optical power applied by first hologram 231 .
- the first optical power may be greater than the second optical power (and therefore, the second optical power may be less than the first optical power) so that second hologram 232 redirects infrared light 222 over an area of eye 290 that is larger than the exit pupil of visible light 221 at eye 290 .
- the first hologram that is responsive to the visible light may converge the visible light to a first exit pupil at the eye of the user and the second hologram that is responsive to the infrared light may converge the infrared light to a second exit pupil at the eye of the user, where the first exit pupil is completely contained within the second exit pupil at the eye of the user.
- the second optical power of second hologram 232 may apply a rate of convergence to infrared light 222 that is less than the rate of convergence applied to visible light 221 by the first optical power of first hologram 231 , or the second optical power may be zero such that second hologram 232 redirects infrared light 222 towards eye 290 without applying any convergence thereto, or the second optical power may be negative (i.e., less than zero) so that the second optical power of second hologram 232 causes infrared light 222 to diverge (i.e., applies a rate of divergence thereto) to cover, for example, the entire area of eye 290 (and beyond, if desired) for the purpose of illuminating a large area of eye 290 and tracking all eye positions/motions within that illuminated area.
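The effect of the different optical powers can be illustrated with a thin-lens approximation: a collimated beam redirected by an element of optical power P (in diopters) focuses at f = 1000/P mm, so the illuminated patch at the eye shrinks for positive power and grows for negative power. A sketch under that simplifying assumption (all numbers illustrative):

```python
def illuminated_diameter_mm(aperture_mm, power_diopters, eye_distance_mm):
    """Approximate diameter of the illuminated patch at the eye for a
    collimated beam of width `aperture_mm` redirected by an element of the
    given optical power (thin-lens approximation).

    Positive power focuses the beam at f = 1000/P mm, shrinking the patch
    toward that focus; zero power leaves the beam width unchanged (plane
    mirror behavior); negative power makes the beam diverge.
    """
    if power_diopters == 0:
        return float(aperture_mm)
    f_mm = 1000.0 / power_diopters
    return abs(aperture_mm * (1.0 - eye_distance_mm / f_mm))
```

This is consistent with the arrangement above: a higher positive power for visible light 221 yields a small exit pupil, while zero or negative power for infrared light 222 illuminates a larger area of eye 290.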
- HOE 230 may comprise a single volume of holographic material (e.g., photopolymer or a silver halide compound) that encodes, carries, has embedded therein or thereon, or generally includes both first hologram 231 and second hologram 232 , or alternatively HOE 230 may comprise at least two distinct layers of holographic material (e.g., photopolymer and/or a silver halide compound) that are laminated or generally layered together, a first layer of holographic material that includes first hologram 231 and a second layer of holographic material that includes second hologram 232 . More details of an exemplary multiplexed HOE are described later on with reference to FIG. 3 . In alternative implementations, other optics (such as waveguides, grating structures, and combinations thereof) may substitute for HOE 230 to achieve similar functionality.
- infrared light is advantageous in eye tracking systems because infrared light is invisible to the (average) human eye and so does not disrupt or interfere with other optical content being displayed to the user.
- Integrating an infrared laser diode into a SLP in accordance with the present systems, devices, and methods, enables visible laser projection and invisible eye tracking to be simultaneously performed by substantially the same hardware of a WHUD, thereby minimizing overall bulk and processing/power requirements of the system.
- FIG. 3 is an illustrative diagram showing a side view of a WHUD 300 that includes a wavelength-multiplexed HOE 330 that enables both image projection and eye tracking functionality in accordance with the present systems, devices, and methods.
- WHUD 300 is substantially similar to WHUD 200 from FIG. 2 with some details of HOE 230 enhanced for the purpose of illustration.
- WHUD 300 includes a SLP 310 adapted to include an infrared laser diode (labeled as “IR” in FIG. 3 ) for eye tracking purposes and a transparent combiner comprising a wavelength-multiplexed HOE 330 integrated with (e.g., laminated or otherwise layered upon, or cast within) an eyeglass lens 360 .
- Integration of HOE 330 with lens 360 may include and/or employ the systems, devices, and methods described in U.S. Non-Provisional patent application Ser. No. 15/256,148 and/or U.S. Provisional Patent Application Ser. No. 62/268,892.
- HOE 330 is wavelength-multiplexed to respond differently (i.e., apply a different optical power to) different wavelengths of light incident thereon. More specifically, HOE 330 is a heterogeneous HOE including at least a first hologram that applies a first optical power to light 321 having a first wavelength (e.g., at least a first visible wavelength) and a second hologram that applies a second optical power to light 322 having a second wavelength (e.g., an infrared wavelength). The second optical power is different from the first optical power and the second wavelength is different from the first wavelength.
- HOE 330 may include any number of layers of holographic material (e.g., photopolymer, or a silver halide compound) carrying, encoding, containing, or otherwise including any number of holograms.
- a single layer of holographic material may include multiple holograms and/or individual holograms may be included on or in respective individual layers of holographic material.
- the “light having a first wavelength” and the “light having a second wavelength” respectively correspond to visible laser light 321 and infrared laser light 322 , both output by SLP 310 .
- SLP 310 outputs visible laser light 321 (represented by solid lines in FIG. 3 ) for the purpose of image projection and infrared laser light 322 (represented by dashed lines in FIG. 3 ) for the purpose of eye tracking.
- the visible laser light 321 may include light having at least one wavelength (e.g., red, green, or blue; or any combination of red, green, and/or blue) in the range of about 390 nm to about 700 nm and the infrared laser light 322 may include light having at least one wavelength in the range of about 700 nm to about 10 um.
- wavelength-multiplexed HOE 330 redirects visible laser light 321 towards eye 390 in a different way from how wavelength-multiplexed HOE 330 redirects infrared laser light 322 towards eye 390 .
- Wavelength-multiplexed HOE 330 includes i) at least a first hologram that is responsive to (i.e., redirects and applies a first optical power to) visible laser light 321 (i.e., light having at least a first wavelength in the visible spectrum) and redirects it towards eye 390, and ii) a second hologram that is responsive to (i.e., redirects and applies a second optical power to) infrared laser light 322 (i.e., light having a second wavelength in the infrared spectrum) and redirects it towards eye 390.
- the first optical power (i.e., the optical power applied to the visible laser light 321 by at least a first hologram of wavelength-multiplexed HOE 330 ) has a first positive magnitude so that the at least a first hologram in wavelength-multiplexed HOE 330 causes the visible laser light 321 to converge to a first exit pupil at or near the eye 390 of the user. This convergence is advantageous to enable the user to see displayed content with a reasonable field of view. Because wavelength-multiplexed HOE 330 is integrated with lens 360 , wavelength-multiplexed HOE 330 may be positioned proximate eye 390 and the first optical power may be relatively high (e.g., greater than or equal to about 40 diopters) in order to provide the necessary convergence.
- the second optical power (i.e., the optical power applied to the infrared laser light 322 by the second hologram of wavelength-multiplexed HOE 330) is less than the first optical power applied to the visible light 321 by the at least a first hologram of wavelength-multiplexed HOE 330.
- the second optical power applied by the second hologram of wavelength-multiplexed HOE 330 is positive and less than the first optical power applied by the at least a first hologram of wavelength-multiplexed HOE 330 (e.g., less than about 40 diopters; enough to reduce a divergence of, collimate, or converge) such that the infrared light 322 converges to an exit pupil that has a larger diameter at eye 390 than the exit pupil of the visible light 321 and fully contains the exit pupil of the visible light 321 .
- the second optical power applied by the second hologram may be zero or negative so that the second hologram of wavelength-multiplexed HOE 330 redirects the infrared laser light 322 towards eye 390 without convergence (i.e., as from a plane mirror) or causes it to diverge.
- the second optical power may be less than or equal to about 0 diopters.
- the at least a first hologram in wavelength-multiplexed HOE 330 that is responsive to visible light may include any number of wavelength-multiplexed holograms, each of which may be responsive to a respective wavelength or respective range of wavelengths of visible light.
- the at least a first hologram in wavelength-multiplexed HOE 330 that is responsive to visible light may include a red hologram that is responsive to red light provided by SLP 310 , a green hologram that is responsive to green light provided by SLP 310 , and/or a blue hologram that is responsive to blue light provided by SLP 310 .
- each hologram that is responsive to visible light included in the at least a first hologram of wavelength-multiplexed HOE 330 may apply that same first optical power to the particular visible light to which the hologram is responsive.
- the integration of eye tracking functionality in a WHUD that already employs a SLP and a holographic combiner for display purposes may, in accordance with the present systems, devices, and methods, be achieved by mostly discreetly adapting existing hardware components as opposed to adding the bulk of many new components.
- i) an infrared laser diode may be added to the SLP (the infrared diode modulated independently of the visible light diode(s) in the projector), ii) an infrared hologram may be added to the holographic combiner (the infrared hologram applying a lower optical power (including zero or negative optical power) to the infrared laser light in order to cover a large eye area, in contrast to the relatively large optical power applied by the holographic combiner to the visible laser light), and iii) at least one infrared photodetector may be added to the WHUD to monitor reflections of the infrared laser light from the eye of the user.
- both the first hologram and the second hologram of wavelength-multiplexed HOE 330 may be included in or on a single layer of holographic material (e.g., film) or, alternatively, the first hologram may be included in or on a first layer of holographic material and the second hologram may be included in or on a second layer of holographic material. In the latter case, the first layer of holographic material and the second layer of holographic material may be laminated or otherwise layered together either directly or through any number of intervening layers/materials.
- wavelength-multiplexed HOE 330 may include any number of additional holograms distributed over any number of layers.
- wavelength-multiplexed HOE 330 may include a first hologram that is responsive to a red component of visible laser light 321 , a second hologram that is responsive to infrared laser light 322 , a third hologram that is responsive to a green component of visible laser light 321 , and a fourth hologram that is responsive to a blue component of visible laser light 321 .
- the first, third, and fourth holograms may each apply a same first optical power to the respective visible light to which each hologram is responsive and the second hologram may apply a second optical power to the infrared light.
- an eye tracking system may include one or more digital processor(s) communicatively coupled to the one or more infrared photodetector(s) and to one or more non-transitory processor-readable storage medium(ia) or memory(ies).
- the memory(ies) may store processor-executable instructions and/or data that, when executed by the processor, enable the processor to determine the position and/or motion of an eye of the user, or the gaze direction of the eye of the user, based on information (e.g., intensity information, such as an intensity pattern/map) provided by the one or more photodetector(s).
- FIG. 4 is a perspective view of a WHUD 400 that integrates eye tracking and scanning laser projection in accordance with the present systems, devices, and methods.
- WHUD 400 includes many of the elements depicted in FIGS. 2 and 3, namely: a SLP 410 comprising laser module 411 with at least one visible laser diode (e.g., a red laser diode, a green laser diode, a blue laser diode, or any combination thereof) to output a visible laser light 421 and an infrared laser diode to output infrared laser light 422, at least one scan mirror 412 aligned to receive laser light output from the laser module 411 and controllably orientable to reflect (i.e., scan) the laser light, a wavelength-multiplexed HOE 430 aligned to redirect the laser light 421 and 422 towards an eye 490 of a user, and an infrared photodetector 450 positioned to detect reflections of the infrared laser light 422 from the eye 490 of the user.
- the visible laser light 421 may correspond to any of, either alone or in any combination, a red laser light, a green laser light, and/or a blue laser light.
- WHUD 400 also includes a support frame 480 that has a general shape and appearance of a pair of eyeglasses.
- Support frame 480 carries SLP 410, photodetector 450, and wavelength-multiplexed HOE 430 so that HOE 430 is positioned within a field of view of the eye 490 of the user when support frame 480 is worn on a head of the user.
- in some implementations, other optics, such as waveguide structures and/or grating structures, may be employed in place of or in addition to HOE 430 to route the laser light towards the eye of the user.
- Support frame 480 of WHUD 400 also carries a digital processor 460 communicatively coupled to SLP 410 and photodetector 450, and a non-transitory processor-readable storage medium or memory 470 communicatively coupled to digital processor 460.
- Memory 470 stores data and/or processor-executable instructions 471 that, when executed by processor 460, cause WHUD 400 to perform method 100 from FIG. 1.
- data and/or processor-executable instructions 471, when executed by processor 460, cause WHUD 400 to: generate an infrared laser light 422 by the infrared laser diode of SLP 410; scan the infrared laser light 422 over the eye 490 of the user (either directly or indirectly via one or more intervening optics such as an HOE or waveguide) by the at least one scan mirror 412, wherein scanning the infrared laser light 422 over the eye 490 of the user by the at least one scan mirror 412 includes sweeping the at least one scan mirror 412 through a range of orientations and, for a plurality of orientations of the at least one scan mirror 412, reflecting the infrared laser light 422 to a respective region of the eye 490 of the user; detect reflections 423 of the infrared laser light 422 from the eye 490 of the user by infrared photodetector 450; and determine, by processor 460, the orientation of the at least one scan mirror 412 that corresponds to at least one detected reflection 423 for which an intensity exceeds a threshold value.
- memory 470 further stores data and/or processor-executable instructions that, when executed by processor 460, cause WHUD 400 to project visible display content 431 in the field of view of the eye 490 of the user by SLP 410 (in conjunction with HOE 430).
- data and/or processor-executable instructions 471, when executed by processor 460, may cause WHUD 400 to determine, by the processor 460, a region in a field of view of the eye 490 of the user at which a gaze of the eye 490 is directed, based on the orientation of the at least one scan mirror 412 that corresponds to the at least one detected reflection 423 for which the intensity exceeds the threshold value; for example, by causing WHUD 400 to determine, by the processor 460, a region of the visible display content 431 at which the gaze of the eye 490 is directed.
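The scan-and-detect sequence described above (sweep the scan mirror through a range of orientations, sample the infrared photodetector at each orientation, and map the orientation that produced a super-threshold reflection to a gaze region) can be sketched as follows. The function names (`set_mirror_orientation`, `read_photodetector`) and the coarse region mapping are hypothetical stand-ins for hardware and display geometry, not APIs from the patent.

```python
# Sketch of the scan loop: sweep the mirror over a grid of orientations,
# read the IR photodetector at each, keep the brightest super-threshold
# reflection (the "glint"), and map its mirror orientation to a gaze region.
import numpy as np

GAZE_REGIONS = ["left", "center", "right"]  # coarse display regions (illustrative)

def track_gaze(set_mirror_orientation, read_photodetector,
               h_angles, v_angles, threshold):
    """Return the gaze region for the mirror orientation whose detected
    reflection intensity exceeds `threshold`, or None if no glint is seen."""
    best = None  # (intensity, h_angle, v_angle) of the strongest glint so far
    for h in h_angles:
        for v in v_angles:
            set_mirror_orientation(h, v)      # orient the scan mirror
            intensity = read_photodetector()  # detected IR reflection intensity
            if intensity > threshold and (best is None or intensity > best[0]):
                best = (intensity, h, v)
    if best is None:
        return None  # no detected reflection exceeded the threshold
    # Map the glint-producing horizontal angle onto a coarse gaze region.
    _, h, _ = best
    idx = int(np.interp(h, [h_angles[0], h_angles[-1]],
                        [0, len(GAZE_REGIONS) - 1]) + 0.5)
    return GAZE_REGIONS[idx]
```

In practice the orientation-to-region mapping would be a calibrated transformation (the claims mention a transformation/mapping) rather than the linear interpolation used here for illustration.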
- infrared photodetector 450 may advantageously be positioned on support frame 480 at a periphery of the field of view of the eye 490 of the user when the eye 490 is gazing straight ahead (e.g., on the rims of frame 480 that surround the eyeglass lens that carries HOE 430).
- the data and/or processor-executable instructions that, when executed by the processor 460, cause WHUD 400 to project visible display content 431 in the field of view of the eye 490 of the user by the SLP 410 may advantageously cause the SLP 410 to position the visible display content 431 away-from-center in the field of view of the eye 490 of the user and towards the position of the at least one infrared photodetector 450 at the periphery of the field of view of the eye 490 of the user, as depicted in the exemplary implementation of FIG. 4.
- throughout FIGS. 2, 3, and 4, as well as the appended claims, reference is often made to the eye of the user (e.g., eye 290 in FIG. 2, eye 390 in FIG. 3, and eye 490 in FIG. 4).
- the systems, devices, and methods described herein are suitable for use in association with at least one eye of a user (e.g., 290, 390, or 490) but do not themselves include the eye of the user; that is, eye 290 is not a part of WHUD 200, eye 390 is not a part of WHUD 300, and eye 490 is not a part of WHUD 400.
- the various embodiments described herein measure, sense, detect, identify, or otherwise determine the intensity of detected infrared reflections and use this information to identify when the intensity of a detected infrared reflection exceeds a threshold value.
- the threshold value may be a certain percentage above a baseline detection value, such as 10% above, 50% above, 100% above, 500% above, 1000% above, or so on depending on the specific implementation.
- a detected infrared reflection that exceeds the threshold value is used herein because such a reflection generally corresponds to a specular reflection from the eye of the user known as the first Purkinje image, or “glint.”
- the glint provides a useful, reliable, and sufficient detection feature for the purpose of determining the gaze direction of the eye of the user; thus, in method 100 only detected reflections that correspond to glints are used to determine the gaze direction of the eye of the user.
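As a minimal sketch of the thresholding rule above: the threshold is set a fixed percentage above a baseline detection value, and only reflections exceeding it are treated as glints. Using the median of recent intensity samples as the baseline estimator is an assumption for illustration; the patent does not prescribe one.

```python
# Glint thresholding: threshold = baseline * (1 + percent_above / 100).
# The percentage (e.g., 10%, 50%, 100%, 500%, 1000% above baseline) is
# implementation-specific, per the description above.
from statistics import median

def glint_threshold(samples, percent_above):
    """Compute the glint threshold from a baseline detection value,
    here estimated (as an assumption) as the median of recent samples."""
    baseline = median(samples)
    return baseline * (1.0 + percent_above / 100.0)

def is_glint(intensity, samples, percent_above=500):
    """True if a detected reflection intensity exceeds the threshold."""
    return intensity > glint_threshold(samples, percent_above)
```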
- the entire collection of detected reflections of the infrared laser light from the eye of the user can be useful in other applications.
- acts 101, 102, 103, and 104 may be employed to produce a complete infrared image of the eye of the user, at a resolution determined, at least in part, by the step size between orientations of the at least one scan mirror.
- This infrared image may be used for more detailed (and more computationally intensive) eye tracking and gaze detection purposes, or for other purposes such as user authentication via iris or retinal blood vessel recognition, or pupil/iris size detection that may be used to infer information about the user's environment, such as ambient light brightness levels. That is, conventional techniques and algorithms for iris recognition and/or retinal blood vessel recognition (which typically use visible light and color photography or videography) may be adapted to employ scanned infrared laser light and infrared images of the eye of the user generated by performing acts 101, 102, 103, and 104 of method 100 (together with further acts of data processing to produce an infrared image and image processing to achieve recognition).
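A sketch of how acts 101 through 104 could yield a complete infrared image: each scan-mirror orientation contributes one sample, so image resolution follows from the step size between orientations. The function below and its sample format are illustrative assumptions, not from the patent.

```python
# Assemble the full set of detected IR reflections into a 2-D image:
# one pixel per scan-mirror orientation, indexed by the orientation's
# position in the (row, column) sweep grid.
import numpy as np

def build_ir_image(samples, n_rows, n_cols):
    """`samples` is a flat sequence of (row_index, col_index, intensity)
    tuples gathered while sweeping the mirror; returns a 2-D intensity
    image suitable for iris/retina recognition or pupil-size analysis."""
    image = np.zeros((n_rows, n_cols))
    for r, c, intensity in samples:
        image[r, c] = intensity
    return image
```

Halving the step size between mirror orientations doubles `n_rows`/`n_cols` and quadruples the pixel count, which is the resolution trade-off noted above.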
- eye tracking systems and devices described herein may, in some implementations, make use of additional or alternative “Purkinje images” (i.e., other than the “glint”) and/or may employ the “corneal shadow based” methods of eye tracking described in U.S. Non-Provisional patent application Ser. No. 15/331,204.
- the WHUDs described herein may include one or more sensor(s) (e.g., microphone, camera, thermometer, compass, and/or others) for collecting data from the user's environment.
- one or more camera(s) may be used to provide feedback to the processor of the wearable heads-up display and influence where on the transparent display(s) any given image should be displayed.
- the WHUDs described herein may include one or more on-board power sources (e.g., one or more battery(ies)), a wireless transceiver for sending/receiving wireless communications, and/or a tethered connector port for coupling to a computer and/or charging the one or more on-board power source(s).
- the term “communicative,” as in “communicative pathway” and “communicative coupling,” and in variants such as “communicatively coupled,” is generally used to refer to any engineered arrangement for transferring and/or exchanging information.
- exemplary communicative pathways include, but are not limited to, electrically conductive pathways (e.g., electrically conductive wires, electrically conductive traces), magnetic pathways (e.g., magnetic media), and/or optical pathways (e.g., optical fiber), and exemplary communicative couplings include, but are not limited to, electrical couplings, magnetic couplings, and/or optical couplings.
- infinitive verb forms are often used; examples include, without limitation: “to detect,” “to provide,” “to transmit,” “to communicate,” “to process,” “to route,” and the like. Unless the specific context requires otherwise, such infinitive verb forms are used in an open, inclusive sense, that is, as “to, at least, detect,” “to, at least, provide,” “to, at least, transmit,” and so on.
- logic or information can be stored on any processor-readable medium for use by or in connection with any processor-related system or method.
- a memory is a processor-readable medium that is an electronic, magnetic, optical, or other physical device or means that contains or stores a computer and/or processor program.
- Logic and/or the information can be embodied in any processor-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions associated with logic and/or information.
- a “non-transitory processor-readable medium” can be any element that can store the program associated with logic and/or information for use by or in connection with the instruction execution system, apparatus, and/or device.
- the processor-readable medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device.
- more specific examples of the computer-readable medium would include the following: a portable computer diskette (magnetic, compact flash card, secure digital, or the like), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), a portable compact disc read-only memory (CDROM), digital tape, and other non-transitory media.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Eye Examination Apparatus (AREA)
Abstract
Description
Claims (12)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/827,667 US10409057B2 (en) | 2016-11-30 | 2017-11-30 | Systems, devices, and methods for laser eye tracking in wearable heads-up displays |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662428320P | 2016-11-30 | 2016-11-30 | |
US15/827,667 US10409057B2 (en) | 2016-11-30 | 2017-11-30 | Systems, devices, and methods for laser eye tracking in wearable heads-up displays |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180149863A1 (en) | 2018-05-31
US10409057B2 (en) | 2019-09-10
Family
ID=62190796
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/827,667 Expired - Fee Related US10409057B2 (en) | 2016-11-30 | 2017-11-30 | Systems, devices, and methods for laser eye tracking in wearable heads-up displays |
US15/827,675 Expired - Fee Related US10459220B2 (en) | 2016-11-30 | 2017-11-30 | Systems, devices, and methods for laser eye tracking in wearable heads-up displays |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/827,675 Expired - Fee Related US10459220B2 (en) | 2016-11-30 | 2017-11-30 | Systems, devices, and methods for laser eye tracking in wearable heads-up displays |
Country Status (3)
Country | Link |
---|---|
US (2) | US10409057B2 (en) |
CA (1) | CA3045192A1 (en) |
WO (1) | WO2018098579A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190196195A1 (en) * | 2017-12-11 | 2019-06-27 | North Inc. | Wavelength combiner photonic integrated circuit with grating coupling of lasers |
US10877268B2 (en) * | 2019-04-16 | 2020-12-29 | Facebook Technologies, Llc | Active control of in-field light sources of a head mounted display |
US10948729B2 (en) | 2019-04-16 | 2021-03-16 | Facebook Technologies, Llc | Keep-out zone for in-field light sources of a head mounted display |
US11314085B2 (en) * | 2019-02-14 | 2022-04-26 | Thales | Viewing device comprising a pupil expander including two mirrors |
US11428930B2 (en) * | 2019-08-07 | 2022-08-30 | Meta Platforms Technologies, Llc | Stray light suppression in eye-tracking imaging |
US12231613B2 (en) | 2019-11-06 | 2025-02-18 | Hes Ip Holdings, Llc | System and method for displaying an object with depths |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10832051B1 (en) | 2016-06-13 | 2020-11-10 | Facebook Technologies, Llc | Eye tracking using optical coherence methods |
US10366674B1 (en) | 2016-12-27 | 2019-07-30 | Facebook Technologies, Llc | Display calibration in electronic displays |
CN108701363B (en) * | 2017-07-07 | 2021-06-29 | 广东虚拟现实科技有限公司 | Method, apparatus and system for object recognition and tracking using multiple cameras |
US20190018481A1 (en) * | 2017-07-17 | 2019-01-17 | Thalmic Labs Inc. | Dynamic calibration systems and methods for wearable heads-up displays |
US10564429B2 (en) * | 2018-02-01 | 2020-02-18 | Varjo Technologies Oy | Gaze-tracking system using illuminators emitting different wavelengths |
US10551914B2 (en) * | 2018-02-09 | 2020-02-04 | Microsoft Technology Licensing, Llc | Efficient MEMs-based eye tracking system with a silicon photomultiplier sensor |
US10627899B2 (en) | 2018-02-09 | 2020-04-21 | Microsoft Technology Licensing, Llc | Eye tracking system for use in a visible light display device |
US10963046B1 (en) * | 2018-05-17 | 2021-03-30 | Facebook Technologies, Llc | Drift corrected eye tracking |
US11238143B2 (en) * | 2018-06-05 | 2022-02-01 | Google Llc | Method and system for authenticating a user on a wearable heads-up display |
US11406257B2 (en) | 2018-07-27 | 2022-08-09 | Welch Allyn, Inc. | Vision screening device and methods |
EP3627194A1 (en) | 2018-09-20 | 2020-03-25 | Essilor International | An optical device with reduced reflection in deep red, near infrared and visible ranges |
US10859837B2 (en) * | 2018-09-21 | 2020-12-08 | Google Llc | Optical combiner lens for wearable heads-up display |
CN111176592A (en) * | 2018-11-09 | 2020-05-19 | 中兴通讯股份有限公司 | Terminal, content display method and device and computer readable storage medium |
US11740460B2 (en) * | 2018-11-29 | 2023-08-29 | Apple Inc. | Optical systems with multi-layer holographic combiners |
US20190146223A1 (en) * | 2018-12-21 | 2019-05-16 | Tuotuo LI | Multifocal dynamic lens for head mounted display |
US10831032B2 (en) * | 2019-02-28 | 2020-11-10 | Microsoft Technology Licensing, Llc | Photo-sensing reflectors for compact display module assembly |
- US11276986B2 (en) | 2019-02-28 | 2022-03-15 | Microsoft Technology Licensing, LLC | Photo-sensing reflectors for compact display module assembly comprising a reflective coating on a light receiving surface of a reflective photodiode |
US10832052B2 (en) | 2019-03-04 | 2020-11-10 | Microsoft Technology Licensing, Llc | IR illumination module for MEMS-based eye tracking |
US11624906B2 (en) | 2019-03-04 | 2023-04-11 | Microsoft Technology Licensing, Llc | IR illumination module for MEMS-based eye tracking |
US10838489B2 (en) * | 2019-03-04 | 2020-11-17 | Microsoft Technology Licensing, Llc | IR illumination module for MEMS-based eye tracking |
US11675190B2 (en) * | 2019-05-10 | 2023-06-13 | Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America | Head up display combined with holographic element for driver monitoring |
US11860371B1 (en) * | 2019-05-30 | 2024-01-02 | Snap Inc. | Eyewear with eye-tracking reflective element |
DE102020206452A1 (en) | 2020-05-25 | 2021-11-25 | Robert Bosch Gesellschaft mit beschränkter Haftung | Projection and oculography device for data glasses and method for operating a projection and oculography device for data glasses |
US11740695B2 (en) * | 2020-08-11 | 2023-08-29 | Inseye Inc. | Eye tracking system for use in head-mounted display units |
FR3115118B1 (en) * | 2020-10-09 | 2022-10-28 | Commissariat Energie Atomique | VIRTUAL OR AUGMENTED REALITY VISION SYSTEM WITH EYE IMAGE SENSOR |
US11914764B2 (en) * | 2021-03-26 | 2024-02-27 | Htc Corporation | Head mounted display device |
US11508275B1 (en) * | 2022-01-18 | 2022-11-22 | Google Llc | Laser energy integrator for display waveguide breakage safety system |
CN117310977A (en) * | 2022-06-21 | 2023-12-29 | 北京七鑫易维信息技术有限公司 | Eyeball tracking optical device, system and virtual reality equipment |
DE102022208721A1 (en) | 2022-08-23 | 2024-02-29 | Robert Bosch Gesellschaft mit beschränkter Haftung | Optical segmentation element and optical system for a virtual retinal display |
US12093450B2 (en) * | 2022-10-25 | 2024-09-17 | Meta Platforms Technologies, Llc | Scanning display with eye-tracking |
US11988828B1 (en) * | 2022-11-16 | 2024-05-21 | Meta Platforms Technologies, Llc | Multi-pupil display and eye-tracking with interferometric sensing |
Citations (138)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3408133A (en) | 1964-01-23 | 1968-10-29 | Electro Optical Systems Inc | Kerr-cell camera shutter |
US3712716A (en) | 1971-04-09 | 1973-01-23 | Stanford Research Inst | Eye tracker |
JPS61198892A (en) | 1985-02-27 | 1986-09-03 | Nec Corp | Display device |
US4978213A (en) | 1987-08-26 | 1990-12-18 | El Hage Sami G | Apparatus for determining the contour of the cornea of a human eye |
US5103323A (en) | 1990-04-18 | 1992-04-07 | Holographic Optics, Inc. | Multi-layer holographic notch filter |
US5231674A (en) * | 1989-06-09 | 1993-07-27 | Lc Technologies, Inc. | Eye tracking method and apparatus |
US5467104A (en) | 1992-10-22 | 1995-11-14 | Board Of Regents Of The University Of Washington | Virtual retinal display |
US5589956A (en) * | 1992-07-31 | 1996-12-31 | Canon Kabushiki Kaisha | Image display apparatus |
US5596339A (en) | 1992-10-22 | 1997-01-21 | University Of Washington | Virtual retinal display with fiber optic point source |
US5742421A (en) | 1996-03-01 | 1998-04-21 | Reflection Technology, Inc. | Split lens video display system |
JPH10319240A (en) | 1997-05-22 | 1998-12-04 | Fuji Xerox Co Ltd | Head-mounted display |
US6008781A (en) | 1992-10-22 | 1999-12-28 | Board Of Regents Of The University Of Washington | Virtual retinal display |
- US6027216A (en) | 1997-10-21 | 2000-02-22 | The Johns Hopkins University School Of Medicine | Eye fixation monitor and tracker |
US6184847B1 (en) | 1998-09-22 | 2001-02-06 | Vega Vista, Inc. | Intuitive control of portable data displays |
US6204829B1 (en) | 1998-02-20 | 2001-03-20 | University Of Washington | Scanned retinal display with exit pupil selected based on viewer's eye position |
US6236476B1 (en) | 1997-04-11 | 2001-05-22 | Korea Institute Of Science And Technology | Apparatus for making a high quality reflection type holographic optical element |
US20010033402A1 (en) | 2000-02-10 | 2001-10-25 | Popovich Milan M. | Switchable hologram and method of producing the same |
US20020003627A1 (en) | 2000-03-13 | 2002-01-10 | Rieder Ronald J. | Doubly-differential interferometer and method for evanescent wave surface detection |
US20020007118A1 (en) | 2000-03-15 | 2002-01-17 | Hideo Adachi | Ultrasonic wave transducer system and ultrasonic wave transducer |
- US6353503B1 (en) | 1999-06-21 | 2002-03-05 | The Microoptical Corporation | Eyeglass display lens system employing off-axis optical design |
US20020030636A1 (en) | 2000-06-26 | 2002-03-14 | Richards Angus Duncan | Virtual reality display device |
US6377277B1 (en) | 1995-08-10 | 2002-04-23 | Sega Enterprises, Ltd. | Virtual image generation apparatus and method |
US20020093701A1 (en) | 2000-12-29 | 2002-07-18 | Xiaoxiao Zhang | Holographic multifocal lens |
US20020120916A1 (en) | 2001-01-16 | 2002-08-29 | Snider Albert Monroe | Head-up display system utilizing fluorescent material |
KR20040006609A (en) | 2002-07-13 | 2004-01-24 | 삼성아이텍 주식회사 | A method and tool for manufacturing polarized light lens |
US20040174287A1 (en) | 2002-11-21 | 2004-09-09 | Deak David G. | Self-contained switch |
US20050012715A1 (en) | 2003-06-12 | 2005-01-20 | Peter Charles Shann Hubird Ford | Method, system, and software for interactive communication and analysis |
US6972734B1 (en) | 1999-06-11 | 2005-12-06 | Canon Kabushiki Kaisha | Mixed reality apparatus and mixed reality presentation method |
US20060110008A1 (en) * | 2003-11-14 | 2006-05-25 | Roel Vertegaal | Method and apparatus for calibration-free eye tracking |
US20060238707A1 (en) | 2002-11-21 | 2006-10-26 | John Elvesjo | Method and installation for detecting and following an eye and the gaze direction thereof |
US20070078308A1 (en) | 2003-10-24 | 2007-04-05 | Lein Applied Diagnostics Limited | Ocular property measuring apparatus and method therefor |
US20070132785A1 (en) | 2005-03-29 | 2007-06-14 | Ebersole John F Jr | Platform for immersive gaming |
US7473888B2 (en) | 1999-08-05 | 2009-01-06 | Microvision, Inc. | Display with compensated light source drive |
US20090109241A1 (en) | 2007-10-26 | 2009-04-30 | Canon Kabushiki Kaisha | Image display system, image display apparatus, and control method thereof |
US20090179824A1 (en) | 2008-01-10 | 2009-07-16 | Canon Kabushiki Kaisha | Image processing method, image processing apparatus, and system |
US20090207464A1 (en) | 2004-11-24 | 2009-08-20 | John David Wiltshire | Holograms and Hologram Fabrication Methods and Apparatus |
US20090258669A1 (en) | 2008-04-15 | 2009-10-15 | Hong Nie | Impulse ultra-wideband radio communication system |
US7640007B2 (en) | 1999-02-12 | 2009-12-29 | Fisher-Rosemount Systems, Inc. | Wireless handheld communicator in a process control environment |
US7637615B2 (en) * | 2004-08-19 | 2009-12-29 | Brother Kogyo Kabushiki Kaisha | Device for tracking pupil of eyeball using intensity changes of reflected light from eyeball and image display using the same |
US20090322653A1 (en) | 2008-06-25 | 2009-12-31 | Samsung Electronics Co., Ltd. | Compact virtual display |
US20100053555A1 (en) | 2008-08-27 | 2010-03-04 | Locarna Systems, Inc. | Method and apparatus for tracking eye movement |
US20100060551A1 (en) * | 2007-09-26 | 2010-03-11 | Keiji Sugiyama | Beam scanning-type display device, method, program and integrated circuit |
US7684105B2 (en) | 2005-02-24 | 2010-03-23 | National Research Council Of Canada | Microblinds and a method of fabrication thereof |
US20100142015A1 (en) | 2008-12-09 | 2010-06-10 | Sony Corporation | Hologram recording film and method of manufacturing same, and image display apparatus |
US20100150415A1 (en) | 2006-11-09 | 2010-06-17 | Optos Plc | Retinal scanning |
US20100149073A1 (en) | 2008-11-02 | 2010-06-17 | David Chaum | Near to Eye Display System and Appliance |
US7747113B2 (en) | 2004-03-29 | 2010-06-29 | Sony Corporation | Optical device and virtual image display device |
US7773111B2 (en) | 2005-03-16 | 2010-08-10 | Lc Technologies, Inc. | System and method for perceived image processing in a gaze tracking system |
US20100239776A1 (en) | 2007-07-25 | 2010-09-23 | Hoya Corporation | Method for producing plastic lens |
US7850306B2 (en) | 2008-08-28 | 2010-12-14 | Nokia Corporation | Visual cognition aware display and visual data transmission architecture |
US7925100B2 (en) | 2007-07-31 | 2011-04-12 | Microsoft Corporation | Tiled packaging of vector image data |
US7927522B2 (en) | 2007-10-25 | 2011-04-19 | Wen-Yi Hsu | Polarized lens and method of making polarized lens |
US20110109880A1 (en) * | 2006-01-26 | 2011-05-12 | Ville Nummela | Eye Tracker Device |
US20120002256A1 (en) | 2009-02-16 | 2012-01-05 | Lilian Lacoste | Laser Based Image Display System |
US8120828B2 (en) | 2006-05-12 | 2012-02-21 | Seereal Technologies S.A. | Reflective optical system, tracking system and holographic projection system and method |
US20120105486A1 (en) * | 2009-04-09 | 2012-05-03 | Dynavox Systems Llc | Calibration free, motion tolerent eye-gaze direction detector with contextually aware computer interaction and communication methods |
US8179604B1 (en) | 2011-07-13 | 2012-05-15 | Google Inc. | Wearable marker for passive interaction |
US8188937B1 (en) | 1999-09-06 | 2012-05-29 | Shimadzu Corporation | Body mounting type display system |
US20120139817A1 (en) | 2009-08-13 | 2012-06-07 | Bae Systems Plc | Head up display system |
US20120169752A1 (en) | 2010-04-28 | 2012-07-05 | Akira Kurozuka | Scanning type image display apparatus |
US20120182309A1 (en) | 2011-01-14 | 2012-07-19 | Research In Motion Limited | Device and method of conveying emotion in a messaging application |
US20120188158A1 (en) | 2008-06-26 | 2012-07-26 | Microsoft Corporation | Wearable electromyography-based human-computer interface |
US20120249797A1 (en) | 2010-02-28 | 2012-10-04 | Osterhout Group, Inc. | Head-worn adaptive display |
US20120290401A1 (en) | 2011-05-11 | 2012-11-15 | Google Inc. | Gaze tracking system |
US20120302289A1 (en) | 2011-05-27 | 2012-11-29 | Kang Heejoon | Mobile terminal and method of controlling operation thereof |
US20130009853A1 (en) | 2011-07-05 | 2013-01-10 | The Board Of Trustees Of The Leland Stanford Junior University | Eye-glasses mounted display |
US8355671B2 (en) | 2008-01-04 | 2013-01-15 | Kopin Corporation | Method and apparatus for transporting video signal over Bluetooth wireless interface |
US20130016292A1 (en) | 2011-07-15 | 2013-01-17 | Google Inc. | Eyepiece for near-to-eye display with multi-reflectors |
US20130016413A1 (en) | 2011-07-12 | 2013-01-17 | Google Inc. | Whole image scanning mirror display system |
US20130051631A1 (en) * | 2011-08-22 | 2013-02-28 | Eyelock Inc. | Systems and methods for capturing artifact free images |
US20130076800A1 (en) * | 2011-09-28 | 2013-03-28 | Michio HATAGI | Scanning projection apparatus and scanning image display |
US20130088413A1 (en) | 2011-10-05 | 2013-04-11 | Google Inc. | Method to Autofocus on Near-Eye Display |
US20130135722A1 (en) | 2011-11-29 | 2013-05-30 | Seiko Epson Corporation | Polarization separation device and display apparatus |
US20130165813A1 (en) | 2011-12-23 | 2013-06-27 | Industrial Technology Research Institute | Sensor for acquiring muscle parameters |
JP2013127489A (en) | 2010-03-29 | 2013-06-27 | Panasonic Corp | See-through display |
US20130169560A1 (en) | 2012-01-04 | 2013-07-04 | Tobii Technology Ab | System for gaze interaction |
US20130198694A1 (en) | 2011-06-10 | 2013-08-01 | Aliphcom | Determinative processes for wearable devices |
JP2013160905A (en) | 2012-02-03 | 2013-08-19 | Denso Corp | Vehicle-mounted head-up display |
US20130215235A1 (en) | 2011-04-29 | 2013-08-22 | Austin Russell | Three-dimensional imager and projection device |
US20130222384A1 (en) | 2010-11-08 | 2013-08-29 | Seereal Technologies S.A. | Display device, in particular a head-mounted display, based on temporal and spatial multiplexing of hologram tiles |
US20130265437A1 (en) | 2012-04-09 | 2013-10-10 | Sony Mobile Communications Ab | Content transfer via skin input |
US8560976B1 (en) | 2012-11-14 | 2013-10-15 | Lg Electronics Inc. | Display device and controlling method thereof |
US20130285901A1 (en) | 2012-04-27 | 2013-10-31 | Dongguk University Industry-Academic Cooperation Foundation | System and method for tracking gaze at distance |
US20130300652A1 (en) | 2011-11-30 | 2013-11-14 | Google, Inc. | Unlocking a Screen Using Eye Tracking Information |
US20130332196A1 (en) | 2012-06-07 | 2013-12-12 | The Government Of The United States As Represented By The Secretary Of The Army | Diabetes Monitoring Using Smart Device |
US20130335302A1 (en) | 2012-06-18 | 2013-12-19 | Randall T. Crane | Selective illumination |
US8634119B2 (en) | 2010-07-09 | 2014-01-21 | Tipd, Llc | System for holography |
US20140045547A1 (en) | 2012-08-10 | 2014-02-13 | Silverplus, Inc. | Wearable Communication Device and User Interface |
US8666212B1 (en) | 2011-04-28 | 2014-03-04 | Google Inc. | Head mounted display using a fused fiber bundle |
US8704882B2 (en) | 2011-11-18 | 2014-04-22 | L-3 Communications Corporation | Simulated head mounted display system and method |
US20140125760A1 (en) | 2012-11-05 | 2014-05-08 | Honeywell International Inc. | Visual system having multiple cameras |
US20140198035A1 (en) | 2013-01-14 | 2014-07-17 | Thalmic Labs Inc. | Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display |
US20140204465A1 (en) * | 2013-01-22 | 2014-07-24 | Denso Corporation | Head-up display device |
US20140202643A1 (en) | 2011-08-31 | 2014-07-24 | Koninklijke Philips N.V. | Light control panel |
US20140204455A1 (en) | 2011-08-24 | 2014-07-24 | Milan Momcilo Popovich | Wearable data display |
US20140226193A1 (en) | 2011-10-25 | 2014-08-14 | National Central University | Optical head-mounted display with mechanical one-dimensional scanner |
US20140232651A1 (en) | 2013-02-15 | 2014-08-21 | Google Inc. | Cascading optics in optical combiners of head mounted displays |
US20140285429A1 (en) | 2013-03-15 | 2014-09-25 | John Castle Simmons | Light Management for Image and Data Control |
WO2014155288A2 (en) | 2013-03-25 | 2014-10-02 | Ecole Polytechnique Federale De Lausanne (Epfl) | Method and apparatus for head worn display with multiple exit pupils |
US20140368896A1 (en) | 2012-03-15 | 2014-12-18 | Panasonic Corporation | Optical reflecting element and actuator |
US8922481B1 (en) | 2012-03-16 | 2014-12-30 | Google Inc. | Content annotation |
US8922898B2 (en) | 2008-09-04 | 2014-12-30 | Innovega Inc. | Molded lens with nanofilaments and related methods |
US20150036221A1 (en) | 2013-08-04 | 2015-02-05 | Robert S. Stephenson | Wide-field head-up display (HUD) eyeglasses |
US8971023B2 (en) | 2012-03-21 | 2015-03-03 | Google Inc. | Wearable computing device frame |
US8970571B1 (en) | 2012-03-13 | 2015-03-03 | Google Inc. | Apparatus and method for display lighting adjustment |
US20150156716A1 (en) | 2013-12-03 | 2015-06-04 | Google Inc. | On-head detection for head-mounted display |
US9086687B2 (en) | 2013-05-07 | 2015-07-21 | Lg Electronics Inc. | Smart watch and method for controlling the same |
US20150205126A1 (en) | 2013-11-27 | 2015-07-23 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
US20150205134A1 (en) | 2014-01-17 | 2015-07-23 | Thalmic Labs Inc. | Systems, articles, and methods for wearable heads-up displays |
WO2015123775A1 (en) | 2014-02-18 | 2015-08-27 | Sulon Technologies Inc. | Systems and methods for incorporating a real image stream in a virtual image stream |
US9135708B2 (en) | 2010-08-09 | 2015-09-15 | National University Corporation Shizuoka University | Gaze point detection method and gaze point detection device |
US20150268821A1 (en) | 2014-03-20 | 2015-09-24 | Scott Ramsby | Selection using eye gaze evaluation over time |
US20150325202A1 (en) | 2014-05-07 | 2015-11-12 | Thalmic Labs Inc. | Systems, devices, and methods for wearable computers with heads-up displays |
US20150362734A1 (en) | 2013-01-28 | 2015-12-17 | Ecole Polytechnique Federale De Lausanne (Epfl) | Transflective holographic film for head worn display |
US20150369920A1 (en) * | 2014-06-20 | 2015-12-24 | Funai Electric Co., Ltd. | Electronic apparatus and method for measuring direction of output laser light |
US20150378162A1 (en) | 2014-06-25 | 2015-12-31 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays |
US20160085084A1 (en) * | 2013-05-10 | 2016-03-24 | Intel Corporation | Projection device |
US20160166146A1 (en) * | 2014-12-11 | 2016-06-16 | Icspi Corp. | Eye-Tracking System and Method Therefor |
US20160202081A1 (en) | 2013-09-04 | 2016-07-14 | Essilor International (Compagnie Générale d'Optique) | Navigation method based on a see-through head-mounted device
US20160238845A1 (en) | 2015-02-17 | 2016-08-18 | Thalmic Labs Inc. | Systems, devices, and methods for eyebox expansion in wearable heads-up displays |
US20160274365A1 (en) | 2015-03-17 | 2016-09-22 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays with heterogeneous display quality |
US20160274758A1 (en) | 2015-03-20 | 2016-09-22 | Thalmic Labs Inc. | Systems, devices, and methods for mitigating false positives in human-electronics interfaces |
US20160327796A1 (en) | 2015-05-04 | 2016-11-10 | Thalmic Labs Inc. | Systems, devices, and methods for eyeboxes with heterogeneous exit pupils |
US20160349516A1 (en) | 2015-05-28 | 2016-12-01 | Thalmic Labs Inc. | Systems, devices, and methods that integrate eye tracking and scanning laser projection in wearable heads-up displays |
US20170068095A1 (en) | 2015-09-04 | 2017-03-09 | Thalmic Labs Inc. | Systems, articles, and methods for integrating holographic optical elements with eyeglass lenses |
US20170097753A1 (en) | 2015-10-01 | 2017-04-06 | Thalmic Labs Inc. | Systems, devices, and methods for interacting with content displayed on head-mounted displays |
US20170115483A1 (en) | 2015-10-23 | 2017-04-27 | Thalmic Labs Inc. | Systems, devices, and methods for laser eye tracking |
US20170153701A1 (en) | 2015-12-01 | 2017-06-01 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays as wireless controllers |
US20170184847A1 (en) * | 2015-12-28 | 2017-06-29 | Oculus Vr, Llc | Determining interpupillary distance and eye relief of a user wearing a head-mounted display |
US20170205876A1 (en) | 2016-01-20 | 2017-07-20 | Thalmic Labs Inc. | Systems, devices, and methods for proximity-based eye tracking |
US20170212290A1 (en) | 2015-12-17 | 2017-07-27 | Thalmic Labs Inc. | Systems, devices, and methods for curved holographic optical elements |
US20170219829A1 (en) | 2016-01-29 | 2017-08-03 | Thalmic Labs Inc. | Systems, devices, and methods for preventing eyebox degradation in a wearable heads-up display |
US20170243061A1 (en) * | 2016-02-22 | 2017-08-24 | Fujitsu Limited | Detection system and detection method |
US20170299956A1 (en) | 2016-04-13 | 2017-10-19 | Thalmic Labs Inc. | Systems, devices, and methods for focusing laser projectors |
US20180007255A1 (en) | 2016-06-30 | 2018-01-04 | Thalmic Labs Inc. | Image capture systems, devices, and methods that autofocus based on eye-tracking |
US9916005B2 (en) * | 2012-02-06 | 2018-03-13 | Sony Corporation | Gaze tracking with projector |
US9936163B1 (en) * | 2016-10-05 | 2018-04-03 | Avaya Inc. | System and method for mirror utilization in meeting rooms |
US20180373024A1 (en) * | 2015-12-22 | 2018-12-27 | Qd Laser, Inc. | Image projection device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU1935397A (en) * | 1996-03-15 | 1997-10-10 | Retinal Display Cayman Limited | Method of and apparatus for viewing an image |
US6281862B1 (en) * | 1998-11-09 | 2001-08-28 | University Of Washington | Scanned beam display with adjustable accommodation |
US9699433B2 (en) * | 2013-01-24 | 2017-07-04 | Yuchen Zhou | Method and apparatus to produce re-focusable vision with detecting re-focusing event from human eye |
JP6120611B2 (en) * | 2013-02-27 | 2017-04-26 | 日立マクセル株式会社 | Beam scanning display device |
US9547365B2 (en) * | 2014-09-15 | 2017-01-17 | Google Inc. | Managing information display |
US20180003961A1 (en) * | 2016-07-01 | 2018-01-04 | Intel Corporation | Gaze detection in head worn display |
-
2017
- 2017-11-30 WO PCT/CA2017/051440 patent/WO2018098579A1/en active Application Filing
- 2017-11-30 CA CA3045192A patent/CA3045192A1/en not_active Abandoned
- 2017-11-30 US US15/827,667 patent/US10409057B2/en not_active Expired - Fee Related
- 2017-11-30 US US15/827,675 patent/US10459220B2/en not_active Expired - Fee Related
Patent Citations (154)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3408133A (en) | 1964-01-23 | 1968-10-29 | Electro Optical Systems Inc | Kerr-cell camera shutter |
US3712716A (en) | 1971-04-09 | 1973-01-23 | Stanford Research Inst | Eye tracker |
JPS61198892A (en) | 1985-02-27 | 1986-09-03 | Nec Corp | Display device |
US4978213B1 (en) | 1987-08-26 | 1997-03-11 | Alcon Lab Inc | Apparatus for determining the contour of the cornea of a human eye |
US4978213A (en) | 1987-08-26 | 1990-12-18 | El Hage Sami G | Apparatus for determining the contour of the cornea of a human eye |
US5231674A (en) * | 1989-06-09 | 1993-07-27 | Lc Technologies, Inc. | Eye tracking method and apparatus |
US5103323A (en) | 1990-04-18 | 1992-04-07 | Holographic Optics, Inc. | Multi-layer holographic notch filter |
US5589956A (en) * | 1992-07-31 | 1996-12-31 | Canon Kabushiki Kaisha | Image display apparatus |
US5596339A (en) | 1992-10-22 | 1997-01-21 | University Of Washington | Virtual retinal display with fiber optic point source |
US6008781A (en) | 1992-10-22 | 1999-12-28 | Board Of Regents Of The University Of Washington | Virtual retinal display |
US5467104A (en) | 1992-10-22 | 1995-11-14 | Board Of Regents Of The University Of Washington | Virtual retinal display |
US6639570B2 (en) | 1992-10-22 | 2003-10-28 | University Of Washington | Retinal display scanning of image with plurality of image sectors |
US6317103B1 (en) | 1992-10-22 | 2001-11-13 | University Of Washington | Virtual retinal display and method for tracking eye position |
US6377277B1 (en) | 1995-08-10 | 2002-04-23 | Sega Enterprises, Ltd. | Virtual image generation apparatus and method |
US5742421A (en) | 1996-03-01 | 1998-04-21 | Reflection Technology, Inc. | Split lens video display system |
US6236476B1 (en) | 1997-04-11 | 2001-05-22 | Korea Institute Of Science And Technology | Apparatus for making a high quality reflection type holographic optical element |
JPH10319240A (en) | 1997-05-22 | 1998-12-04 | Fuji Xerox Co Ltd | Head-mounted display |
US6027216A (en) | 1997-10-21 | 2000-02-22 | The Johns Hopkins University School Of Medicine | Eye fixation monitor and tracker
US6204829B1 (en) | 1998-02-20 | 2001-03-20 | University Of Washington | Scanned retinal display with exit pupil selected based on viewer's eye position |
US6184847B1 (en) | 1998-09-22 | 2001-02-06 | Vega Vista, Inc. | Intuitive control of portable data displays |
US7640007B2 (en) | 1999-02-12 | 2009-12-29 | Fisher-Rosemount Systems, Inc. | Wireless handheld communicator in a process control environment |
US6972734B1 (en) | 1999-06-11 | 2005-12-06 | Canon Kabushiki Kaisha | Mixed reality apparatus and mixed reality presentation method |
US6353503B1 (en) | 1999-06-21 | 2002-03-05 | The Microoptical Corporation | Eyeglass display lens system employing off-axis optical design
US7473888B2 (en) | 1999-08-05 | 2009-01-06 | Microvision, Inc. | Display with compensated light source drive |
US8188937B1 (en) | 1999-09-06 | 2012-05-29 | Shimadzu Corporation | Body mounting type display system |
US20010033402A1 (en) | 2000-02-10 | 2001-10-25 | Popovich Milan M. | Switchable hologram and method of producing the same |
US20020003627A1 (en) | 2000-03-13 | 2002-01-10 | Rieder Ronald J. | Doubly-differential interferometer and method for evanescent wave surface detection |
US20020007118A1 (en) | 2000-03-15 | 2002-01-17 | Hideo Adachi | Ultrasonic wave transducer system and ultrasonic wave transducer |
US20020030636A1 (en) | 2000-06-26 | 2002-03-14 | Richards Angus Duncan | Virtual reality display device |
US20020093701A1 (en) | 2000-12-29 | 2002-07-18 | Xiaoxiao Zhang | Holographic multifocal lens |
US20020120916A1 (en) | 2001-01-16 | 2002-08-29 | Snider Albert Monroe | Head-up display system utilizing fluorescent material |
KR20040006609A (en) | 2002-07-13 | 2004-01-24 | 삼성아이텍 주식회사 | A method and tool for manufacturing polarized light lens |
US20060238707A1 (en) | 2002-11-21 | 2006-10-26 | John Elvesjo | Method and installation for detecting and following an eye and the gaze direction thereof |
US20040174287A1 (en) | 2002-11-21 | 2004-09-09 | Deak David G. | Self-contained switch |
US20050012715A1 (en) | 2003-06-12 | 2005-01-20 | Peter Charles Shann Hubird Ford | Method, system, and software for interactive communication and analysis |
US20070078308A1 (en) | 2003-10-24 | 2007-04-05 | Lein Applied Diagnostics Limited | Ocular property measuring apparatus and method therefor |
US20060110008A1 (en) * | 2003-11-14 | 2006-05-25 | Roel Vertegaal | Method and apparatus for calibration-free eye tracking |
US7747113B2 (en) | 2004-03-29 | 2010-06-29 | Sony Corporation | Optical device and virtual image display device |
US7637615B2 (en) * | 2004-08-19 | 2009-12-29 | Brother Kogyo Kabushiki Kaisha | Device for tracking pupil of eyeball using intensity changes of reflected light from eyeball and image display using the same |
US20090207464A1 (en) | 2004-11-24 | 2009-08-20 | John David Wiltshire | Holograms and Hologram Fabrication Methods and Apparatus |
US7684105B2 (en) | 2005-02-24 | 2010-03-23 | National Research Council Of Canada | Microblinds and a method of fabrication thereof |
US7773111B2 (en) | 2005-03-16 | 2010-08-10 | Lc Technologies, Inc. | System and method for perceived image processing in a gaze tracking system |
US20070132785A1 (en) | 2005-03-29 | 2007-06-14 | Ebersole John F Jr | Platform for immersive gaming |
US20110109880A1 (en) * | 2006-01-26 | 2011-05-12 | Ville Nummela | Eye Tracker Device |
US8120828B2 (en) | 2006-05-12 | 2012-02-21 | Seereal Technologies S.A. | Reflective optical system, tracking system and holographic projection system and method |
US20100150415A1 (en) | 2006-11-09 | 2010-06-17 | Optos Plc | Retinal scanning |
US20100239776A1 (en) | 2007-07-25 | 2010-09-23 | Hoya Corporation | Method for producing plastic lens |
US7925100B2 (en) | 2007-07-31 | 2011-04-12 | Microsoft Corporation | Tiled packaging of vector image data |
US20100060551A1 (en) * | 2007-09-26 | 2010-03-11 | Keiji Sugiyama | Beam scanning-type display device, method, program and integrated circuit |
US7927522B2 (en) | 2007-10-25 | 2011-04-19 | Wen-Yi Hsu | Polarized lens and method of making polarized lens |
US20090109241A1 (en) | 2007-10-26 | 2009-04-30 | Canon Kabushiki Kaisha | Image display system, image display apparatus, and control method thereof |
US8355671B2 (en) | 2008-01-04 | 2013-01-15 | Kopin Corporation | Method and apparatus for transporting video signal over Bluetooth wireless interface |
US20090179824A1 (en) | 2008-01-10 | 2009-07-16 | Canon Kabushiki Kaisha | Image processing method, image processing apparatus, and system |
US20090258669A1 (en) | 2008-04-15 | 2009-10-15 | Hong Nie | Impulse ultra-wideband radio communication system |
US20090322653A1 (en) | 2008-06-25 | 2009-12-31 | Samsung Electronics Co., Ltd. | Compact virtual display |
US20120188158A1 (en) | 2008-06-26 | 2012-07-26 | Microsoft Corporation | Wearable electromyography-based human-computer interface |
US20100053555A1 (en) | 2008-08-27 | 2010-03-04 | Locarna Systems, Inc. | Method and apparatus for tracking eye movement |
US7850306B2 (en) | 2008-08-28 | 2010-12-14 | Nokia Corporation | Visual cognition aware display and visual data transmission architecture |
US8922898B2 (en) | 2008-09-04 | 2014-12-30 | Innovega Inc. | Molded lens with nanofilaments and related methods |
US20100149073A1 (en) | 2008-11-02 | 2010-06-17 | David Chaum | Near to Eye Display System and Appliance |
US20100142015A1 (en) | 2008-12-09 | 2010-06-10 | Sony Corporation | Hologram recording film and method of manufacturing same, and image display apparatus |
US20120002256A1 (en) | 2009-02-16 | 2012-01-05 | Lilian Lacoste | Laser Based Image Display System |
US20120105486A1 (en) * | 2009-04-09 | 2012-05-03 | Dynavox Systems Llc | Calibration free, motion tolerent eye-gaze direction detector with contextually aware computer interaction and communication methods |
US20120139817A1 (en) | 2009-08-13 | 2012-06-07 | Bae Systems Plc | Head up display system |
US20120249797A1 (en) | 2010-02-28 | 2012-10-04 | Osterhout Group, Inc. | Head-worn adaptive display |
JP2013127489A (en) | 2010-03-29 | 2013-06-27 | Panasonic Corp | See-through display |
US20120169752A1 (en) | 2010-04-28 | 2012-07-05 | Akira Kurozuka | Scanning type image display apparatus |
US8634119B2 (en) | 2010-07-09 | 2014-01-21 | Tipd, Llc | System for holography |
US9135708B2 (en) | 2010-08-09 | 2015-09-15 | National University Corporation Shizuoka University | Gaze point detection method and gaze point detection device |
US20130222384A1 (en) | 2010-11-08 | 2013-08-29 | Seereal Technologies S.A. | Display device, in particular a head-mounted display, based on temporal and spatial multiplexing of hologram tiles |
US20120182309A1 (en) | 2011-01-14 | 2012-07-19 | Research In Motion Limited | Device and method of conveying emotion in a messaging application |
US8666212B1 (en) | 2011-04-28 | 2014-03-04 | Google Inc. | Head mounted display using a fused fiber bundle |
US20130215235A1 (en) | 2011-04-29 | 2013-08-22 | Austin Russell | Three-dimensional imager and projection device |
US20120290401A1 (en) | 2011-05-11 | 2012-11-15 | Google Inc. | Gaze tracking system |
US20120302289A1 (en) | 2011-05-27 | 2012-11-29 | Kang Heejoon | Mobile terminal and method of controlling operation thereof |
US20130198694A1 (en) | 2011-06-10 | 2013-08-01 | Aliphcom | Determinative processes for wearable devices |
US20130009853A1 (en) | 2011-07-05 | 2013-01-10 | The Board Of Trustees Of The Leland Stanford Junior University | Eye-glasses mounted display |
US20130016413A1 (en) | 2011-07-12 | 2013-01-17 | Google Inc. | Whole image scanning mirror display system |
US8179604B1 (en) | 2011-07-13 | 2012-05-15 | Google Inc. | Wearable marker for passive interaction |
US20130016292A1 (en) | 2011-07-15 | 2013-01-17 | Google Inc. | Eyepiece for near-to-eye display with multi-reflectors |
US20130051631A1 (en) * | 2011-08-22 | 2013-02-28 | Eyelock Inc. | Systems and methods for capturing artifact free images |
US20140204455A1 (en) | 2011-08-24 | 2014-07-24 | Milan Momcilo Popovich | Wearable data display |
US20140202643A1 (en) | 2011-08-31 | 2014-07-24 | Koninklijke Philips N.V. | Light control panel |
US20130076800A1 (en) * | 2011-09-28 | 2013-03-28 | Michio HATAGI | Scanning projection apparatus and scanning image display |
US20130088413A1 (en) | 2011-10-05 | 2013-04-11 | Google Inc. | Method to Autofocus on Near-Eye Display |
US20140226193A1 (en) | 2011-10-25 | 2014-08-14 | National Central University | Optical head-mounted display with mechanical one-dimensional scanner |
US8704882B2 (en) | 2011-11-18 | 2014-04-22 | L-3 Communications Corporation | Simulated head mounted display system and method |
US20130135722A1 (en) | 2011-11-29 | 2013-05-30 | Seiko Epson Corporation | Polarization separation device and display apparatus |
US20130300652A1 (en) | 2011-11-30 | 2013-11-14 | Google, Inc. | Unlocking a Screen Using Eye Tracking Information |
US20130165813A1 (en) | 2011-12-23 | 2013-06-27 | Industrial Technology Research Institute | Sensor for acquiring muscle parameters |
US20130169560A1 (en) | 2012-01-04 | 2013-07-04 | Tobii Technology Ab | System for gaze interaction |
JP2013160905A (en) | 2012-02-03 | 2013-08-19 | Denso Corp | Vehicle-mounted head-up display |
US9916005B2 (en) * | 2012-02-06 | 2018-03-13 | Sony Corporation | Gaze tracking with projector |
US8970571B1 (en) | 2012-03-13 | 2015-03-03 | Google Inc. | Apparatus and method for display lighting adjustment |
US20140368896A1 (en) | 2012-03-15 | 2014-12-18 | Panasonic Corporation | Optical reflecting element and actuator |
US8922481B1 (en) | 2012-03-16 | 2014-12-30 | Google Inc. | Content annotation |
US8971023B2 (en) | 2012-03-21 | 2015-03-03 | Google Inc. | Wearable computing device frame |
US20130265437A1 (en) | 2012-04-09 | 2013-10-10 | Sony Mobile Communications Ab | Content transfer via skin input |
US20130285901A1 (en) | 2012-04-27 | 2013-10-31 | Dongguk University Industry-Academic Cooperation Foundation | System and method for tracking gaze at distance |
US20130332196A1 (en) | 2012-06-07 | 2013-12-12 | The Government Of The United States As Represented By The Secretary Of The Army | Diabetes Monitoring Using Smart Device |
US20130335302A1 (en) | 2012-06-18 | 2013-12-19 | Randall T. Crane | Selective illumination |
US20140045547A1 (en) | 2012-08-10 | 2014-02-13 | Silverplus, Inc. | Wearable Communication Device and User Interface |
US20140125760A1 (en) | 2012-11-05 | 2014-05-08 | Honeywell International Inc. | Visual system having multiple cameras |
US8560976B1 (en) | 2012-11-14 | 2013-10-15 | Lg Electronics Inc. | Display device and controlling method thereof |
US20140198035A1 (en) | 2013-01-14 | 2014-07-17 | Thalmic Labs Inc. | Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display |
US20140198034A1 (en) | 2013-01-14 | 2014-07-17 | Thalmic Labs Inc. | Muscle interface device and method for interacting with content displayed on wearable head mounted displays |
US20140204465A1 (en) * | 2013-01-22 | 2014-07-24 | Denso Corporation | Head-up display device |
US20150362734A1 (en) | 2013-01-28 | 2015-12-17 | Ecole Polytechnique Federale De Lausanne (Epfl) | Transflective holographic film for head worn display |
US20140232651A1 (en) | 2013-02-15 | 2014-08-21 | Google Inc. | Cascading optics in optical combiners of head mounted displays |
US20140285429A1 (en) | 2013-03-15 | 2014-09-25 | John Castle Simmons | Light Management for Image and Data Control |
WO2014155288A2 (en) | 2013-03-25 | 2014-10-02 | Ecole Polytechnique Federale De Lausanne (Epfl) | Method and apparatus for head worn display with multiple exit pupils |
US20160033771A1 (en) | 2013-03-25 | 2016-02-04 | Ecole Polytechnique Federale De Lausanne | Method and apparatus for head worn display with multiple exit pupils |
US9086687B2 (en) | 2013-05-07 | 2015-07-21 | Lg Electronics Inc. | Smart watch and method for controlling the same |
US20160085084A1 (en) * | 2013-05-10 | 2016-03-24 | Intel Corporation | Projection device |
US20150036221A1 (en) | 2013-08-04 | 2015-02-05 | Robert S. Stephenson | Wide-field head-up display (HUD) eyeglasses |
US20160202081A1 (en) | 2013-09-04 | 2016-07-14 | Essilor International (Compagnie Générale d'Optique) | Navigation method based on a see-through head-mounted device
US20150205126A1 (en) | 2013-11-27 | 2015-07-23 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
US20150156716A1 (en) | 2013-12-03 | 2015-06-04 | Google Inc. | On-head detection for head-mounted display |
US20150205134A1 (en) | 2014-01-17 | 2015-07-23 | Thalmic Labs Inc. | Systems, articles, and methods for wearable heads-up displays |
WO2015123775A1 (en) | 2014-02-18 | 2015-08-27 | Sulon Technologies Inc. | Systems and methods for incorporating a real image stream in a virtual image stream |
US20150268821A1 (en) | 2014-03-20 | 2015-09-24 | Scott Ramsby | Selection using eye gaze evaluation over time |
US20150325202A1 (en) | 2014-05-07 | 2015-11-12 | Thalmic Labs Inc. | Systems, devices, and methods for wearable computers with heads-up displays |
US20150369920A1 (en) * | 2014-06-20 | 2015-12-24 | Funai Electric Co., Ltd. | Electronic apparatus and method for measuring direction of output laser light |
US20150378162A1 (en) | 2014-06-25 | 2015-12-31 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays |
US20170343796A1 (en) | 2014-06-25 | 2017-11-30 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays |
US20170343797A1 (en) | 2014-06-25 | 2017-11-30 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays |
US9477079B2 (en) | 2014-06-25 | 2016-10-25 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays |
US9766449B2 (en) | 2014-06-25 | 2017-09-19 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays |
US20160166146A1 (en) * | 2014-12-11 | 2016-06-16 | Icspi Corp. | Eye-Tracking System and Method Therefor |
US20160377866A1 (en) | 2015-02-17 | 2016-12-29 | Thalmic Labs Inc. | Systems, devices, and methods for eyebox expansion in wearable heads-up displays |
US20160238845A1 (en) | 2015-02-17 | 2016-08-18 | Thalmic Labs Inc. | Systems, devices, and methods for eyebox expansion in wearable heads-up displays |
US20160377865A1 (en) | 2015-02-17 | 2016-12-29 | Thalmic Labs Inc. | Systems, devices, and methods for eyebox expansion in wearable heads-up displays |
US20160274365A1 (en) | 2015-03-17 | 2016-09-22 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays with heterogeneous display quality |
US20160274758A1 (en) | 2015-03-20 | 2016-09-22 | Thalmic Labs Inc. | Systems, devices, and methods for mitigating false positives in human-electronics interfaces |
US20170212349A1 (en) | 2015-05-04 | 2017-07-27 | Thalmic Labs Inc. | Systems, devices, and methods for spatially-multiplexed holographic optical elements |
US20160327797A1 (en) | 2015-05-04 | 2016-11-10 | Thalmic Labs Inc. | Systems, devices, and methods for angle- and wavelength-multiplexed holographic optical elements |
US20160327796A1 (en) | 2015-05-04 | 2016-11-10 | Thalmic Labs Inc. | Systems, devices, and methods for eyeboxes with heterogeneous exit pupils |
US20160349516A1 (en) | 2015-05-28 | 2016-12-01 | Thalmic Labs Inc. | Systems, devices, and methods that integrate eye tracking and scanning laser projection in wearable heads-up displays |
US20160349514A1 (en) | 2015-05-28 | 2016-12-01 | Thalmic Labs Inc. | Systems, devices, and methods that integrate eye tracking and scanning laser projection in wearable heads-up displays |
US20160349515A1 (en) | 2015-05-28 | 2016-12-01 | Thalmic Labs Inc. | Systems, devices, and methods that integrate eye tracking and scanning laser projection in wearable heads-up displays |
US20170068095A1 (en) | 2015-09-04 | 2017-03-09 | Thalmic Labs Inc. | Systems, articles, and methods for integrating holographic optical elements with eyeglass lenses |
US20170097753A1 (en) | 2015-10-01 | 2017-04-06 | Thalmic Labs Inc. | Systems, devices, and methods for interacting with content displayed on head-mounted displays |
US20170115483A1 (en) | 2015-10-23 | 2017-04-27 | Thalmic Labs Inc. | Systems, devices, and methods for laser eye tracking |
US9904051B2 (en) * | 2015-10-23 | 2018-02-27 | Thalmic Labs Inc. | Systems, devices, and methods for laser eye tracking |
US20170153701A1 (en) | 2015-12-01 | 2017-06-01 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays as wireless controllers |
US20170212290A1 (en) | 2015-12-17 | 2017-07-27 | Thalmic Labs Inc. | Systems, devices, and methods for curved holographic optical elements |
US20180373024A1 (en) * | 2015-12-22 | 2018-12-27 | Qd Laser, Inc. | Image projection device |
US20170184847A1 (en) * | 2015-12-28 | 2017-06-29 | Oculus Vr, Llc | Determining interpupillary distance and eye relief of a user wearing a head-mounted display |
US20170205876A1 (en) | 2016-01-20 | 2017-07-20 | Thalmic Labs Inc. | Systems, devices, and methods for proximity-based eye tracking |
US20170219829A1 (en) | 2016-01-29 | 2017-08-03 | Thalmic Labs Inc. | Systems, devices, and methods for preventing eyebox degradation in a wearable heads-up display |
US20170243061A1 (en) * | 2016-02-22 | 2017-08-24 | Fujitsu Limited | Detection system and detection method |
US20170299956A1 (en) | 2016-04-13 | 2017-10-19 | Thalmic Labs Inc. | Systems, devices, and methods for focusing laser projectors |
US20180007255A1 (en) | 2016-06-30 | 2018-01-04 | Thalmic Labs Inc. | Image capture systems, devices, and methods that autofocus based on eye-tracking |
US9936163B1 (en) * | 2016-10-05 | 2018-04-03 | Avaya Inc. | System and method for mirror utilization in meeting rooms |
Non-Patent Citations (38)
Title |
---|
Amitai, "P-27: A Two-dimensional Aperture Expander for Ultra-Compact, High-Performance Head-Worn Displays," SID Symposium Digest of Technical Papers, vol. 36, No. 1 (2005), pp. 360-363. |
Ayras et al., "Exit pupil expander with a large field of view based on diffractive optics," Journal of the SID, vol. 17, No. 8 (2009), pp. 659-664. |
Chellappan et al., "Laser-based display: a review," Applied Optics, vol. 49, No. 25 (2010), pp. 79-98. |
Cui et al., "Diffraction from angular multiplexing slanted volume hologram gratings," Optik, vol. 116 (2005), pp. 118-122. |
Curatu et al., "Dual Purpose Lens for an Eye-tracked Projection Head-Mounted Display," International Optical Design Conference 2006, SPIE-OSA, vol. 6342 (2007), pp. 63420X-1-63420X-7. |
Curatu et al., "Projection-based head-mounted display with eye-tracking capabilities," Proc. of SPIE, vol. 5875 (2005), pp. 58750J-1-58750J-9. |
Essex, "Tutorial on Optomechanical Beam Steering Mechanisms," College of Optical Sciences, University of Arizona, 2006, 8 pages. |
Fernandez et al., "Optimization of a thick polyvinyl alcohol-acrylamide photopolymer for data storage using a combination of angular and peristrophic holographic multiplexing," Applied Optics, vol. 45, No. 29 (2006), pp. 7661-7666. |
Hainich et al., "Chapter 10: Near-Eye Displays," in: Displays-Fundamentals & Applications, 2011, pp. 439-503. |
Hornstein et al., "Maradin's Micro-Mirror-System Level Synchronization Notes," SID 2012 Digest (2012), pp. 981-984. |
International Search Report and Written Opinion, dated Apr. 25, 2017, for International Application No. PCT/US2016/067246, 10 pages. |
International Search Report and Written Opinion, dated Dec. 8, 2016, for International Application No. PCT/US2016/050225, 15 pages. |
International Search Report and Written Opinion, dated Jan. 18, 2017, for International Application No. PCT/US2016/054852, 12 pages. |
International Search Report and Written Opinion, dated Jun. 8, 2016, for International Application No. PCT/US2016/018293, 17 pages. |
International Search Report and Written Opinion, dated Jun. 8, 2016, for International Application No. PCT/US2016/018298, 14 pages. |
International Search Report and Written Opinion, dated Jun. 8, 2016, for International Application No. PCT/US2016/018299, 12 pages. |
International Search Report and Written Opinion, dated Oct. 13, 2017, for International Application No. PCT/US2017/040323, 16 pages. |
International Search Report and Written Opinion, dated Sep. 28, 2017, for International Application No. PCT/US2017/027479, 13 pages. |
Itoh et al., "Interaction-free calibration for optical see-through head-mounted displays based on 3D eye localization," 2014 IEEE Symposium on 3D User Interfaces (3DUI), (2014), pp. 75-82. |
Janssen, "Radio Frequency (RF)" 2013, retrieved from https://web.archive.org/web/20130726153946/https://www.techopedia.com/definition/5083/radio-frequency-rf, retrieved on Jul. 12, 2017, 2 pages. |
Kessler, "Optics of Near to Eye Displays (NEDs)," Oasis 2013, Tel Aviv, Israel, Feb. 19, 2013, 37 pages. |
Kress et al., "A review of head-mounted displays (HMD) technologies and applications for consumer electronics," Proc. of SPIE, vol. 8720 (2013), pp. 87200A-1-87200A-13. |
Kress et al., "Diffractive and Holographic Optics as Optical Combiners in Head Mounted Displays," Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Zurich, Switzerland, Sep. 8-12, 2013, pp. 1479-1482. |
Kress, "Optical architectures for see-through wearable displays," Bay Area-SID Seminar, Bay Area, Apr. 30, 2014, 156 pages. |
Levola, "7.1: Invited Paper: Novel Diffractive Optical Components for Near to Eye Displays," SID Symposium Digest of Technical Papers, vol. 37, No. 1 (2006), pp. 64-67. |
Liao et al., "The Evolution of MEMS Displays," IEEE Transactions on Industrial Electronics, vol. 56, No. 4 (2009), pp. 1057-1065. |
Lippert, "Chapter 6: Display Devices: RSD (Retinal Scanning Display)," in: The Avionics Handbook, 2001, 8 pages. |
Majaranta et al., "Chapter 3: Eye-Tracking and Eye-Based Human-Computer Interaction," in Advances in Physiological Computing, 2014, pp. 39-65. |
Merriam-Webster, "Radio Frequencies" retrieved from https://www.merriam-webster.com/table/collegiate/radiofre.htm, retrieved on Jul. 12, 2017, 2 pages. |
Schowengerdt et al., "Stereoscopic retinal scanning laser display with integrated focus cues for ocular accommodation," Proc. of SPIE-IS&T Electronic Imaging, vol. 5291 (2004), pp. 366-376. |
Silverman et al., "58.5L: Late-News Paper: Engineering a Retinal Scanning Laser Display with Integrated Accommodative Depth Cues," SID 03 Digest, (2003), pp. 1538-1541. |
Takatsuka et al., "Retinal projection display using diffractive optical element," Tenth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IEEE, (2014), pp. 403-406. |
Urey et al., "Optical performance requirements for MEMS-scanner based microdisplays," Conf on MOEMS and Miniaturized Systems, SPIE, vol. 4178 (2000), pp. 176-185. |
Urey, "Diffractive exit-pupil expander for display applications," Applied Optics, vol. 40, No. 32 (2001), pp. 5840-5851. |
Viirre et al., "The Virtual Retinal Display: A New Technology for Virtual Reality and Augmented Vision in Medicine," Proc. of Medicine Meets Virtual Reality (1998), pp. 252-257. |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190196195A1 (en) * | 2017-12-11 | 2019-06-27 | North Inc. | Wavelength combiner photonic integrated circuit with grating coupling of lasers |
US10942359B2 (en) * | 2017-12-11 | 2021-03-09 | Google Llc | Wavelength combiner photonic integrated circuit with grating coupling of lasers |
US11314085B2 (en) * | 2019-02-14 | 2022-04-26 | Thales | Viewing device comprising a pupil expander including two mirrors |
US10877268B2 (en) * | 2019-04-16 | 2020-12-29 | Facebook Technologies, Llc | Active control of in-field light sources of a head mounted display |
US10948729B2 (en) | 2019-04-16 | 2021-03-16 | Facebook Technologies, Llc | Keep-out zone for in-field light sources of a head mounted display |
US11428930B2 (en) * | 2019-08-07 | 2022-08-30 | Meta Platforms Technologies, Llc | Stray light suppression in eye-tracking imaging |
US12270992B2 (en) | 2019-08-07 | 2025-04-08 | Meta Platforms Technologies, Llc | Beam shaping optic for light sources |
US12231613B2 (en) | 2019-11-06 | 2025-02-18 | Hes Ip Holdings, Llc | System and method for displaying an object with depths |
Also Published As
Publication number | Publication date |
---|---|
US20180149863A1 (en) | 2018-05-31 |
US20180149874A1 (en) | 2018-05-31 |
CA3045192A1 (en) | 2018-06-07 |
WO2018098579A1 (en) | 2018-06-07 |
US10459220B2 (en) | 2019-10-29 |
Similar Documents
Publication | Title |
---|---|
US10459220B2 (en) | Systems, devices, and methods for laser eye tracking in wearable heads-up displays |
US10429655B2 (en) | Systems, devices, and methods that integrate eye tracking and scanning laser projection in wearable heads-up displays |
US10606072B2 (en) | Systems, devices, and methods for laser eye tracking |
US11619992B2 (en) | Method and system for eye tracking with glint space recalibration on wearable heads-up display |
US11042031B2 (en) | Eye tracking system and method, eyeglass lens, and wearable heads-up display |
US10852817B1 (en) | Eye tracking combiner having multiple perspectives |
US11157077B2 (en) | Method and system for dual mode eye tracking on wearable heads-up display |
US11093034B2 (en) | Eye tracking method and system and integration of the same with wearable heads-up displays |
US11205069B1 (en) | Hybrid cornea and pupil tracking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| FEPP | Fee payment procedure | ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| AS | Assignment | Owner name: THALMIC LABS INC., CANADA. ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: ALEEM, IDRIS S.; BHARGAVA, MAYANK; REEL/FRAME: 048626/0565; effective date: 20180411. Owner name: NORTH INC., CANADA. CHANGE OF NAME; ASSIGNOR: THALMIC LABS INC.; REEL/FRAME: 048628/0408; effective date: 20180830 |
| STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | PATENTED CASE |
| AS | Assignment | Owner name: GOOGLE LLC, CALIFORNIA. ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: NORTH INC.; REEL/FRAME: 054113/0814; effective date: 20200916 |
| FEPP | Fee payment procedure | MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| LAPS | Lapse for failure to pay maintenance fees | PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| STCH | Information on status: patent discontinuation | PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
2023-09-10 | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20230910 |