US12201265B2 - Location pad surrounding at least part of patient eye for tracking position of a medical instrument - Google Patents
- Publication number
- US12201265B2 (application No. US17/221,921)
- Authority
- US
- United States
- Prior art keywords
- eye
- patient
- roi
- image
- medical instrument
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00147—Holding or positioning arrangements
- A61B1/00158—Holding or positioning arrangements using magnetic field
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0041—Operational features thereof characterised by display arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; Determining position of diagnostic devices within or on the body of the patient
- A61B5/061—Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
- A61B5/062—Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body using magnetic field
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6814—Head
- A61B5/6821—Eye
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6846—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive
- A61B5/6847—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive mounted on an invasive device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/742—Details of notification to user or communication with user or patient; User input means using visual displays
- A61B5/743—Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting in contact-lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/372—Details of monitor hardware
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/3762—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/397—Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave
- A61B2090/3975—Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active
- A61B2090/3979—Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active infrared
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/05—Surgical care
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
- A61B5/0013—Medical image data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/20—Surgical microscopes characterised by non-optical aspects
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting in contact-lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/00736—Instruments for removal of intra-ocular material or intra-ocular injection, e.g. cataract instruments
Definitions
- a surgeon navigates a medical instrument to a target location within a patient eye.
- patient tissue may obstruct at least part of the medical instrument.
- Various techniques have been developed for tracking and visualizing medical instruments during minimally invasive procedures.
- U.S. Patent Publication No. 2018/0245461 describes a sensor, employed to sense a distance to the surface of a subject to be examined, so that a range image may be acquired. Intensity information may be acquired alongside the distance information. The distance information and intensity information may be evaluated to track the pose of the sensor means relative to the surface of the subject to be examined, so that anatomical data related to said subject may be displayed as seen from the position and/or orientation of the sensor means or display means.
- U.S. Patent Publication No. 2004/0199072 describes a patient positioning device used for positioning a patient during a navigated medical procedure.
- the positioning device includes a contoured patient support and a portion of a navigation system.
- the contoured patient support positions the patient in a desired manner.
- the portion of the navigation system is integrated within the patient support, such that the navigated medical procedure may be performed in a substantially unobstructed manner.
- U.S. Patent Publication No. 2006/0281971 describes a method and apparatus for presenting three-dimensional data to a surgeon, which is provided to facilitate the flexible navigation of an endoscope and surgical instruments with respect to anatomical structures.
- a first set of data corresponding to a three-dimensional model of a patient's anatomy is received.
- This three-dimensional model may be rendered from images taken in CT or MRI scanning.
- this model is then combined with a second set of data corresponding to a view obtained from an endoscope.
- the processor is configured to receive at least an image of the organ, and the system includes a display, which is configured, based on the position signal, to visualize the medical instrument overlaid on the image.
- the image includes an optical image and an anatomical image
- the processor is configured to visualize the medical instrument overlaid on at least one of the optical image and the anatomical image.
- the display includes an augmented reality display
- the processor is configured to simultaneously display, on the display, the optical image on a first section of the display, and the anatomical image on a second section of the display.
- a method for manufacturing a location pad includes receiving a flexible frame to be attached to tissue that at least partially surrounds an organ of a patient. Two or more field-generators are fixed to the flexible frame at respective positions surrounding a region-of-interest (ROI) of the organ, for generating respective magnetic fields at least in the ROI.
- FIG. 1 is a schematic pictorial illustration of an ophthalmic surgical system, in accordance with an embodiment of the present invention
- FIG. 2 is a schematic pictorial illustration of a location pad used for tracking a medical instrument treating a patient eye, in accordance with an embodiment of the present invention
- FIG. 3 is a flow chart that schematically illustrates a method for producing a location pad, in accordance with an embodiment of the present invention.
- Accurate position tracking and visualization of a medical instrument are particularly important in surgical procedures carried out in small organs, such as in a patient eye.
- Embodiments of the present invention that are described hereinbelow provide improved techniques for tracking and visualizing a medical instrument, which is at least partially obstructed or hidden from view to a surgeon during an ophthalmic surgical procedure.
- an ophthalmic surgical system comprises a location pad having a frame made from a flexible substrate, such as a flexible printed circuit board (PCB), which is configured to be attached to facial tissue surrounding at least part of a patient eye.
- the location pad comprises multiple field-generators of a position tracking system (PTS), which are coupled to the frame at respective positions surrounding at least a portion of the eye and are configured to generate respective magnetic fields at least in a region-of-interest (ROI) of the patient eye.
- the ophthalmic surgical system comprises a processor, which is configured to receive one or more of (a) a stereoscopic optical image of the patient eye, (b) an anatomical image, such as a computerized tomography image (CTI), of the patient eye, and (c) a position signal of the PTS.
- the processor is further configured to register the optical image and the anatomical image in a coordinate system of the PTS, and to estimate the position of the medical instrument in at least one of the optical image and the CTI.
- the ophthalmic surgical system comprises a display, which is configured to visualize the medical instrument overlaid on at least one of the optical image and the CTI.
- eye tissue or any other blocking element may obstruct or conceal (from the surgeon's view) a portion of the surgical tool, such as the distal end of the surgical tool, at, for example, the ROI.
- the display comprises an augmented reality display, and the processor is configured to display, on the display, the position of the medical instrument unobstructed.
- the processor is configured to simultaneously display the optical image surrounding the ROI, and the CTI on the ROI, so as to visualize the estimated position of the surgical tool in the ROI.
- the location pad comprises tracking elements, fixed at predefined positions on the frame for registering the location pad with the patient eye.
- the tracking elements may comprise infrared light emitting diodes (LEDs), each of which has a different flashing rate.
- the augmented reality display comprises a head mount display (HMD) having an image sensor, which is configured to acquire infrared images of the tracking elements during the procedure. Based on the infrared images, the processor is configured to improve the registration between the ROI and the coordinate system of the PTS, so as to improve the accuracy and visualization of the estimated position of the surgical tool during the ophthalmic procedure.
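The patent states only that each infrared LED has a distinct flashing rate; one plausible way (an assumption on our part, not disclosed in the patent) for a processor to tell the LEDs apart in the HMD's infrared frames is to estimate each detected blob's dominant blink frequency and match it to the nearest known rate:

```python
import numpy as np

def identify_led(brightness, fps, known_rates_hz):
    """Return the known LED flashing rate closest to the blob's
    dominant blink frequency, estimated from per-frame brightness."""
    samples = np.asarray(brightness, dtype=float)
    samples -= samples.mean()                      # remove the DC component
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fps)
    dominant = freqs[np.argmax(spectrum)]
    return min(known_rates_hz, key=lambda r: abs(r - dominant))

# Usage: a blob blinking at ~5 Hz, sampled at 60 fps for 2 seconds.
t = np.arange(120) / 60.0
blob = 0.5 + 0.5 * np.sign(np.sin(2 * np.pi * 5.0 * t))
print(identify_led(blob, fps=60, known_rates_hz=[2.0, 5.0, 9.0]))  # → 5.0
```

The frequency resolution here is fps/len(samples) = 0.5 Hz, so the known rates must be spaced farther apart than that for the match to be unambiguous.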
- the disclosed techniques improve the quality of a medical procedure carried out in an organ, by visualizing a hidden section of a medical instrument operated within a ROI of the organ. Specifically, the disclosed techniques improve the positioning accuracy of a surgical tool in a small organ.
- FIG. 1 is a schematic pictorial illustration of an ophthalmic surgical system 20 , in accordance with an embodiment of the present invention.
- System 20 is configured to carry out various types of ophthalmic procedures, such as but not limited to a cataract surgery.
- system 20 comprises a medical instrument, such as but not limited to a phacoemulsification handpiece or any other suitable type of an ophthalmic surgical tool, referred to herein as a tool 55 , used by a surgeon 24 to carry out the ophthalmic surgical procedure.
- Other surgical tools may comprise an irrigation and aspiration (I/A) handpiece, a diathermy handpiece, a vitrectomy handpiece, and similar instruments.
- surgeon 24 applies tool 55 for treating eye 22. In the present example, surgeon 24 inserts a distal end 88 of tool 55 into a region-of-interest (ROI) 76 of eye 22.
- surgeon 24 inserts tool 55 below iris tissue 99 so as to apply phacoemulsification to a lens 89 of eye 22 .
- tool 55 comprises one or more position sensor(s) 56 of a magnetic position tracking system (PTS) described in detail below.
- At least one position sensor 56 may comprise a triple-axis sensor (TAS) made from three coils or a single-axis sensor (SAS) implemented on a printed circuit board (PCB) or using any other suitable technique.
- the one or more position sensor(s) 56 may be located anywhere on tool 55 , for example, anywhere on a shaft of the tool or a portion of the tool located near the treatment site. In the present example, position sensor 56 is coupled to distal end 88 of tool 55 .
- system 20 comprises a location pad 40 having a frame and a plurality of field-generators shown and described in detail in FIG. 2 below.
- location pad 40 comprises a flexible substrate, which is configured to be attached to facial tissue (e.g., skin) of patient 23 .
- "attached" means that, when head 41 of patient 23 is moved by a given offset, location pad 40 is moved by the same offset. In other words, location pad 40 and head 41 are considered to be a single rigid body.
- system 20 comprises the aforementioned magnetic position tracking system, which is configured to track the position of one or more position sensors, such as position sensor 56 located on tool 55 that is used for treating eye 22 , and/or other position sensors coupled to tools inserted into head 41 , eye 22 , or into any other organ of patient 23 .
- the magnetic position tracking system comprises magnetic field-generators (not shown) fixed at respective positions of the aforementioned frame of location pad 40 , whose details are shown and described in FIG. 2 below.
- position sensor 56 is configured to generate one or more position signals in response to sensing external magnetic fields generated by the field-generators of location pad 40 .
- a processor 34 (described in detail below) of system 20 is configured to estimate, based on the position signals, the position of tool 55 , e.g. distal end 88 , within ROI 76 of eye 22 .
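The patent does not disclose the estimation algorithm itself. As a loose illustration only (our assumption, not the patent's method), a sensor position can be triangulated from an idealized magnetic-field magnitude falloff |B| = k/r³ measured from several generators; real systems fit a calibrated field map and also use field direction, and this toy model needs at least four non-coplanar generators:

```python
import numpy as np

def locate_sensor(gen_positions, field_mags, k=1.0):
    """Toy trilateration: invert an idealized |B| = k / r**3 falloff to
    per-generator distances, then solve the linearized sphere equations."""
    P = np.asarray(gen_positions, dtype=float)
    r = (k / np.asarray(field_mags, dtype=float)) ** (1.0 / 3.0)
    p0, r0 = P[0], r[0]
    # Subtracting the i = 0 sphere equation |x - p_i|^2 = r_i^2 from the
    # others yields linear constraints 2 (p_i - p_0) . x = b_i:
    A = 2.0 * (P[1:] - p0)
    b = r0**2 - r[1:]**2 + np.sum(P[1:]**2, axis=1) - np.sum(p0**2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Usage: four non-coplanar generators and a synthetic sensor position.
gens = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                 [0.0, 0.1, 0.0], [0.0, 0.0, 0.1]])
true_pos = np.array([0.03, 0.05, 0.02])
mags = 1.0 / np.linalg.norm(gens - true_pos, axis=1) ** 3
print(locate_sensor(gens, mags))  # ≈ [0.03, 0.05, 0.02]
```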
- system 20 comprises a console 33 , which comprises a memory 49 , and a driver circuit 42 configured to drive, via a cable 37 , the field-generators with suitable signals so as to generate magnetic fields in a predefined working volume, such as in ROI 76 of eye 22 .
- console 33 comprises processor 34 , typically a general-purpose computer, with suitable front end and interface circuits for receiving the position signals from position sensor 56 coupled to tool 55 .
- processor 34 receives the position signals via a cable 32 ; and may use cable 32 for exchanging any suitable signals with other components of tool 55 .
- Other means of transmitting and receiving signals known in the art are also contemplated, e.g. BLUETOOTH or other wireless connection.
- Console 33 further comprises an input device 39 (which may be, for example, a keyboard, a touch screen graphical user interface, or the like) and a display 36.
- system 20 comprises an ophthalmic surgical microscope 11 , such as ZEISS OPMI LUMERA series or ZEISS ARTEVO series supplied by Carl Zeiss Meditec AG (Oberkochen, Germany), or any other suitable type of ophthalmic surgical microscope provided by other suppliers.
- Ophthalmic surgical microscope 11 is configured to produce stereoscopic optical images and two-dimensional (2D) optical images of eye 22 .
- system 20 comprises two cameras 25 coupled, respectively, to two eyepieces 26 of ophthalmic surgical microscope 11 , and configured to acquire two respective optical images of eye 22 .
- the coupling between cameras 25 and eyepieces 26 may be carried out using a suitable jig, or any other suitable method and/or apparatus.
- processor 34 is configured to receive the optical images from cameras 25 , via a cable 28 (although other means of transmitting and receiving signals known in the art may be used), and, based on the received optical images, to display an optical image 35 on display 36 .
- processor 34 is configured to display in image 35 : (i) a stereoscopic image by using two separate optical paths with two objectives and eyepieces 26 to provide slightly different viewing angles to two respective cameras 25 , or (ii) a 2D optical image, e.g., by using an optical image received from one selected camera 25 of system 20 . Note that in most cases surgeon 24 may prefer using the stereoscopic image in such surgical applications.
- iris tissue 99 constitutes a blocking element for imaging distal end 88 in optical image 35 .
- because of the blocking element within ROI 76, surgeon 24 cannot see the location of distal end 88, which is needed in order to accurately emulsify lens 89 of eye 22.
- processor 34 is configured to receive, from an anatomical imaging system, such as a computerized tomography (CT) system (not shown), a three-dimensional (3D) anatomical image acquired prior to the ophthalmic procedure.
- system 20 comprises an optical head mount display (HMD) 66 using augmented reality techniques for visualizing distal end 88 of tool 55 overlaid on at least one of optical image 35 and the anatomical image, as described herein.
- processor 34 is configured to select, from the 3D anatomical image, a 2D slice of the anatomical image comprising CT imaging of ROI 76 , referred to herein as a CT image (CTI) 77 .
- distal end 88 of tool 55 may be invisible in optical image 35 , for being obstructed by a blocking element (e.g., iris tissue 99 , any other tissue, or a medical apparatus used in the ophthalmic procedure).
- processor 34 is configured to display the position of distal end 88 unobstructed. In the example of inset 27 , the visualization of distal end 88 is shown as a dashed line.
- HMD 66 and console 33 have wireless devices (not shown) configured to exchange wireless signals 54 for transferring, inter alia, the aforementioned augmented image and/or any suitable combination of image 35 , CTI 77 , and the position signals of position sensor 56 .
- processor 34 is configured to display, on HMD 66 , a visualization of distal end 88 overlaid on CTI 77 .
- processor 34 is configured to replace, in ROI 76 , the section of the optical image with a corresponding CTI 77 , or with any other suitable section of a slice of the CT image.
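Replacing the ROI section of the optical image with the corresponding registered CT slice amounts to masked compositing. A minimal sketch (the array shapes, the boolean ROI mask, and the red tip marker are illustrative assumptions, not the patent's rendering pipeline):

```python
import numpy as np

def composite_roi(optical, ct_slice, roi_mask, tip_rc=None):
    """Show the registered CT slice inside the ROI of the optical image,
    and optionally mark the estimated instrument tip (row, col) in red."""
    out = optical.copy()
    out[roi_mask] = ct_slice[roi_mask]     # CT replaces the obstructed ROI
    if tip_rc is not None:
        out[tip_rc] = (255, 0, 0)          # estimated distal-end position
    return out

# Usage on tiny synthetic RGB images.
optical = np.full((4, 4, 3), 50, dtype=np.uint8)
ct = np.full((4, 4, 3), 200, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                      # the ROI
aug = composite_roi(optical, ct, mask, tip_rc=(2, 2))
print(aug[0, 0], aug[1, 1], aug[2, 2])     # optical, CT, red tip marker
```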
- processor 34 is configured to display iris tissue 99 (or any other blocking element) transparent, so as to display the position of distal end 88 unobstructed.
- processor 34 is configured to register optical image 35 and the anatomical image (e.g., a slice comprising CTI 77 ) in a common coordinate system, such as a coordinate system of the position tracking system.
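Registering two modalities in a common coordinate system requires a rigid transform between corresponding fiducial points (for example, the tracking elements at known positions on the frame). The standard Kabsch/Procrustes least-squares solution, shown here as a generic sketch rather than the patent's specific registration procedure:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) such that dst ≈ R @ src + t,
    for corresponding 3D points given as rows of src and dst."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)              # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t

# Usage: recover a known 30-degree rotation about z plus a translation.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1.0]])
moved = pts @ R_true.T + np.array([0.5, -0.2, 0.1])
R, t = rigid_register(pts, moved)
print(np.allclose(R, R_true), np.allclose(t, [0.5, -0.2, 0.1]))  # True True
```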
- processor 34 receives two or more of the following inputs: (a) the optical (2D or stereoscopic) image from ophthalmic surgical microscope 11 , (b) the anatomical image from the CT system, and (c) the position signal (generated by position sensor 56 ) from the position tracking system.
- processor 34 processes at least some of the three received inputs (for example, by producing optical image 35 and/or CTI 77), and registers the coordinate systems of optical image 35, CTI 77, and the position signals received from position sensor 56 in a common coordinate system (e.g., the coordinate system of the position tracking system).
- processor 34 is configured to track the position of distal end 88 , based on position signals received from one or more position sensor(s) 56 . Moreover, processor 34 is configured to visualize the position of distal end 88 overlaid on at least one of the registered CTI 77 and optical image 35 . In the example of inset 27 , processor 34 is configured to produce the aforementioned augmented image comprising: (a) CTI 77 displayed on the section of ROI 76 , (b) optical image 35 displaying tool 55 and eye 22 surrounding the section of ROI 76 , and (c) a visualization of distal end 88 , overlaid on CTI 77 in the section of ROI 76 . In the context of the present disclosure and in the claims, the terms “produce” and “generate” are used interchangeably, e.g., for signals and images made by one or more position sensor(s) 56 , processor 34 and any other component of system 20 .
- processor 34 is configured to transmit the augmented image shown in inset 27 and described above, to HMD 66 so that surgeon 24 can see eye 22 and a visualization of the estimated position of distal end 88 of tool 55 .
- the augmented image shown in inset 27 provides surgeon 24 with a complete visualization of tool 55 , including distal end 88 .
- processor 34 is configured to dynamically control the size of ROI 76 , automatically (e.g., based on the position and/or obstruction of distal end 88 ) or in response to an instruction received from surgeon 24 using input device 39 .
- HMD 66 may comprise a processor (not shown), which is configured to carry out at least some of the operations carried out by processor 34 and described above.
- such inputs may be received by processor 34 and/or by the processor of HMD 66, e.g., optical images from ophthalmic surgical microscope 11, CTI 77 from processor 34 or the CTI from the CT system, and the position signals from position sensor(s) 56.
- the operations described above may be divided, using any suitable definition, between processor 34 and the processor of HMD 66 , so that the augmented image is displayed on HMD 66 as described in detail above.
- system 20 is shown by way of example, in order to illustrate certain problems that are addressed by embodiments of the present invention and to demonstrate the application of these embodiments in enhancing the performance of such a system.
- Embodiments of the present invention are by no means limited to this specific sort of example system, and the principles described herein may similarly be applied to other sorts of ophthalmic and other minimally invasive and surgical systems.
- FIG. 2 is a schematic pictorial illustration of location pad 40 used for tracking tool 55 when treating eye 22 , in accordance with an embodiment of the present invention.
- location pad 40 comprises a frame 46 made from a flexible substrate, such as a flexible printed circuit board (PCB), and a plurality of field-generators 44 coupled with frame 46 .
- PCB: flexible printed circuit board
- frame 46 is attached to tissue (e.g., cheek and forehead) that is at least partially surrounding eye 22 and is configured to place a plurality of field-generators 44 at respective positions surrounding ROI 76 .
- each field-generator 44 comprises one or more coils arranged in any suitable configuration, e.g., concentric or non-concentric arrangement.
- pad 40 comprises three field-generators 44 , but may alternatively comprise any other suitable number of field-generators 44 .
- the magnetic position tracking system comprises magnetic field-generators 44 fixed at respective positions of frame 46 of location pad 40 .
- Position sensor 56 is configured to generate one or more position signals in response to sensing external magnetic fields generated by the field-generators 44 of location pad 40.
- processor 34 is configured to estimate, based on the one or more position signals, the position of distal end 88 within ROI 76 of eye 22 .
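Under the simplifying assumption that each field-generator behaves as a magnetic dipole whose field magnitude falls off as 1/r³, the position estimation described above can be sketched as amplitude-based trilateration. The generator layout, calibration constant, and solver below are illustrative assumptions; a real tracking system fits a full vector dipole model per frequency-multiplexed coil:

```python
import math

# Hypothetical generator positions on the pad (metres) and a lumped
# calibration constant K, i.e. |B| ≈ K / r**3 in the dipole far field.
GENERATORS = [(-0.04, 0.03, 0.0), (0.04, 0.03, 0.0), (0.0, -0.05, 0.0)]
K = 1e-7

def range_from_amplitude(b):
    """Invert the inverse-cube falloff |B| = K / r**3 to a range estimate."""
    return (K / b) ** (1.0 / 3.0)

def estimate_position(amplitudes, guess=(0.0, 0.0, 0.03), iters=500, lr=0.1):
    """Gradient descent on the sum of squared range residuals, i.e. a
    least-squares trilateration of the sensor against the three generators."""
    ranges = [range_from_amplitude(b) for b in amplitudes]
    x, y, z = guess
    for _ in range(iters):
        gx = gy = gz = 0.0
        for (px, py, pz), r in zip(GENERATORS, ranges):
            dx, dy, dz = x - px, y - py, z - pz
            d = math.sqrt(dx * dx + dy * dy + dz * dz)
            e = d - r                       # range residual for this coil
            gx += 2.0 * e * dx / d
            gy += 2.0 * e * dy / d
            gz += 2.0 * e * dz / d
        x, y, z = x - lr * gx, y - lr * gy, z - lr * gz
    return x, y, z

# Synthesize amplitudes for a known sensor position, then recover it.
true_pos = (0.01, 0.0, 0.04)
amps = [K / math.dist(true_pos, g) ** 3 for g in GENERATORS]
est = estimate_position(amps)
```

Note the mirror ambiguity: with all generators in one plane, a solution reflected through that plane also fits, which is one reason practical systems use oriented field vectors rather than magnitudes alone.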
- in other embodiments, it is possible to use any suitable type of location pad having field-generators generating respective magnetic fields at least in ROI 76.
- Such location pads do not enable positioning accuracy sufficient for performing a cataract surgical procedure, mainly because of insufficient proximity between the field-generators and the ROI in which the surgeon performs the procedure.
- a cataract surgery procedure requires a sub-millimeter positioning accuracy, which can be obtained when field-generators 44 are positioned in close proximity to ROI 76 .
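Why proximity matters can be made concrete with a back-of-the-envelope calculation: for a dipole-like source, |B| ∝ 1/r³, so the range sensitivity d|B|/dr ∝ 1/r⁴, and a fixed field-noise floor translates into a range error growing as r⁴. The constants below are invented purely for illustration:

```python
def range_noise(r, sigma_b, k=1e-7):
    """Range uncertainty for an inverse-cube source: |B| = k/r**3 gives
    |d|B|/dr| = 3k/r**4, so sigma_r ≈ sigma_b * r**4 / (3k)."""
    return sigma_b * r ** 4 / (3 * k)

near = range_noise(0.05, 5e-6)   # generator ~5 cm from the ROI: ~0.1 mm
far = range_noise(0.20, 5e-6)    # same noise floor at ~20 cm: ~27 mm
ratio = far / near               # (0.20 / 0.05) ** 4 = 256
```

With these assumed constants, moving the generators from 20 cm (e.g., a pad under the head) to 5 cm (a pad on the face around the eye) improves range accuracy 256-fold, which is the qualitative point of the paragraphs above.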
- any movement of head 41 may spoil the registration between optical image 35 , CTI 77 and position signals produced by position sensor 56 , and therefore may degrade the quality of the cataract surgical procedure.
- location pad 40 is attached to and conforms to the skin surrounding at least part of eye 22. Therefore, location pad 40 moves together with head 41, so that movement of head 41 does not spoil the registration described in FIG. 1 above.
- the close proximity between ROI 76 and the surrounding field-generators 44 improves the positioning accuracy of the position sensor(s) 56 in the coordinate system of the position tracking system.
- the improved positioning accuracy results in improved overlay accuracy of distal end 88 visualized on the augmented image described in FIG. 1 above, and/or the overlay accuracy in at least one of optical image 35 and CTI 77 .
- location pad 40 comprises one or more tracking elements 45 for registering location pad 40 with eye 22 .
- tracking elements 45 comprise optical tracking elements, such as infrared light emitting diodes (LEDs), each having a different flashing rate.
- LEDs: infrared light emitting diodes
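The different flashing rates let a camera tell the tracking elements apart by frequency. A minimal sketch, assuming a fixed camera frame rate and hypothetical LED labels and rates, correlates an intensity trace against each known rate (a single-bin discrete Fourier transform):

```python
import math

FS = 120.0                                                    # assumed camera frame rate (Hz)
KNOWN_RATES = {10.0: "LED-A", 15.0: "LED-B", 20.0: "LED-C"}   # hypothetical labels

def blink_power(samples, freq, fs=FS):
    """Correlate the intensity trace with a complex tone at `freq`
    (a single-bin DFT), returning the normalized magnitude."""
    acc = complex(0.0, 0.0)
    for i, s in enumerate(samples):
        angle = -2.0 * math.pi * freq * i / fs
        acc += s * complex(math.cos(angle), math.sin(angle))
    return abs(acc) / len(samples)

def identify_led(samples):
    """Label a blinking-intensity trace with the tracking element whose
    known flashing rate carries the most power."""
    return KNOWN_RATES[max(KNOWN_RATES, key=lambda f: blink_power(samples, f))]

# A noiseless 15 Hz square-wave blink sampled at 120 Hz for 2 seconds:
trace = [1.0 if (i // 4) % 2 == 0 else 0.0 for i in range(240)]
```

Because the candidate rates complete whole cycles over the 2-second window, the off-frequency correlations vanish and the 15 Hz trace is attributed to the 15 Hz element.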
- HMD 66 comprises an image sensor 80, which is configured to acquire images of the LEDs of tracking elements 45, and to send the images (e.g., carried on wireless signals 54 as described in FIG. 1 above) to processor 34, e.g., during the cataract surgical procedure.
- processor 34 is configured to dynamically update (e.g., in real-time) the registration between ROI 76 and the coordinate system of the PTS (or any other common coordinate system).
- the real-time registration may improve the quality of the cataract surgical procedure, by improving the accuracy and visualization of the estimated position of distal end 88 in ROI 76 .
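A dynamic registration update of this kind can be sketched, in 2D for brevity, as a per-frame rigid fit (a Procrustes alignment without scaling) of the known pad-frame tracking-element positions to their camera-observed positions; the point coordinates below are hypothetical:

```python
import cmath

def rigid_register_2d(pad_pts, cam_pts):
    """Best-fit rotation + translation mapping pad-frame 2D points (as
    complex numbers) onto camera-observed points: complex Procrustes."""
    n = len(pad_pts)
    p_mean = sum(pad_pts) / n
    c_mean = sum(cam_pts) / n
    # The optimal rotation aligns the centred point sets.
    acc = sum((c - c_mean) * (p - p_mean).conjugate()
              for p, c in zip(pad_pts, cam_pts))
    rot = acc / abs(acc)                  # unit complex number = rotation
    trans = c_mean - rot * p_mean
    return rot, trans

# Hypothetical LED positions in the pad frame, observed rotated 30° and shifted.
pad = [complex(-3, 1), complex(3, 1), complex(0, -2)]
true_rot = cmath.exp(1j * cmath.pi / 6)
true_trans = complex(5, -2)
cam = [true_rot * z + true_trans for z in pad]
rot, trans = rigid_register_2d(pad, cam)
```

Re-running this fit on every camera frame is one way to realize the "dynamically update the registration" step; a production system would solve the analogous 3D problem (e.g., via the Kabsch algorithm) with outlier rejection.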
- location pad 40 may comprise any other suitable type of LEDs or other sorts of tracking elements. Moreover, in the example of FIG. 2, location pad 40 comprises three tracking elements 45, but in other embodiments, location pad 40 may have any other suitable number of tracking elements 45, typically but not necessarily arranged around eye 22.
- location pad 40 is shown by way of example, in order to illustrate certain alignment and/or registration problems that are addressed by embodiments of the present invention and to demonstrate the application of these embodiments in enhancing the performance of system 20 .
- Embodiments of the present invention are by no means limited to this specific sort of example location pad and/or system, and the principles described herein may similarly be applied to other sorts of location pads and/or medical systems. For example, in FIG. 2, frame 46 has a horseshoe shape partially surrounding eye 22 and open at the side of the patient nose; in other embodiments, frame 46 may have any other suitable shape, e.g., a bagel shape fully surrounding eye 22, or a goggles or eye-mask shape comprising two bagel-shaped frames fully surrounding both eyes of patient 23.
- a substantially identical location pad 40 may be flipped 180° for use on the second eye of patient 23.
- a location pad for the second eye may have a horseshoe shape open at the side of the patient nose, e.g., having a symmetric configuration to that of location pad 40 .
- the location pad frame may have any other suitable shape and may have any suitable number of field-generators 44 at suitable respective positions. In such embodiments, the location pad may have only field-generators 44 fixed on the frame. In alternative embodiments, the location pad may have both field-generators 44 and tracking elements fixed on a frame having any suitable shape.
- FIG. 3 is a flow chart that schematically illustrates a method for producing location pad 40 , in accordance with an embodiment of the present invention.
- the method begins at a step 100 with receiving one or more field-generators 44 for generating magnetic fields at least in a ROI of eye 22 .
- receiving tracking elements 45 such as infrared LEDs or any other suitable tracking elements, for registering location pad 40 with eye 22 .
- field-generators 44 and tracking elements 45 are fixed to frame 46 .
- frame 46 comprises a flexible substrate, such as the aforementioned flexible PCB, which is configured to (a) conform to the shape and geometry of facial tissue surrounding at least part of eye 22, and (b) be attached to the facial tissue, so that head 41 and frame 46 of location pad 40 move together as a single unit.
- when head 41 is moved by a given offset during the ophthalmic procedure, frame 46 is moved by the same offset, so that location pad 40 remains at the same position relative to eye 22 and/or head 41, and particularly to ROI 76, as described in FIG. 2 above.
- both field-generators 44 and tracking elements 45 are fixed to frame 46 at respective positions surrounding at least a portion of ROI 76; in alternative embodiments, field-generators 44 and tracking elements 45 may fully surround ROI 76.
- Field-generators 44 are arranged at first respective positions for obtaining the specified magnetic fields, at least within ROI 76 .
- Tracking elements 45 are arranged at second respective positions for obtaining the specified physical registration between location pad 40 and ROI 76 within eye 22 .
- the first and second positions differ from one another, but in other embodiments, at least one field-generator 44 and one tracking element 45 may be fixed at the same position on frame 46 .
- tracking element 45 may not be fixed on frame 46 and only field-generators 44 may be attached to frame 46 for producing the respective magnetic fields. This configuration may be used, for example, when not using augmented reality techniques, or when accurate registration between the eye and location pad is not required.
- FIG. 4 is a flow chart that schematically illustrates a method for augmented-reality visualization of tool 55 overlaid on registered CTI 77 and optical image 35 , in accordance with an embodiment of the present invention.
- the method is implemented on processor 34 , but in other embodiments, the method may be implemented, mutatis mutandis, on any other suitable type of computing device or system.
- the method begins at an anatomical image receiving step 200 , with processor 34 receiving one or more anatomical images (e.g., CT images) of patient eye 22 .
- processor 34 produces CTI 77 , which is a 2D slice of the anatomical image comprising the CT imaging of ROI 76 .
- surgeon 24 moves tool 55 to ROI 76 for treating patient eye 22 , e.g., for removing the cataract using phacoemulsification.
- processor 34 receives, e.g., from position sensor 56 , a position signal indicative of the position of distal end 88 of tool 55 within ROI 76 , as described in FIG. 1 above.
- processor 34 receives, e.g., from ophthalmic surgical microscope 11 , one or more stereoscopic or 2D optical images of eye 22 and tool 55 . In some embodiments, based on the received images, processor 34 produces optical image 35 of eye 22 , as described in FIG. 1 above.
- processor 34 registers optical image 35 and CTI 77 (or any other suitable type of anatomical image) in a common coordinate system, e.g., the coordinate system of the position tracking system, as described in FIG. 1 above.
- processor 34 estimates, based on the one or more position signals received from position sensor 56, the position of distal end 88 in the registered optical image 35 and CTI 77, as described in FIG. 1 above.
- processor 34 produces the augmented image shown in inset 27 and described in detail in FIG. 1 above.
- the augmented image comprises CTI 77 in ROI 76 , optical image 35 surrounding ROI 76 , and a visualization of distal end 88 overlaid on CTI 77 shown in ROI 76 .
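The composition described above can be sketched on toy 2D intensity arrays: CT-slice pixels inside a circular ROI, optical pixels elsewhere, and a marker at the tracked distal-end position. Array sizes and pixel values are illustrative only:

```python
def compose_augmented(optical, cti, center, radius, tip):
    """Composite a toy augmented frame: CT-slice pixels inside the circular
    ROI, optical pixels outside, and a marker at the tracked tool tip."""
    cy, cx = center
    h, w = len(optical), len(optical[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            inside = (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
            row.append(cti[y][x] if inside else optical[y][x])
        out.append(row)
    ty, tx = tip
    out[ty][tx] = 255          # overlay the distal-end visualization
    return out

optical = [[10] * 8 for _ in range(8)]   # stand-in for optical image 35
cti = [[200] * 8 for _ in range(8)]      # stand-in for CT slice CTI 77
frame = compose_augmented(optical, cti, center=(4, 4), radius=2, tip=(1, 1))
```

In the real system this compositing happens per stereo channel and in registered coordinates, but the layering order is the same as in the three-part augmented image described above.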
- processor 34 displays the augmented image (e.g., the image shown in inset 27 ) on HMD 66 or on any other suitable type of augmented reality display.
- optical image 35 also displays tool 55 out of ROI 76; therefore, surgeon 24 can see both tool 55 and distal end 88 in the augmented image shown, for example, in inset 27 of FIG. 1 above.
- surgeon 24 may decide to carry out the procedure at more than one location within eye 22 .
- the method may loop back to moving step 202 , in which surgeon 24 moves distal end 88 to a different location within eye 22 .
- the position of the ROI within eye 22 could be updated relative to the original position of ROI 76. In response to the updated position, surgeon 24 moves tool 55 as described in step 202 above, and the method is carried out using the same steps of FIG. 4, mutatis mutandis.
- surgeon 24 may use ophthalmic surgical microscope 11 , or any other suitable image acquisition sensor, for inspecting eye 22 and verifying that eye 22 does not have residues of lens 89 . After the verification, surgeon 24 may extract tool 55 out of patient eye 22 and start implanting, in eye 22 , an intraocular lens (IOL) (not shown) in place of the aspirated lens 89 .
- IOL: intraocular lens
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Physics & Mathematics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Optics & Photonics (AREA)
- Ophthalmology & Optometry (AREA)
- Robotics (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Vascular Medicine (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Eye Examination Apparatus (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
- Image Analysis (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
- Surgical Instruments (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Prostheses (AREA)
Abstract
Description
Claims (13)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/221,921 US12201265B2 (en) | 2020-04-23 | 2021-04-05 | Location pad surrounding at least part of patient eye for tracking position of a medical instrument |
AU2021261648A AU2021261648A1 (en) | 2020-04-23 | 2021-04-12 | Location pad surrounding at least part of patient eye for tracking position of a medical instrument |
PCT/IB2021/053017 WO2021214591A1 (en) | 2020-04-23 | 2021-04-12 | Location pad surrounding at least part of patient eye for tracking position of a medical instrument |
EP21719290.5A EP4138651A1 (en) | 2020-04-23 | 2021-04-12 | Location pad surrounding at least part of patient eye for tracking position of a medical instrument |
CA3182696A CA3182696A1 (en) | 2020-04-23 | 2021-04-12 | Location pad surrounding at least part of patient eye for tracking position of a medical instrument |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063014402P | 2020-04-23 | 2020-04-23 | |
US202063014383P | 2020-04-23 | 2020-04-23 | |
US202063014376P | 2020-04-23 | 2020-04-23 | |
US17/221,921 US12201265B2 (en) | 2020-04-23 | 2021-04-05 | Location pad surrounding at least part of patient eye for tracking position of a medical instrument |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210330395A1 US20210330395A1 (en) | 2021-10-28 |
US12201265B2 true US12201265B2 (en) | 2025-01-21 |
Family
ID=78221023
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/221,960 Active 2043-05-22 US12201266B2 (en) | 2020-04-23 | 2021-04-05 | Location pad surrounding at least part of patient eye and having optical tracking elements |
US17/221,908 Active 2043-07-20 US12161296B2 (en) | 2020-04-23 | 2021-04-05 | Augmented-reality visualization of an ophthalmic surgical tool |
US17/221,921 Active US12201265B2 (en) | 2020-04-23 | 2021-04-05 | Location pad surrounding at least part of patient eye for tracking position of a medical instrument |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/221,960 Active 2043-05-22 US12201266B2 (en) | 2020-04-23 | 2021-04-05 | Location pad surrounding at least part of patient eye and having optical tracking elements |
US17/221,908 Active 2043-07-20 US12161296B2 (en) | 2020-04-23 | 2021-04-05 | Augmented-reality visualization of an ophthalmic surgical tool |
Country Status (5)
Country | Link |
---|---|
US (3) | US12201266B2 (en) |
EP (3) | EP4138651A1 (en) |
AU (3) | AU2021259504A1 (en) |
CA (3) | CA3180843A1 (en) |
WO (3) | WO2021214590A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12096917B2 (en) * | 2019-09-27 | 2024-09-24 | Alcon Inc. | Tip camera systems and methods for vitreoretinal surgery |
US11832883B2 (en) | 2020-04-23 | 2023-12-05 | Johnson & Johnson Surgical Vision, Inc. | Using real-time images for augmented-reality visualization of an ophthalmology surgical tool |
US12201266B2 (en) | 2020-04-23 | 2025-01-21 | Johnson & Johnson Surgical Vision, Inc. | Location pad surrounding at least part of patient eye and having optical tracking elements |
DE102021206568A1 (en) * | 2021-06-24 | 2022-12-29 | Siemens Healthcare Gmbh | Display device for displaying a graphical representation of an augmented reality |
Citations (62)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5391199A (en) | 1993-07-20 | 1995-02-21 | Biosense, Inc. | Apparatus and method for treating cardiac arrhythmias |
WO1996005768A1 (en) | 1994-08-19 | 1996-02-29 | Biosense, Inc. | Medical diagnosis, treatment and imaging systems |
EP0951874A2 (en) | 1994-09-15 | 1999-10-27 | Visualization Technology, Inc. | Position tracking and imaging system for use in medical applications using a reference unit secured to a patients head |
US6239724B1 (en) | 1997-12-30 | 2001-05-29 | Remon Medical Technologies, Ltd. | System and method for telemetrically providing intrabody spatial position |
US6332089B1 (en) | 1996-02-15 | 2001-12-18 | Biosense, Inc. | Medical procedures and apparatus using intrabody probes |
US6381485B1 (en) * | 1999-10-28 | 2002-04-30 | Surgical Navigation Technologies, Inc. | Registration of human anatomy integrated for electromagnetic localization |
US20020065455A1 (en) | 1995-01-24 | 2002-05-30 | Shlomo Ben-Haim | Medical diagnosis, treatment and imaging systems |
US6484118B1 (en) | 2000-07-20 | 2002-11-19 | Biosense, Inc. | Electromagnetic position single axis system |
US6498944B1 (en) | 1996-02-01 | 2002-12-24 | Biosense, Inc. | Intrabody measurement |
US20030023161A1 (en) | 1999-03-11 | 2003-01-30 | Assaf Govari | Position sensing system with integral location pad and position display |
US20030120150A1 (en) | 2001-12-21 | 2003-06-26 | Assaf Govari | Wireless position sensor |
US6618612B1 (en) | 1996-02-15 | 2003-09-09 | Biosense, Inc. | Independently positionable transducers for location system |
US20040068178A1 (en) | 2002-09-17 | 2004-04-08 | Assaf Govari | High-gradient recursive locating system |
US20040101086A1 (en) | 2002-11-27 | 2004-05-27 | Sabol John Michael | Method and apparatus for quantifying tissue fat content |
US20040199072A1 (en) | 2003-04-01 | 2004-10-07 | Stacy Sprouse | Integrated electromagnetic navigation and patient positioning device |
US20050054900A1 (en) * | 2003-07-21 | 2005-03-10 | Vanderbilt University | Ophthalmic orbital surgery apparatus and method and image-guided navigation system |
US20050203380A1 (en) | 2004-02-17 | 2005-09-15 | Frank Sauer | System and method for augmented reality navigation in a medical intervention procedure |
US20060161062A1 (en) | 2003-06-12 | 2006-07-20 | Bracco Research Sa | Blood flow estimates through replenishment curve fitting in ultrasound contrast imaging |
US20060281971A1 (en) | 2005-06-14 | 2006-12-14 | Siemens Corporate Research Inc | Method and apparatus for minimally invasive surgery using endoscopes |
EP1829477A2 (en) | 2006-03-03 | 2007-09-05 | Biosense Webster, Inc. | Resolution of magnetic dipole ambiguity in position tracking measurements |
US20070265526A1 (en) | 2006-05-11 | 2007-11-15 | Assaf Govari | Low-profile location pad |
US20130015848A1 (en) * | 2011-07-13 | 2013-01-17 | Assaf Govari | Field generator patch with distortion cancellation |
US20130060146A1 (en) | 2010-04-28 | 2013-03-07 | Ryerson University | System and methods for intraoperative guidance feedback |
US20130245461A1 (en) | 2010-11-12 | 2013-09-19 | Deutsches Krebsforschungszentrum Stiftung Des Offentlichen Rechts | Visualization of Anatomical Data by Augmented Reality |
US20140275760A1 (en) | 2013-03-13 | 2014-09-18 | Samsung Electronics Co., Ltd. | Augmented reality image display system and surgical robot system comprising the same |
EP2829218A1 (en) | 2012-03-17 | 2015-01-28 | Waseda University | Image completion system for in-image cutoff region, image processing device, and program therefor |
US20150272694A1 (en) | 2012-06-27 | 2015-10-01 | CamPlex LLC | Surgical visualization system |
US20150327948A1 (en) * | 2014-05-14 | 2015-11-19 | Stryker European Holdings I, Llc | Navigation System for and Method of Tracking the Position of a Work Target |
US20150366628A1 (en) | 2014-06-18 | 2015-12-24 | Covidien Lp | Augmented surgical reality environment system |
US20160015469A1 (en) | 2014-07-17 | 2016-01-21 | Kyphon Sarl | Surgical tissue recognition and navigation apparatus and method |
US20170007156A1 (en) * | 2015-07-06 | 2017-01-12 | Biosense Webster (Israel) Ltd. | Flat location pad using nonconcentric coils |
US20170007155A1 (en) | 2015-07-06 | 2017-01-12 | Biosense Webster (Israel) Ltd. | Fluoro-invisible location pad structure for cardiac procedures |
US20170172696A1 (en) | 2015-12-18 | 2017-06-22 | MediLux Capitol Holdings, S.A.R.L. | Mixed Reality Imaging System, Apparatus and Surgical Suite |
US20170280989A1 (en) | 2016-03-31 | 2017-10-05 | Novartis Ag | Visualization system for ophthalmic surgery |
US20170367771A1 (en) * | 2015-10-14 | 2017-12-28 | Surgical Theater LLC | Surgical Navigation Inside A Body |
US20180068441A1 (en) | 2014-07-23 | 2018-03-08 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US20180098816A1 (en) | 2016-10-06 | 2018-04-12 | Biosense Webster (Israel) Ltd. | Pre-Operative Registration of Anatomical Images with a Position-Tracking System Using Ultrasound |
US20180220100A1 (en) | 2017-01-30 | 2018-08-02 | Novartis Ag | Systems and method for augmented reality ophthalmic surgical microscope projection |
US20180228392A1 (en) | 2017-02-15 | 2018-08-16 | Biosense Webster (Israel) Ltd. | Multi-axial position sensors printed on a folded flexible circuit board |
US20180245461A1 (en) | 2015-10-08 | 2018-08-30 | Halliburton Energy Services, Inc. | Mud Pulse Telemetry Preamble For Sequence Detection And Channel Estimation |
US20180270436A1 (en) | 2016-04-07 | 2018-09-20 | Tobii Ab | Image sensor for vision based on human computer interaction |
US20180286132A1 (en) | 2017-03-30 | 2018-10-04 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
EP3400871A1 (en) | 2017-12-27 | 2018-11-14 | Biosense Webster (Israel) Ltd. | Location pad with improved immunity to interference |
US20180325608A1 (en) | 2017-05-10 | 2018-11-15 | Mako Surgical Corp. | Robotic Spine Surgery System And Methods |
US20190058859A1 (en) | 2017-08-17 | 2019-02-21 | Microsoft Technology Licensing, Llc | Localized depth map generation |
US20190083115A1 (en) | 2017-09-19 | 2019-03-21 | Biosense Webster (Israel) Ltd. | Nail hole guiding system |
IL263948A (en) * | 2017-12-26 | 2019-03-31 | Biosense Webster Israel Ltd | Use of augmented reality to assist navigation during medical procedures |
US20190159843A1 (en) * | 2017-11-28 | 2019-05-30 | Biosense Webster (Israel) Ltd. | Low profile dual pad magnetic field location system with self tracking |
US20190209116A1 (en) | 2018-01-08 | 2019-07-11 | Progenics Pharmaceuticals, Inc. | Systems and methods for rapid neural network-based image segmentation and radiopharmaceutical uptake determination |
WO2019141704A1 (en) | 2018-01-22 | 2019-07-25 | Medivation Ag | An augmented reality surgical guidance system |
US20190365346A1 (en) * | 2018-06-04 | 2019-12-05 | Think Surgical, Inc. | Acoustic analyzation of bone quality during orthopedic surgery |
US20200015923A1 (en) | 2018-07-16 | 2020-01-16 | Ethicon Llc | Surgical visualization platform |
EP3387984B1 (en) | 2015-12-10 | 2020-04-22 | Topcon Corporation | Ophthalmologic image display device and ophthalmologic imaging device |
US20200188173A1 (en) | 2017-06-16 | 2020-06-18 | Michael S. Berlin | Methods and systems for oct guided glaucoma surgery |
EP3241051B1 (en) | 2014-12-29 | 2020-06-24 | Alcon Inc. | Magnification in ophthalmic procedures and associated devices, systems, and methods |
US20200253673A1 (en) | 2017-06-28 | 2020-08-13 | Intuitive Surgical Operations, Inc, | Systems and methods for projecting an endoscopic image to a three-dimensional volume |
WO2021076560A1 (en) * | 2019-10-14 | 2021-04-22 | Smith & Nephew, Inc. | Surgical tracking using a 3d passive pin marker |
US20210196105A1 (en) | 2019-12-31 | 2021-07-01 | Biosense Webster (Israel) Ltd. | Wiring of trocar having movable camera and fixed position sensor |
US20210196384A1 (en) | 2019-12-30 | 2021-07-01 | Ethicon Llc | Dynamic surgical visualization systems |
US20210196424A1 (en) | 2019-12-30 | 2021-07-01 | Ethicon Llc | Visualization systems using structured light |
US20210330394A1 (en) | 2020-04-23 | 2021-10-28 | Johnson & Johnson Surgical Vision, Inc. | Augmented-reality visualization of an ophthalmic surgical tool |
US20210330393A1 (en) | 2020-04-23 | 2021-10-28 | Johnson & Johnson Surgical Vision, Inc. | Using real-time images for augmented-reality visualization of an ophthalmology surgical tool |
-
2021
- 2021-04-05 US US17/221,960 patent/US12201266B2/en active Active
- 2021-04-05 US US17/221,908 patent/US12161296B2/en active Active
- 2021-04-05 US US17/221,921 patent/US12201265B2/en active Active
- 2021-04-12 AU AU2021259504A patent/AU2021259504A1/en not_active Abandoned
- 2021-04-12 EP EP21719290.5A patent/EP4138651A1/en active Pending
- 2021-04-12 WO PCT/IB2021/053015 patent/WO2021214590A1/en unknown
- 2021-04-12 CA CA3180843A patent/CA3180843A1/en active Pending
- 2021-04-12 CA CA3182696A patent/CA3182696A1/en active Pending
- 2021-04-12 CA CA3180651A patent/CA3180651A1/en active Pending
- 2021-04-12 AU AU2021258555A patent/AU2021258555A1/en not_active Abandoned
- 2021-04-12 WO PCT/IB2021/053017 patent/WO2021214591A1/en unknown
- 2021-04-12 EP EP21719730.0A patent/EP4138626A1/en active Pending
- 2021-04-12 AU AU2021261648A patent/AU2021261648A1/en not_active Abandoned
- 2021-04-12 WO PCT/IB2021/053019 patent/WO2021214592A1/en unknown
- 2021-04-26 EP EP21719980.1A patent/EP4203762A1/en not_active Withdrawn
Patent Citations (67)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5391199A (en) | 1993-07-20 | 1995-02-21 | Biosense, Inc. | Apparatus and method for treating cardiac arrhythmias |
WO1996005768A1 (en) | 1994-08-19 | 1996-02-29 | Biosense, Inc. | Medical diagnosis, treatment and imaging systems |
EP0951874A2 (en) | 1994-09-15 | 1999-10-27 | Visualization Technology, Inc. | Position tracking and imaging system for use in medical applications using a reference unit secured to a patients head |
US20020065455A1 (en) | 1995-01-24 | 2002-05-30 | Shlomo Ben-Haim | Medical diagnosis, treatment and imaging systems |
US6690963B2 (en) | 1995-01-24 | 2004-02-10 | Biosense, Inc. | System for determining the location and orientation of an invasive medical instrument |
US6498944B1 (en) | 1996-02-01 | 2002-12-24 | Biosense, Inc. | Intrabody measurement |
US6618612B1 (en) | 1996-02-15 | 2003-09-09 | Biosense, Inc. | Independently positionable transducers for location system |
US6332089B1 (en) | 1996-02-15 | 2001-12-18 | Biosense, Inc. | Medical procedures and apparatus using intrabody probes |
US6239724B1 (en) | 1997-12-30 | 2001-05-29 | Remon Medical Technologies, Ltd. | System and method for telemetrically providing intrabody spatial position |
US20030023161A1 (en) | 1999-03-11 | 2003-01-30 | Assaf Govari | Position sensing system with integral location pad and position display |
US6381485B1 (en) * | 1999-10-28 | 2002-04-30 | Surgical Navigation Technologies, Inc. | Registration of human anatomy integrated for electromagnetic localization |
US6484118B1 (en) | 2000-07-20 | 2002-11-19 | Biosense, Inc. | Electromagnetic position single axis system |
US20030120150A1 (en) | 2001-12-21 | 2003-06-26 | Assaf Govari | Wireless position sensor |
US20040068178A1 (en) | 2002-09-17 | 2004-04-08 | Assaf Govari | High-gradient recursive locating system |
US20040101086A1 (en) | 2002-11-27 | 2004-05-27 | Sabol John Michael | Method and apparatus for quantifying tissue fat content |
US20040199072A1 (en) | 2003-04-01 | 2004-10-07 | Stacy Sprouse | Integrated electromagnetic navigation and patient positioning device |
US20060161062A1 (en) | 2003-06-12 | 2006-07-20 | Bracco Research Sa | Blood flow estimates through replenishment curve fitting in ultrasound contrast imaging |
US20050054900A1 (en) * | 2003-07-21 | 2005-03-10 | Vanderbilt University | Ophthalmic orbital surgery apparatus and method and image-guided navigation system |
US20050203380A1 (en) | 2004-02-17 | 2005-09-15 | Frank Sauer | System and method for augmented reality navigation in a medical intervention procedure |
US8180430B2 (en) | 2005-02-22 | 2012-05-15 | Biosense Webster, Inc. | Resolution of magnetic dipole ambiguity in position tracking measurements |
US20060281971A1 (en) | 2005-06-14 | 2006-12-14 | Siemens Corporate Research Inc | Method and apparatus for minimally invasive surgery using endoscopes |
EP1829477A2 (en) | 2006-03-03 | 2007-09-05 | Biosense Webster, Inc. | Resolution of magnetic dipole ambiguity in position tracking measurements |
US20070265526A1 (en) | 2006-05-11 | 2007-11-15 | Assaf Govari | Low-profile location pad |
US20130060146A1 (en) | 2010-04-28 | 2013-03-07 | Ryerson University | System and methods for intraoperative guidance feedback |
US20130245461A1 (en) | 2010-11-12 | 2013-09-19 | Deutsches Krebsforschungszentrum Stiftung Des Offentlichen Rechts | Visualization of Anatomical Data by Augmented Reality |
US20130015848A1 (en) * | 2011-07-13 | 2013-01-17 | Assaf Govari | Field generator patch with distortion cancellation |
EP2829218A1 (en) | 2012-03-17 | 2015-01-28 | Waseda University | Image completion system for in-image cutoff region, image processing device, and program therefor |
US20150272694A1 (en) | 2012-06-27 | 2015-10-01 | CamPlex LLC | Surgical visualization system |
US20140275760A1 (en) | 2013-03-13 | 2014-09-18 | Samsung Electronics Co., Ltd. | Augmented reality image display system and surgical robot system comprising the same |
US20150327948A1 (en) * | 2014-05-14 | 2015-11-19 | Stryker European Holdings I, Llc | Navigation System for and Method of Tracking the Position of a Work Target |
US20150366628A1 (en) | 2014-06-18 | 2015-12-24 | Covidien Lp | Augmented surgical reality environment system |
US20160015469A1 (en) | 2014-07-17 | 2016-01-21 | Kyphon Sarl | Surgical tissue recognition and navigation apparatus and method |
US20180068441A1 (en) | 2014-07-23 | 2018-03-08 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
EP3241051B1 (en) | 2014-12-29 | 2020-06-24 | Alcon Inc. | Magnification in ophthalmic procedures and associated devices, systems, and methods |
US20170007156A1 (en) * | 2015-07-06 | 2017-01-12 | Biosense Webster (Israel) Ltd. | Flat location pad using nonconcentric coils |
US20170007155A1 (en) | 2015-07-06 | 2017-01-12 | Biosense Webster (Israel) Ltd. | Fluoro-invisible location pad structure for cardiac procedures |
US20180245461A1 (en) | 2015-10-08 | 2018-08-30 | Halliburton Energy Services, Inc. | Mud Pulse Telemetry Preamble For Sequence Detection And Channel Estimation |
US20170367771A1 (en) * | 2015-10-14 | 2017-12-28 | Surgical Theater LLC | Surgical Navigation Inside A Body |
EP3387984B1 (en) | 2015-12-10 | 2020-04-22 | Topcon Corporation | Ophthalmologic image display device and ophthalmologic imaging device |
US20170172696A1 (en) | 2015-12-18 | 2017-06-22 | MediLux Capitol Holdings, S.A.R.L. | Mixed Reality Imaging System, Apparatus and Surgical Suite |
US20170280989A1 (en) | 2016-03-31 | 2017-10-05 | Novartis Ag | Visualization system for ophthalmic surgery |
US20180270436A1 (en) | 2016-04-07 | 2018-09-20 | Tobii Ab | Image sensor for vision based on human computer interaction |
US20180098816A1 (en) | 2016-10-06 | 2018-04-12 | Biosense Webster (Israel) Ltd. | Pre-Operative Registration of Anatomical Images with a Position-Tracking System Using Ultrasound |
EP3305202B1 (en) | 2016-10-06 | 2024-03-06 | Biosense Webster (Israel) Ltd. | Pre-operative registration of anatomical images with a position-tracking system using ultrasound |
US20180220100A1 (en) | 2017-01-30 | 2018-08-02 | Novartis Ag | Systems and method for augmented reality ophthalmic surgical microscope projection |
US20180228392A1 (en) | 2017-02-15 | 2018-08-16 | Biosense Webster (Israel) Ltd. | Multi-axial position sensors printed on a folded flexible circuit board |
US20180286132A1 (en) | 2017-03-30 | 2018-10-04 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
US20180325608A1 (en) | 2017-05-10 | 2018-11-15 | Mako Surgical Corp. | Robotic Spine Surgery System And Methods |
US20200188173A1 (en) | 2017-06-16 | 2020-06-18 | Michael S. Berlin | Methods and systems for oct guided glaucoma surgery |
US20200253673A1 (en) | 2017-06-28 | 2020-08-13 | Intuitive Surgical Operations, Inc. | Systems and methods for projecting an endoscopic image to a three-dimensional volume |
US20190058859A1 (en) | 2017-08-17 | 2019-02-21 | Microsoft Technology Licensing, Llc | Localized depth map generation |
US20190083115A1 (en) | 2017-09-19 | 2019-03-21 | Biosense Webster (Israel) Ltd. | Nail hole guiding system |
US20190159843A1 (en) * | 2017-11-28 | 2019-05-30 | Biosense Webster (Israel) Ltd. | Low profile dual pad magnetic field location system with self tracking |
IL263948A (en) * | 2017-12-26 | 2019-03-31 | Biosense Webster Israel Ltd | Use of augmented reality to assist navigation during medical procedures |
US20190192232A1 (en) | 2017-12-26 | 2019-06-27 | Biosense Webster (Israel) Ltd. | Use of augmented reality to assist navigation during medical procedures |
EP3400871A1 (en) | 2017-12-27 | 2018-11-14 | Biosense Webster (Israel) Ltd. | Location pad with improved immunity to interference |
US20190209116A1 (en) | 2018-01-08 | 2019-07-11 | Progenics Pharmaceuticals, Inc. | Systems and methods for rapid neural network-based image segmentation and radiopharmaceutical uptake determination |
WO2019141704A1 (en) | 2018-01-22 | 2019-07-25 | Medivation Ag | An augmented reality surgical guidance system |
US20190365346A1 (en) * | 2018-06-04 | 2019-12-05 | Think Surgical, Inc. | Acoustic analyzation of bone quality during orthopedic surgery |
US20200015923A1 (en) | 2018-07-16 | 2020-01-16 | Ethicon Llc | Surgical visualization platform |
WO2021076560A1 (en) * | 2019-10-14 | 2021-04-22 | Smith & Nephew, Inc. | Surgical tracking using a 3d passive pin marker |
US20210196384A1 (en) | 2019-12-30 | 2021-07-01 | Ethicon Llc | Dynamic surgical visualization systems |
US20210196424A1 (en) | 2019-12-30 | 2021-07-01 | Ethicon Llc | Visualization systems using structured light |
US20210196105A1 (en) | 2019-12-31 | 2021-07-01 | Biosense Webster (Israel) Ltd. | Wiring of trocar having movable camera and fixed position sensor |
US20210330394A1 (en) | 2020-04-23 | 2021-10-28 | Johnson & Johnson Surgical Vision, Inc. | Augmented-reality visualization of an ophthalmic surgical tool |
US20210330396A1 (en) | 2020-04-23 | 2021-10-28 | Johnson & Johnson Surgical Vision, Inc. | Location pad surrounding at least part of patient eye and having optical tracking elements |
US20210330393A1 (en) | 2020-04-23 | 2021-10-28 | Johnson & Johnson Surgical Vision, Inc. | Using real-time images for augmented-reality visualization of an ophthalmology surgical tool |
Also Published As
Publication number | Publication date |
---|---|
US12201266B2 (en) | 2025-01-21 |
AU2021258555A1 (en) | 2023-01-05 |
CA3182696A1 (en) | 2021-10-28 |
EP4138626A1 (en) | 2023-03-01 |
CA3180651A1 (en) | 2021-10-28 |
AU2021261648A1 (en) | 2023-01-05 |
US12161296B2 (en) | 2024-12-10 |
WO2021214590A1 (en) | 2021-10-28 |
US20210330394A1 (en) | 2021-10-28 |
EP4203762A1 (en) | 2023-07-05 |
AU2021259504A1 (en) | 2023-01-05 |
WO2021214591A1 (en) | 2021-10-28 |
WO2021214592A1 (en) | 2021-10-28 |
CA3180843A1 (en) | 2021-10-28 |
US20210330396A1 (en) | 2021-10-28 |
US20210330395A1 (en) | 2021-10-28 |
EP4138651A1 (en) | 2023-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12201265B2 (en) | Location pad surrounding at least part of patient eye for tracking position of a medical instrument | |
US11800970B2 (en) | Computerized tomography (CT) image correction using position and direction (P and D) tracking assisted optical visualization | |
US11832883B2 (en) | Using real-time images for augmented-reality visualization of an ophthalmology surgical tool | |
JP7662627B2 (en) | ENT procedure visualization system and method | |
CA2892554C (en) | System and method for dynamic validation, correction of registration for surgical navigation | |
US7774044B2 (en) | System and method for augmented reality navigation in a medical intervention procedure | |
EP3445267B1 (en) | Method and system for registration verification | |
US20150049174A1 (en) | System and method for non-invasive patient-image registration | |
AU2018202620A1 (en) | Improving registration of an anatomical image with a position-tracking coordinate system based on visual proximity to bone tissue | |
KR101923927B1 (en) | Image registration system and method using subject-specific tracker | |
EP4192330B1 (en) | Jig assembled on stereoscopic surgical microscope for applying augmented reality techniques to surgical procedures | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: JOHNSON & JOHNSON SURGICAL VISION, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOVARI, ASSAF;GLINER, VADIM;REEL/FRAME:055830/0385

Effective date: 20210406
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
ZAAA | Notice of allowance and fees due |
Free format text: ORIGINAL CODE: NOA |
|
ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |