US20060036162A1 - Method and apparatus for guiding a medical instrument to a subsurface target site in a patient - Google Patents
- Publication number
- US20060036162A1 (application Ser. No. 11/045,013)
- Authority
- US
- United States
- Prior art keywords
- image
- spatial feature
- target site
- instrument
- indicated
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- All classifications are under A61B (A—Human necessities; A61—Medical or veterinary science; hygiene; A61B—Diagnosis; surgery; identification):
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies; determining position of diagnostic devices within or on the body of the patient
- A61B5/062—Determining position of a probe within the body, employing means separate from the probe, using magnetic field
- A61B5/064—Determining position of a probe within the body, employing means separate from the probe, using markers
- A61B34/20—Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/107—Computer-aided planning, simulation or modelling of surgical operations; visualisation of planned trajectories or target regions
- A61B2034/2055—Tracking techniques; optical tracking systems
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/376—Surgical systems with images on a monitor during operation, using X-rays, e.g. fluoroscopy
- A61B2090/378—Surgical systems with images on a monitor during operation, using ultrasound
Abstract
Intraoperative image(s) of a patient target site are generated by an intraoperative imaging system (e.g., ultrasound or X-ray). The intraoperative imaging system is tracked with respect to the patient target site and surgical instrument(s) (e.g., a pointer, endoscope or other intraoperative video or optical device). The intraoperative images, surgical instruments, and patient target site are registered into a common coordinate system. Spatial feature(s) of the patient target site are indicated on the images of the patient target site. Indicia relating the position and orientation of the surgical instrument(s) to the spatial feature(s) of the patient target site are projected on the images, with the indicia being used to correlate the position and orientation of the surgical instruments with respect to the target feature.
Description
- This application makes reference to and claims priority from U.S. Provisional Patent Application Ser. No. 60/541,131 entitled “Method and Apparatus for Guiding a Medical Instrument to a Subsurface Target Site in a Patient” filed on Feb. 2, 2004, the complete subject matter of which is incorporated herein by reference in its entirety.
- Precise imaging of portions of the anatomy is an increasingly important technique in the medical and surgical fields. In order to lessen the trauma to a patient caused by invasive surgery, techniques have been developed for performing surgical procedures within the body through small incisions with minimal invasion. These procedures generally require the surgeon to operate on portions of the anatomy that are not directly visible, or can be seen only with difficulty. Furthermore, some parts of the body contain extremely complex or small structures and it is necessary to enhance the visibility of these structures to enable the surgeon to perform more delicate procedures. In addition, planning such procedures requires the evaluation of the location and orientation of these structures within the body in order to determine the optimal surgical trajectory.
- U.S. Pat. No. 6,167,296, issued Dec. 26, 2000 (Shahidi), the disclosure of which is hereby incorporated by reference in its entirety into the present application, discloses a surgical navigation system having a computer with a memory and display, connected to a surgical instrument or pointer and a position tracking system, so that the location and orientation of the pointer are tracked in real time and conveyed to the computer. The computer memory is loaded with data from an MRI, CT, or other volumetric scan of a patient, and this data is utilized to dynamically display 3-dimensional perspective images in real time of the patient's anatomy from the viewpoint of the pointer. The images are segmented and displayed in color to highlight selected anatomical features and to allow the viewer to see beyond obscuring surfaces and structures. The displayed image tracks the movement of the instrument during surgical procedures. The instrument may include an imaging device, such as an endoscope or ultrasound transducer, and the system can also fuse the two images so that a combined image is displayed. The system is adapted for easy and convenient operating room use during surgical procedures.
- The Shahidi '296 patent uses preoperative volumetric scans of the patient, e.g., from an MRI or CT. Hence, it is necessary to register the preoperative volume image with the patient in the operating room. It would be beneficial to provide a navigation system that utilizes intraoperative images, eliminating this registration step. It would also be desirable to provide a system that uses intraoperative images to aid the user in navigating to a target site within the patient anatomy.
- Certain aspects of an embodiment of the present invention relate to a system and method for aiding a user in guiding a medical instrument to a target site in a patient. The system comprises an imaging device for generating one or more intraoperative images, on which spatial features of a patient target site can be defined in a 3-dimensional coordinate system. A tracking system tracks the position and, optionally, the orientation of the medical instrument and imaging device in a reference coordinate system. An indicator allows a user to indicate a spatial feature of a target site on such image(s). The system also includes a display device, an electronic computer (operably connected to said tracking system, display device, and indicator), and computer-readable code. The computer-readable code, when used to control the operation of the computer, is operable to carry out the steps of (i) recording target-site spatial information indicated by the user on said image(s), (ii) determining, from the spatial feature of the target site indicated on said image(s), 3-D coordinates of the target-site spatial feature in a reference coordinate system, (iii) tracking the position of the instrument in the reference coordinate system, (iv) projecting onto a display device a view field as seen from a known position and, optionally, a known orientation, with respect to the tool, in the reference coordinate system, and (v) projecting onto the displayed view field indicia whose states indicate the indicated spatial feature of the target site with respect to said known position and, optionally, said known orientation. Thus, the system allows the user, by observing the states of said indicia, to guide the instrument toward the target site by moving the instrument so that said indicia are placed or held in a given state in the displayed field of view.
- According to certain aspects of one embodiment of the invention, the imaging device is an x-ray (fluoroscopic) imaging device. The x-ray imaging device is capable of generating first and second digitized projection images of the patient target site from first and second positions, respectively, while the tracking device is operable to record the positions of the x-ray imaging device at the first and second positions.
- According to another embodiment, the imaging device is an ultrasound imaging device and the tracking device is operable for generating tracking measurements which are recorded by the computer system when the ultrasound image(s) is generated.
- The medical instrument may be any of a variety of devices, such as a pointer, a drill, or an endoscope (or other intraoperative video or optical device). When the instrument is an endoscope, the view field projected onto the display device may be the image seen by the endoscope.
- A method according to certain aspects of an embodiment of the present invention involves generating one or more intraoperative images on which a spatial feature of a patient target site can be indicated, indicating a spatial feature of the target site on said image(s), using the spatial feature of the target site indicated on said image(s) to determine 3-D coordinates of the target site spatial feature in a reference coordinate system, tracking the position of the instrument in the reference coordinate system, projecting onto a display device a view field as seen from a known position and, optionally, a known orientation, with respect to the tool, in the reference coordinate system, and projecting onto the displayed view field, indicia whose states are related to the indicated spatial feature of the target site with respect to the known position and, optionally, said known orientation. This method allows the user, by observing the states of said indicia, to guide the instrument toward the target site by moving the instrument so that said indicia are placed or held in a given state in the displayed field of view.
- The view field projected onto the display device may be that view as seen from the tip-end position and orientation of the medical instrument having a defined field of view. Alternatively, the view field projected onto the display device may be that seen from a position along the axis of the instrument that is different from the tip-end position. Other view fields may also be shown without departing from the scope of the present invention.
- In one embodiment, the medical instrument is an endoscope. In this embodiment, the view field projected onto the display device may be the image seen by the endoscope.
- The method may include the steps of generating first and second digitized projection images, such as x-ray projection images, of the patient target site from first and second positions, respectively, and indicating the spatial feature of the target site on the first and second digitized projection images.
- The step of generating first and second projection images may include moving an x-ray imaging device to a first position to generate the first image, moving the x-ray imaging device to a second position to generate the second image, and tracking the position of the imaging device at the first and second positions in the reference coordinate system.
- In one embodiment, target-site spatial features are indicated on the first image and then projected onto the second image. The spatial feature projected onto the second image may be used to constrain the target-site spatial feature indicated on the second image. According to one aspect of this method, the target-site spatial feature indicated on the first image is selected from an area, a line, and a point, and the corresponding spatial feature projected onto the second image is a volume, an area, and a line, respectively.
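- For illustration only, the point-to-line relationship described above can be sketched by modeling each tracked C-arm position as a pinhole camera. Nothing below is prescribed by the patent: the intrinsic matrix, poses, and pixel values are hypothetical stand-ins for what the calibration and tracking steps would supply.

```python
import numpy as np

def projection_matrix(K, R, t):
    """3x4 pinhole projection matrix P = K [R | t] for one C-arm pose."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def backproject_ray(P, uv):
    """Back-project pixel (u, v) into a 3-D ray (origin, unit direction)
    expressed in the reference coordinate system."""
    M, p4 = P[:, :3], P[:, 3]
    origin = -np.linalg.solve(M, p4)                       # camera center
    direction = np.linalg.solve(M, np.array([uv[0], uv[1], 1.0]))
    return origin, direction / np.linalg.norm(direction)

def project(P, X):
    """Project a 3-D point into an image, returning pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Hypothetical intrinsics and tracked poses for the two fluoroscope positions.
K = np.array([[1000.0, 0.0, 256.0], [0.0, 1000.0, 256.0], [0.0, 0.0, 1.0]])
P1 = projection_matrix(K, np.eye(3), np.array([0.0, 0.0, 0.0]))
R2 = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]])  # ~90 degree second view
P2 = projection_matrix(K, R2, np.array([0.0, 0.0, 500.0]))

# A point picked in image 1 corresponds to a line in image 2: project two
# points sampled along the back-projected ray (cf. point 804 -> line 808).
origin, direction = backproject_ray(P1, (300.0, 280.0))
line_endpoints = [project(P2, origin + s * direction) for s in (200.0, 800.0)]
```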
- Alternatively, the indicating step may be carried out independently for both images, in which instance the 3-D coordinates of the target site are determined from the independently indicated spatial features.
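- Where the target is indicated independently in both images, one common way to determine the 3-D coordinates, assumed here rather than specified by the patent, is to back-project each indicated point as a ray and take the point nearest all rays in a least-squares sense. The closed form below also covers the case of more than two images; all numeric values are synthetic.

```python
import numpy as np

def nearest_point_to_rays(origins, directions):
    """Closed-form least-squares point minimizing the summed squared
    distance to a set of rays (origin o_i, unit direction d_i)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        M = np.eye(3) - np.outer(d, d)      # projector orthogonal to ray i
        A += M
        b += M @ o
    return np.linalg.solve(A, b)

# Synthetic check: two rays aimed at nearly the same point; the recovered
# coordinates land between the two slightly inconsistent picks.
target = np.array([10.0, -5.0, 300.0])
o1, o2 = np.zeros(3), np.array([500.0, 0.0, 0.0])
d1 = (target - o1) / np.linalg.norm(target - o1)
d2 = target + np.array([0.0, 1.0, 0.0]) - o2        # slightly misaligned pick
d2 = d2 / np.linalg.norm(d2)
print(nearest_point_to_rays([o1, o2], [d1, d2]))    # ~[10, -4.5, 300]
```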
- According to another aspect of the present invention, the step of generating includes using an ultrasonic source to generate an ultrasonic image of the patient, and the 3-D coordinates of a spatial feature indicated on the image are determined from the 2-D coordinates of the spatial feature on the image and the position of the ultrasonic source.
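- A minimal sketch of the ultrasound case, under the usual assumptions for tracked 2-D ultrasound, which the patent does not spell out: a pixel scale converts image coordinates to millimetres, a fixed image-to-probe calibration transform is known, and the probe pose comes from the tracking measurements recorded when the image was generated. The transforms and values below are placeholders.

```python
import numpy as np

def rigid(R, t):
    """4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, np.asarray(t, float)
    return T

def ultrasound_pick_to_reference(u, v, mm_per_px, T_probe_image, T_ref_probe):
    """Map pixel (u, v) on the ultrasound image plane (z = 0 in the image
    frame) to 3-D reference coordinates: scale to millimetres, then chain
    the image->probe calibration and the tracked probe->reference pose."""
    p_image = np.array([u * mm_per_px, v * mm_per_px, 0.0, 1.0])
    return (T_ref_probe @ T_probe_image @ p_image)[:3]

T_probe_image = rigid(np.eye(3), [-20.0, 0.0, 5.0])   # placeholder calibration
T_ref_probe = rigid(np.eye(3), [100.0, 50.0, 0.0])    # placeholder tracked pose
print(ultrasound_pick_to_reference(128, 300, 0.1, T_probe_image, T_ref_probe))
```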
- In one embodiment, the target site spatial feature indicated is a volume or area, and the indicia are arranged in a geometric pattern which defines the boundary of the indicated spatial feature. According to another embodiment, the target site spatial feature indicated is a volume, area or point, and the indicia are arranged in a geometric pattern that indicates the position of a point within the target site.
- According to one aspect of an embodiment of the invention, the spacing between or among indicia is indicative of the distance of the instrument from the target-site position. According to another aspect of an embodiment of the invention, the size or shape of the individual indicia is indicative of the distance of the instrument from the target-site position. According to yet another aspect of an embodiment of the invention, the size or shape of individual indicia is indicative of the orientation of said tool.
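- As a hedged illustration of such indicia states, the sketch below maps a target position expressed in the instrument coordinate system (axes chosen arbitrarily here: x right, y down, z along the instrument axis) to arrow sizes and an indicia spacing; the gain and thresholds are invented for the example and are not taken from the patent.

```python
import numpy as np

def indicia_state(target_in_instrument, gain=0.02):
    """Map the target position (instrument frame, millimetres) to display
    indicia: per-direction arrow sizes plus an overall spacing value."""
    x, y, z = target_in_instrument
    distance = float(np.linalg.norm(target_in_instrument))
    return {
        "right_arrow": max(0.0, gain * x),   # grow the arrow to follow
        "left_arrow": max(0.0, -gain * x),
        "down_arrow": max(0.0, gain * y),
        "up_arrow": max(0.0, -gain * y),
        "spacing": distance,                 # indicia spread with distance
        "on_target": abs(x) < 1.0 and abs(y) < 1.0,
    }

print(indicia_state(np.array([12.0, -3.0, 80.0])))
```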
- Certain embodiments of the present invention also provide the ability to define a surgical trajectory in the displayed image. Specifically, according to one embodiment, the step of indicating includes indicating on each image a second spatial feature which, together with the first-indicated spatial feature, defines a surgical trajectory on the displayed image. According to another embodiment, the method further includes using the instrument to indicate, on a patient surface region, an entry point that defines, with the indicated spatial feature, a surgical trajectory on the displayed image. In either instance, the surgical trajectory on the displayed image may be indicated by two sets of indicia, one set corresponding to the first-indicated spatial feature and the second set to the second spatial feature or entry point indicated. Alternatively, the surgical trajectory on the displayed image may, for example, be indicated by a geometric object defined, at its end regions, by the first-indicated spatial feature and the second spatial feature or entry point indicated.
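- A trajectory defined by two indicated points reduces to linear interpolation between its end regions; the sketch below is illustrative only, with placeholder coordinates standing in for the indicated entry point and target.

```python
import numpy as np

def trajectory_points(entry_point, target_point, n_samples=2):
    """Sample the straight-line trajectory from entry to target; the two
    endpoints alone define the displayed geometric object."""
    s = np.linspace(0.0, 1.0, n_samples)[:, None]
    return (1.0 - s) * entry_point + s * target_point

entry = np.array([0.0, 0.0, 0.0])        # e.g., indicated with the pointer tip
target = np.array([25.0, -10.0, 90.0])   # e.g., from the two-image indication
print(trajectory_points(entry, target))
```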
- FIG. 1 is a schematic diagram of an image-guided surgery system according to certain aspects of an embodiment of the invention.
- FIG. 2 is a schematic diagram depicting the architecture of a computer system which may be used in the image guided surgery system of FIG. 1.
- FIG. 3 is a flow chart illustrating an image guided surgical method according to certain aspects of an embodiment of the invention.
- FIG. 4 is a flow chart illustrating an image guided surgical method according to certain aspects of another embodiment of the invention.
- FIG. 5 is a flow chart illustrating an image guided surgical method according to certain aspects of another embodiment of the invention.
- FIG. 6 is a flow chart illustrating operation of the tracking system.
- FIG. 7 is a flow chart illustrating an image guided surgical method according to certain aspects of another embodiment of the invention.
- FIG. 8 is a schematic illustration of an indicating step according to one embodiment of the invention.
- FIG. 9 is a schematic illustration of an indicating step according to another embodiment of the invention.
- FIG. 10 illustrates a display according to an embodiment of the invention.
- FIG. 11 is a schematic illustration of an indicating step according to another embodiment of the invention.
- FIGS. 12-14B illustrate displays according to embodiments of the invention.
- The foregoing summary, as well as the following detailed description of certain embodiments of the present invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, certain embodiments are shown in the drawings. It should be understood, however, that the present invention is not limited to the arrangements and instrumentality shown in the attached drawings.
- FIG. 1 is a schematic view of an image-guided surgery system 8 according to certain aspects of an embodiment of the invention. The system includes an imaging device for generating intraoperative images of selected portions of the anatomy of the patient 10. For example, as shown in FIG. 1, the imaging device may comprise a mobile fluoroscopic device 12. Fluoroscopic device 12 is preferably a C-arm of the type which may be obtained from General Electric, Milwaukee, Wis. The mobile fluoroscopic device includes an X-ray camera 14 and an image intensifier 16. Alternatively, the imaging device may be an ultrasound imaging device, such as a hand-held ultrasound imaging probe 17. The system also includes a surgical instrument 18, which may be any of a variety of devices such as a pointer, a drill, or an endoscope, for example. The system also includes a tracking system. In this respect, the C-arm/image intensifier 24, the ultrasound probe 17 and the surgical instrument 18 are each equipped with tracking elements 16a, 17a and 18a, respectively, that define local coordinate systems for each of those components. In one embodiment, the tracking elements 16a, 17a, 18a are emitters, such as infrared light-emitting diode (LED) markers, which communicate with a position sensor (e.g., a camera or digitizer) 20, such as an Optotrak digitizer available from Northern Digital, Waterloo, Ontario, Canada. Alternatively, the optical system may employ passive tracking elements, e.g., reflectors, or an electromagnetic (EM) tracking system or a combined EM/optical tracking system may be employed.
- The position sensor 20 tracks the components 12, 17 and 18 within an operating space 19, and supplies data needed to perform coordinate transformations between the various local coordinate systems to a computer system 22, such as a workstation computer of the type available from Sun Microsystems, Mountain View, Calif., or Silicon Graphics Inc., Mountain View, Calif. The NTSC video output of camera 14 is also processed by the computer system. A video framegrabber board, such as an SLIC-Video available from Osprey Systems, Cary, N.C., may also be employed to allow loading of gray-scale images from the video buffer of the C-arm to the computer system.
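- The coordinate bookkeeping mentioned above can be illustrated with 4x4 rigid transforms. A minimal sketch, assuming the tracker reports each component's pose in the reference frame; the poses shown are hypothetical, not values from the patent:

```python
import numpy as np

def make_pose(R, t):
    """4x4 rigid transform (component frame -> reference frame)."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, np.asarray(t, float)
    return T

def invert_rigid(T):
    """Invert a rigid transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Hypothetical tracker output: each component's pose in the reference frame.
T_ref_instrument = make_pose(np.eye(3), [100.0, 20.0, 300.0])
T_ref_imager = make_pose(np.eye(3), [-50.0, 0.0, 250.0])

# Express a point known in the imager's local frame in the instrument frame.
T_instrument_imager = invert_rigid(T_ref_instrument) @ T_ref_imager
p_instrument = T_instrument_imager @ np.array([0.0, 0.0, 100.0, 1.0])
```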
- The general architecture of such a computer system 22 is shown in more detail in FIG. 2. The computer system includes a central processing unit (CPU) 30 that provides computing resources and controls the computer. CPU 30 may be implemented with a microprocessor or the like, and may also include a graphics processor and/or a floating-point coprocessor for mathematical computations. Computer 22 also includes system memory 32, which may be in the form of random-access memory (RAM) and read-only memory (ROM). Input device(s) 34, such as a keyboard, mouse, foot pedal, stylus, etc., are used to input data into the computer. Storage device(s) 36 include a storage medium, such as magnetic tape or disk, or optical disk, e.g., a compact disk, that are used to record programs of instructions for operating systems, utilities and applications. The storage device(s) may be internal, such as a hard disk, and may also include a disk drive for reading data and software embodied on external storage media such as compact disks, etc. Storage device 36 may be used to store one or more programs and data that implement various aspects of the present invention, including the imaging and tracking procedures. One or more display devices 38 are used to display various images to the surgeon during the surgical procedure; display device(s) 38 are preferably high-resolution device(s). The computer system may also include communications device(s) 40, such as a modem or other network device for making a connection to a network, such as a local area network (LAN), the Internet, etc. With such an arrangement, program(s) and/or data that implement various aspects of the present invention may be transmitted to computer 22 from a remote location (e.g., a server or another workstation) over a network. All major system components of the computer may connect to a bus 42, which may be more than one physical bus. Bus 42 is preferably a high-bandwidth bus to improve the speed of image display during the procedure.
- FIG. 3 is a flow chart illustrating an image guided surgical method according to certain aspects of an embodiment of the invention. Initially, in step 300 an imaging device, such as the fluoroscopic device 12 or the ultrasound probe 17, is used to generate at least one image of the patient 10. The image(s) is/are transmitted to the computer 22, e.g., by means of a cable connecting the imaging device to the computer, and by means of the video capture device installed in the computer. Next, in step 302 the user defines the target in the image(s). This step may be accomplished, for example, by moving the cursor to the desired image position(s) and double-clicking the mouse. Next, in step 304, the 3-D coordinates of the target are determined in the reference coordinate system. In particular, the coordinates of the selected target in the reference coordinate system are computed using the tracking measurements recorded when the image(s) was/were generated. As will be appreciated, in the context of X-ray images the tracking elements 16a are positioned to allow parameters of the fluoroscopic device 12, such as focal length and image center, to be estimated. Next, in step 306 the coordinates of the instrument 18 are determined in the reference coordinate system. Specifically, using tracking measurements recorded by the tracking system, the position and orientation of the instrument 18 in the reference coordinate system are computed by the computer system 22. Next, in step 308 the computer system computes the target position in the field of view of the instrument 18. Specifically, using the now-known transformation between reference and instrument coordinate systems, the coordinates of the selected target in the instrument coordinate system are computed. Next, in step 310 the computer displays the coordinates of the target on the instrument's field of view. For example, as is illustrated in FIG. 10, the instrument 18 may be an endoscope, in which case the field of view projected onto the display device may be the image as seen by the endoscope. For an instrument, such as an endoscope, with a defined field of view, the view field projected onto the display device can be that view seen from the tip-end position and orientation of the medical instrument. Alternatively, the view field projected onto the display device can be that view seen from a position along the axis of the instrument that is different from the tip-end position of the medical instrument. For example, where the instrument is a pointer, the user can select the view field from a position, e.g., distal from the tip of the pointer, along the axis of the pointer.
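- Steps 308 and 310 amount to a frame change followed by a projection. A minimal sketch, assuming a simple pinhole display model with an invented focal length and image size; the patent does not prescribe a particular projection model:

```python
import numpy as np

def to_instrument_frame(p_ref, R, t):
    """Reference-frame target -> instrument frame, given the tracked
    instrument pose (R and t map instrument coordinates to reference)."""
    return R.T @ (p_ref - t)

def project_to_display(p_inst, focal_px=800.0, width=640, height=480):
    """Pinhole projection onto the displayed field of view; returns None
    when the target lies behind the viewpoint."""
    x, y, z = p_inst
    if z <= 0:
        return None
    return (width / 2 + focal_px * x / z, height / 2 + focal_px * y / z)

R, t = np.eye(3), np.zeros(3)               # placeholder tracked pose
target_ref = np.array([5.0, -2.0, 120.0])   # placeholder triangulated target
uv = project_to_display(to_instrument_frame(target_ref, R, t))
# The cross hair 54 would be drawn at uv; it reaches the image center
# (320, 240) exactly when the instrument axis points at the target.
```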
- In FIG. 10, the real-time image 50 from the endoscope is displayed on a monitor 52. An indicia, illustrated as a cross hair 54, is projected onto the displayed field of view of the endoscope. As the endoscope moves relative to the target site, the cross hair moves to guide the user toward the target site. In particular, when the endoscope is centered on the target, the cross hair 54 will be centered on the image 50. Hence the cross hair 54 functions as an indicia whose state (position, in this instance) relates the indicated spatial feature of the target site to the known position of the endoscope. As a result, the user, by observing the state (position) of the cross hair, can guide the endoscope toward the target site by moving the endoscope so that the cross hair is placed in the center of the display.
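How the cross-hair position could be derived from the target's instrument-frame coordinates is not spelled out in the text; a plausible sketch, assuming a simple pinhole model for the endoscope with illustrative calibration values, is:

```python
import numpy as np

def crosshair_pixel(p_target_endo, focal_px=800.0, center=(320.0, 240.0)):
    """Return (u, v) pixel coordinates for the cross-hair indicia, assuming
    the endoscope looks along its +z axis."""
    x, y, z = p_target_endo
    if z <= 0:
        raise ValueError("target is behind the endoscope tip")
    u = center[0] + focal_px * x / z
    v = center[1] + focal_px * y / z
    return (u, v)

# A target 2 mm right of the optical axis at 50 mm depth maps to a
# cross hair slightly right of image center:
print(crosshair_pixel(np.array([2.0, 0.0, 50.0])))  # (352.0, 240.0)
```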
- FIG. 4 is a flow chart illustrating an image guided surgical method according to certain aspects of another embodiment of the invention. In step 400, the fluoroscopic device 12 is used to generate two or more X-ray images of the patient. For example, as shown in FIG. 8, the fluoroscopic device can be used to make first and second images 800 and 802 of a portion of the spine. The first image 800 is a lateral view of the spine portion, while the second image 802 is an anterior-posterior (AP) view of the spine portion. The first and second images 800 and 802 can be generated, for example, by moving the fluoroscopic device 12 to a first position to generate the first image, moving the fluoroscopic device to a second position to generate the second image, and tracking the position of the imaging device, i.e., with the tracking system, at the first and second positions in the reference coordinate system.
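A minimal data layout for such tracked exposures might pair each frame with the pose sampled at exposure time; this structure is purely illustrative:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TrackedShot:
    image: np.ndarray             # 2-D pixel array from the video capture device
    T_ref_from_camera: np.ndarray # 4x4 fluoroscope pose at exposure time
    label: str                    # e.g. "lateral" or "AP"

def acquire(image: np.ndarray, pose: np.ndarray, label: str) -> TrackedShot:
    """Bundle an exposure with the tracker pose sampled at the same instant."""
    return TrackedShot(image=image.copy(), T_ref_from_camera=pose.copy(), label=label)
```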
- Referring again to FIG. 4, the X-ray images are transmitted to the computer system 22, e.g., by means of a cable connecting the fluoroscopic device to the computer system and a video capture device installed in the computer system. In step 402 the user selects the desired position of the target in one image. This step may be accomplished, for example, by moving the cursor to the desired image position and double-clicking the computer mouse. Because fluoroscopic images are projective, a point selected in one image corresponds to a line in space in the other images. As an optional step, the computer system may draw the line representing the target on the other X-ray image(s). For example, referring to FIG. 8, the user initially selects a target 804 in the first image 800. The computer system projects 806 the point 804 onto the second image 802 as a line 808. In FIG. 8, the target-site spatial feature indicated on the first image is shown as a point, and the corresponding spatial feature projected onto the second image is a line. Alternatively, the target-site spatial feature on the first image can be selected as an area or a line, in which case the corresponding spatial feature projected onto the second image is a volume or an area, respectively. Where the target-site spatial feature indicated is a volume or area, a geometric pattern, which defines the boundary of the indicated spatial feature, may be projected onto the second image. For example, FIG. 11 shows first and second images 1100 and 1102; the target-site spatial feature indicated on the first image 1100 is an area 1104 that is projected 1106 onto the second image 1102 as a geometric pattern 1108. Where the target-site spatial feature indicated is a volume or area, the indicia can be arranged in a geometric pattern which defines the boundary of the indicated spatial feature in the image that is displayed to the user during navigation. (See, e.g., FIG. 13, where geometric shape 1302 is displayed over the instrument's field of view 1304.) Alternatively, where the target-site spatial feature is indicated as a volume, area, or point, the displayed indicia can be arranged in a geometric pattern that indicates the position of a point within the target site.
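The point-to-line projection of FIG. 8 follows from projective geometry: the picked pixel back-projects to a viewing ray, and that ray, imaged by the second view, is the guide line. A sketch under pinhole assumptions (all poses and intrinsics here are illustrative numbers, not calibration data from the patent):

```python
import numpy as np

def backproject(u, v, focal, center, T_ref_from_cam):
    """Return two points of the viewing ray (camera center and a far point),
    expressed in reference coordinates."""
    d = np.array([(u - center[0]) / focal, (v - center[1]) / focal, 1.0])
    c = T_ref_from_cam[:3, 3]
    far = T_ref_from_cam[:3, :3] @ (d * 1000.0) + c   # point 1000 units out
    return c, far

def project(p_ref, focal, center, T_ref_from_cam):
    """Project a reference-frame point into an image."""
    p = np.linalg.inv(T_ref_from_cam) @ np.append(p_ref, 1.0)
    return (center[0] + focal * p[0] / p[2], center[1] + focal * p[1] / p[2])

def guide_line(u, v, cam1, cam2):
    """Two image-2 pixels defining the guide line for a pixel picked in image 1."""
    near, far = backproject(u, v, *cam1)
    return project(near, *cam2), project(far, *cam2)

# Two views roughly 90 degrees apart (cam = (focal, center, pose)):
T1 = np.eye(4)
T2 = np.eye(4)
T2[:3, :3] = np.array([[0., 0., 1.], [0., 1., 0.], [-1., 0., 0.]])
T2[:3, 3] = [-500., 0., 500.]
cam1 = (800.0, (320.0, 240.0), T1)
cam2 = (800.0, (320.0, 240.0), T2)
print(guide_line(320.0, 240.0, cam1, cam2))  # two pixels on the guide line
```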
- Referring again to FIG. 4, in step 406 the user defines a target in another image by moving the cursor to the desired position in that image and double-clicking the mouse. The line 808 projected in the second image 802 can function as a guide for directing the user to the target area in the second image that aligns with the target area selected in the first image. Optionally, the projected spatial feature, e.g., the line 808, can be used to constrain where the target-site spatial feature can be indicated on the second image. Specifically, in some applications it may be desirable to allow the user to select only a point on the line 808 when defining the target in the second image (and any further images). Alternatively, in some applications it may be desirable to perform the indicating step independently for each image. In such instances it may still be desirable to project a line into the other image(s) to aid the user in selecting the target in the other image(s). This is illustrated generally in FIG. 9, which shows first and second images 900 and 902; the target (point) selected in the first image 900 does not align with the target (point) 906 selected in the second image 902.
- After the target is selected in the second image, the coordinates of the point best representing the selected target in the reference coordinate system are computed using the tracking measurements recorded when the X-ray image(s) were generated (step 408).
Steps 406 through 410 can be repeated to allow the user to define the target in additional images. When more than two images are used, step 408 can be accomplished, for example, by minimizing a metric that gives the best match of all of the points selected in the images. Once the user is finished defining the target in the images, control is passed to step 412, where the coordinates of the instrument 18 are determined in the reference coordinate system. Specifically, using tracking measurements recorded by the tracking system, the position and orientation of the instrument 18 in the reference coordinate system are computed by the computer system 22. Next, in step 414 the computer system computes the target position in the field of view of the instrument 18. Specifically, using the now known transformation between the reference and instrument coordinate systems, the coordinates of the selected target in the instrument coordinate system are computed. Next, in step 416 the computer displays the coordinates of the target on the instrument's field of view.
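One common choice for such a minimized metric is the sum of squared distances from a candidate point to all viewing rays, which has a closed-form least-squares solution; the sketch below assumes rays with unit direction vectors and is not taken from the patent:

```python
import numpy as np

def best_point(rays):
    """rays: list of (origin o, unit direction d).
    Solves  sum_i || (I - d_i d_i^T)(p - o_i) ||^2  ->  min  for p."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in rays:
        P = np.eye(3) - np.outer(d, d)   # projector onto plane normal to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two skew rays; the solution lies midway between them:
r1 = (np.array([0., 0., 0.]), np.array([1., 0., 0.]))
r2 = (np.array([0., 1., 1.]), np.array([0., 0., 1.]))
print(best_point([r1, r2]))  # [0.  0.5 0.]
```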
- FIG. 5 is a flow chart illustrating an image guided surgical method according to certain aspects of another embodiment of the invention. In this embodiment, an ultrasound scanner 17 is used to generate the intraoperative image. In step 502 the user generates an ultrasound image of the patient using the ultrasound scanner 17 in the OR. The ultrasound image is transmitted to the computer system 22, e.g., by means of a video cable connecting the ultrasound scanner to the computer system and a video capture device installed in the computer system. Next, in step 504 the user selects the target position in the ultrasound image, e.g., by moving the cursor to the desired location and double-clicking the mouse. Next, in step 506 the 3-D coordinates of the target are determined in the reference coordinate system. Specifically, the tracking system installed in the OR is used to track the position of the ultrasound scanner during the imaging process. The computer system uses the tracking measurements recorded when the ultrasound image was generated to compute the point best representing the selected target in the reference coordinate system. The coordinates of the instrument 18 are then determined in the reference coordinate system: using tracking measurements recorded by the tracking system, the position and orientation of the instrument 18 in the reference coordinate system are computed by the computer system 22. Next, in step 508 the computer system computes the target position in the field of view of the instrument 18. Specifically, using the now known transformation between the reference and instrument coordinate systems, the coordinates of the selected target in the instrument coordinate system are computed. Next, in step 510 the computer displays the coordinates of the target on the instrument's field of view.
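The pixel-to-reference mapping for a tracked ultrasound plane can be sketched as a pixel-spacing scale followed by two rigid transforms (a fixed image-to-probe calibration and the tracked probe pose); the values and names below are illustrative assumptions:

```python
import numpy as np

def ultrasound_pixel_to_ref(u, v, mm_per_px, T_ref_from_probe,
                            T_probe_from_image=np.eye(4)):
    """Map a 2-D ultrasound pixel to 3-D reference coordinates.
    T_probe_from_image is the fixed calibration between the scan plane
    and the probe's tracking body."""
    p_image = np.array([u * mm_per_px, v * mm_per_px, 0.0, 1.0])
    p_ref = T_ref_from_probe @ T_probe_from_image @ p_image
    return p_ref[:3]

# Pixel (100, 200) at 0.2 mm/px with the probe at the identity pose:
print(ultrasound_pixel_to_ref(100, 200, 0.2, np.eye(4)))  # [20. 40.  0.]
```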
- FIG. 6 is a flow chart that further illustrates how the navigation system is used to guide the instrument during a procedure. In step 600, the instrument is equipped with a tracking element 18a so that the instrument can be tracked by the position sensor 20. In step 602 the instrument's position and orientation with respect to the tracking element 18a are computed. In step 604 the current position of the tracking element 18a in the reference coordinate system is measured by means of the position sensor 20. Using the known transformation between the instrument coordinate system and that of the tracking element 18a, the position and orientation of the instrument in the reference coordinate system are computed in step 606. The position of the target in the instrument coordinate system is then computed, using the known transformation between the instrument and reference coordinate systems. In step 608, the computer system 22 generates a display showing the target overlaid on the instrument's field of view. The display is updated according to the relative position of the target in the instrument's field of view in step 610. In step 612, the user guides the instrument by observing the display and moving or rotating the instrument to achieve a desired position of the target in the instrument's field of view. Steps 604 through 612 are continuously repeated to update the display as the user moves the instrument.
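The repeated steps 604 through 612 amount to a polling loop. A schematic version, in which get_tracker_pose, draw_overlay, and keep_running stand in for tracking-system and display calls the patent leaves unspecified, might look like:

```python
import numpy as np

def navigation_loop(p_target_ref, T_instr_from_elem,
                    get_tracker_pose, draw_overlay, keep_running):
    """Continuously recompute the target in instrument coordinates and
    redraw the overlay (schematic of steps 604-612)."""
    p_h = np.append(p_target_ref, 1.0)
    while keep_running():
        T_ref_from_elem = get_tracker_pose()                     # step 604
        T_ref_from_instr = T_ref_from_elem @ np.linalg.inv(T_instr_from_elem)  # step 606
        p_instr = (np.linalg.inv(T_ref_from_instr) @ p_h)[:3]
        draw_overlay(p_instr)                                    # steps 608-610
```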
- FIG. 7 is a flow chart illustrating an image guided surgical method according to certain aspects of another embodiment of the invention. This embodiment provides the ability to define a surgical trajectory in the displayed image. Initially, in step 700 the fluoroscopic device 12 is used to generate two or more X-ray images of the patient 10. The images are transmitted to the computer system 22, for example by means of a cable connecting the fluoroscopic device to the computer system and a video capture device installed in the computer system. In step 704 a first target point is defined in the reference coordinate system by selecting its position in two or more images. The target-defining step 704 can be accomplished in the manner described above in connection with FIG. 4. Next, in step 706 a second target point is defined in the reference coordinate system by selecting its position in two or more images in the manner shown in FIG. 4. Alternatively, the instrument 18 can be used to indicate, on a patient surface region, an entry point that defines the second target point. The trajectory including the two target points in the reference coordinate system is calculated in step 708. Next, in step 710 the coordinates of the instrument 18 are determined in the reference coordinate system. Specifically, using tracking measurements recorded by the tracking system, the position and orientation of the instrument 18 in the reference coordinate system are computed by the computer system 22. Using the known transformation between the instrument and reference coordinate systems, the computer displays the trajectory including the two target points on the instrument's field of view (step 712). The surgical trajectory on the displayed image may, for example, be indicated by two sets of indicia, one set corresponding to the first-indicated spatial feature and the second set corresponding to either the second indicated spatial feature or the indicated entry point. Alternatively, the surgical trajectory on the displayed image may be indicated by a geometric object defined, at its end regions, by the first-indicated spatial feature and the second spatial feature or entry point indicated.
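Computing the trajectory of step 708 reduces to linear interpolation between the two target points in reference coordinates; a brief sketch (illustrative, not the patented implementation):

```python
import numpy as np

def trajectory(p_entry_ref, p_target_ref, n_samples=2):
    """Sample points along the planned path in reference coordinates;
    the endpoints can then be mapped into instrument coordinates and
    rendered as two indicia or one geometric object."""
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    return (1.0 - t) * p_entry_ref + t * p_target_ref

# Entry at the skin surface, target 40 mm deeper along z:
print(trajectory(np.array([0., 0., 0.]), np.array([0., 0., 40.]), 3))
```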
- A variety of display methods can be used to guide the user during navigation. For example, the size or shape of the individual indicia may be used to indicate the orientation of the instrument relative to the target site. This is illustrated in FIG. 12, where the indicia are displayed as four arrows 1202-1208 and a point 1210 is used to represent the target. As the instrument 18 moves relative to the target site in the patient, the sizes of the arrows 1202-1208 change. For example, a larger arrow, such as the down arrow 1202, indicates that the instrument needs to be moved down relative to the target. Similarly, the larger size of the right-pointing arrow 1208 relative to the left-pointing arrow 1204 indicates that the instrument needs to be moved to the right. Alternatively or additionally, the display can be structured such that the size or shape of individual indicia indicates the distance of the instrument from the target site. For example, the size of the arrows could increase or decrease to indicate the relative distance from the target. In such a display, the location of the target on the displayed field of view could be indicative of the relative alignment of the instrument with the target. Specifically, the instrument is aligned with the target when the displayed target, e.g., point 1210, is centered in the displayed field of view. Alternatively or additionally, the spacing between or among indicia may be indicative of the distance of the instrument from the target-site position. This is illustrated in FIGS. 14A and 14B. In this example, the indicia are displayed as four arrows 1402-1408. As the instrument 18 moves closer to the target site in the patient, the arrows 1402-1408 move farther from the display target 1410. Hence, the relative spacing of the arrows 1402-1408 from the target 1410 is used to indicate the relative distance from the target, while the location of the target on the displayed field of view 1412 is indicative of the relative alignment of the instrument with the target. As will be appreciated, a variety of other display methods can be employed without departing from the scope of the present invention.
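One way to realize the arrow cues of FIGS. 12 and 14 is to scale each arrow with the target's lateral offset in instrument coordinates and to spread the arrows as the remaining depth shrinks; the gains below are arbitrary illustrative choices:

```python
import numpy as np

def arrow_cues(p_target_instr, gain=2.0):
    """Return per-direction arrow scales (direction cue) and an arrow
    spacing (distance cue), assuming the instrument views along +z and
    image y increases downward."""
    x, y, z = p_target_instr
    scales = {
        "down":  gain * max(y, 0.0),    # target below center: move down
        "up":    gain * max(-y, 0.0),
        "right": gain * max(x, 0.0),
        "left":  gain * max(-x, 0.0),
    }
    # FIG. 14A/14B behavior: arrows spread away from the drawn target
    # as the remaining distance z shrinks.
    spacing = gain * 100.0 / max(z, 1.0)
    return scales, spacing

print(arrow_cues(np.array([3.0, -1.0, 25.0])))
```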
- While the invention has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (24)
1. A method for assisting a user in guiding a medical instrument to a subsurface target site in a patient, comprising:
generating one or more intraoperative images on which a spatial feature of a patient target site can be indicated;
indicating a spatial feature of the target site on said image(s);
using the spatial feature of the target site indicated on said image(s) to determine 3-D coordinates of the target site spatial feature in a reference coordinate system;
tracking the position of the instrument in the reference coordinate system;
projecting onto a display device, a view field as seen from a known position and, optionally, a known orientation, with respect to the instrument, in the reference coordinate system; and
projecting onto the displayed view field, indicia whose states are related to the indicated spatial feature of the target site with respect to said known position and, optionally, said known orientation;
whereby the user, by observing the states of said indicia, can guide the instrument toward the target site by moving the instrument so that said indicia are placed or held in a given state in the displayed field of view.
2. The method of claim 1 , wherein said generating and indicating include the steps of
generating first and second digitized projection images of the patient target site from first and second positions, respectively; and
indicating the spatial feature of the target site on the first and second digitized projection images.
3. The method of claim 2 , wherein said projection images are x-ray projection images.
4. The method of claim 2 , which further includes, after indicating the spatial feature of the target site on the first image, projecting the target-site spatial feature indicated in the first image onto the second image, and using the spatial feature projected onto the second image to constrain the target-site spatial feature indicated on the second image.
5. The method of claim 4 , wherein the target-site spatial feature indicated on the first image is selected from an area, a line, and a point, and the corresponding spatial feature projected onto the second image is a volume, an area, and a line, respectively.
6. The method of claim 2 , wherein said indicating is carried out independently for both images, and the 3-D coordinates of the target site are determined from the independently indicated spatial features.
7. The method of claim 2 wherein said generating includes moving an x-ray imaging device to a first position, to generate said first image, moving the x-ray imaging device to a second position, to generate said second image, and tracking the position of the imaging device at said first and second positions, in said reference coordinate system.
8. The method of claim 1 , wherein said generating includes using an ultrasonic source to generate an ultrasonic image of the patient, and the 3-D coordinates of a spatial feature indicated on said image are determined from the 2-D coordinates of the spatial feature on the image and the position of the ultrasonic source.
9. The method of claim 1 , wherein said medical instrument is an endoscope and the view field projected onto the display device is the image seen by the endoscope.
10. The method of claim 1 , wherein the view field projected onto the display device is that seen from the tip-end position and orientation of the medical instrument having a defined field of view.
11. The method of claim 1 , wherein the view field projected onto the display device is that seen from a position along the axis of the instrument that is different than the tip-end position of the medical instrument.
12. The method of claim 1 , wherein the target site spatial feature indicated is a volume or area, and said indicia are arranged in a geometric pattern which defines the boundary of the indicated spatial feature.
13. The method of claim 1 , wherein the target site spatial feature indicated is a volume, area or point, and said indicia are arranged in a geometric pattern that indicates the position of a point within the target site.
14. The method of claim 1 , wherein the spacing between or among indicia is indicative of the distance of the instrument from the target-site position.
15. The method of claim 1 , wherein the size or shape of the individual indicia is indicative of the distance of the instrument from the target-site position.
16. The method of claim 1 , wherein the size or shape of individual indicia is indicative of the orientation of said instrument.
17. The method of claim 1 , wherein said indicating includes indicating on each image, a second spatial feature which, together with the first-indicated spatial feature, defines a surgical trajectory on the displayed image.
18. The method of claim 1 , which further includes using said instrument to indicate on a patient surface region, an entry point that defines, with said indicated spatial feature, a surgical trajectory on the displayed image.
19. The method of claims 17 or 18, wherein the surgical trajectory on the displayed image is indicated by two sets of indicia, one set corresponding to the first-indicated spatial feature and the second set corresponding to the second spatial feature or entry point indicated.
20. The method of claims 17 or 18, wherein the surgical trajectory on the displayed image is indicated by a geometric object defined, at its end regions, by the first-indicated spatial feature and the second spatial feature or entry point indicated.
21. A system designed to assist a user in guiding a medical instrument to a target site in a patient, comprising:
(a) an imaging device for generating one or more intraoperative images, on which spatial features of a patient target site can be defined in a 3-dimensional coordinate system;
(b) a tracking system for tracking the position and optionally, the orientation of the medical instrument and imaging device in a reference coordinate system;
(c) an indicator by which a user can indicate a spatial feature of a target site on such image(s);
(d) a display device;
(e) an electronic computer operably connected to said tracking system, display device, and indicator, and
(f) computer-readable code which is operable, when used to control the operation of the computer, to carry out the steps of:
(i) recording target-site spatial information indicated by the user on said image(s), through the use of said indicator,
(ii) determining from the spatial feature of the target site indicated on said image(s), 3-D coordinates of the target-site spatial feature in a reference coordinate system,
(iii) tracking the position of the instrument in the reference coordinate system,
(iv) projecting onto a display device, a view field as seen from a known position and, optionally, a known orientation, with respect to the instrument, in the reference coordinate system, and
(v) projecting onto the displayed view field, indicia whose states indicate the indicated spatial feature of the target site with respect to said known position and, optionally, said known orientation;
whereby the user, by observing the states of said indicia, can guide the instrument toward the target site by moving the instrument so that said indicia are placed or held in a given state in the displayed field of view.
22. The system of claim 21 , wherein said imaging device is an x-ray imaging device capable of generating first and second digitized projection images of the patient target site from first and second positions, respectively, and said tracking device is operable to record the positions of the imaging device at said two positions.
23. The system of claim 21 , wherein said medical instrument is an endoscope and the view field projected onto the display device is the image seen by the endoscope.
24. Machine readable code in a system designed to assist a user in guiding a medical instrument to a target site in a patient, said system including:
(a) an imaging device for generating one or more intraoperative images, on which a patient target site can be defined in a 3-dimensional coordinate system;
(b) a tracking system for tracking the position and optionally, the orientation of the medical instrument and imaging device in a reference coordinate system;
(c) an indicator by which a user can indicate a spatial feature of a target site on such image(s);
(d) a display device, and (e) an electronic computer operably connected to said tracking system, display device, and indicator; and
said code being operable, when used to control the operation of said computer, to carry out the steps of:
(i) recording target-site spatial information indicated by the user on said image(s), through the use of said indicator,
(ii) determining from the spatial feature of the target site indicated on said image(s), 3-D coordinates of the target-site spatial feature in a reference coordinate system,
(iii) tracking the position of the instrument in the reference coordinate system,
(iv) projecting onto a display device, a view field as seen from a known position and, optionally, a known orientation, with respect to the instrument, in the reference coordinate system, and
(v) projecting onto the displayed view field, indicia whose states indicate the indicated spatial feature of the target site with respect to said known position and, optionally, said known orientation;
whereby the user, by observing the states of said indicia, can guide the instrument toward the target site by moving the instrument so that said indicia are placed or held in a given state in the displayed field of view.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/045,013 US20060036162A1 (en) | 2004-02-02 | 2005-01-27 | Method and apparatus for guiding a medical instrument to a subsurface target site in a patient |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US54113104P | 2004-02-02 | 2004-02-02 | |
US11/045,013 US20060036162A1 (en) | 2004-02-02 | 2005-01-27 | Method and apparatus for guiding a medical instrument to a subsurface target site in a patient |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060036162A1 true US20060036162A1 (en) | 2006-02-16 |
Family
ID=35800906
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/045,013 Abandoned US20060036162A1 (en) | 2004-02-02 | 2005-01-27 | Method and apparatus for guiding a medical instrument to a subsurface target site in a patient |
Country Status (1)
Country | Link |
---|---|
US (1) | US20060036162A1 (en) |
Cited By (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050267353A1 (en) * | 2004-02-04 | 2005-12-01 | Joel Marquart | Computer-assisted knee replacement apparatus and method |
US20060084840A1 (en) * | 2004-10-14 | 2006-04-20 | Hoeg Hans D | Endoscopic imaging with indication of gravity direction |
US20060106494A1 (en) * | 2004-10-28 | 2006-05-18 | Accelerated Pictures, Llc | Camera and animation controller, systems and methods |
US20060173293A1 (en) * | 2003-02-04 | 2006-08-03 | Joel Marquart | Method and apparatus for computer assistance with intramedullary nail procedure |
US20070016008A1 (en) * | 2005-06-23 | 2007-01-18 | Ryan Schoenefeld | Selective gesturing input to a surgical navigation system |
US20070038223A1 (en) * | 2003-02-04 | 2007-02-15 | Joel Marquart | Computer-assisted knee replacement apparatus and method |
US20070073137A1 (en) * | 2005-09-15 | 2007-03-29 | Ryan Schoenefeld | Virtual mouse for use in surgical navigation |
US20070073306A1 (en) * | 2004-03-08 | 2007-03-29 | Ryan Lakin | Cutting block for surgical navigation |
US20070167811A1 (en) * | 2004-09-15 | 2007-07-19 | Lemmerhirt David F | Capacitive Micromachined Ultrasonic Transducer |
US20070167812A1 (en) * | 2004-09-15 | 2007-07-19 | Lemmerhirt David F | Capacitive Micromachined Ultrasonic Transducer |
US20080028312A1 (en) * | 2006-07-28 | 2008-01-31 | Accelerated Pictures, Inc. | Scene organization in computer-assisted filmmaking |
US20080024615A1 (en) * | 2006-07-28 | 2008-01-31 | Accelerated Pictures, Inc. | Camera control |
US20080039723A1 (en) * | 2006-05-18 | 2008-02-14 | Suri Jasjit S | System and method for 3-d biopsy |
US20080071292A1 (en) * | 2006-09-20 | 2008-03-20 | Rich Collin A | System and method for displaying the trajectory of an instrument and the position of a body within a volume |
US20080095422A1 (en) * | 2006-10-18 | 2008-04-24 | Suri Jasjit S | Alignment method for registering medical images |
US20080146915A1 (en) * | 2006-10-19 | 2008-06-19 | Mcmorrow Gerald | Systems and methods for visualizing a cannula trajectory |
US20080161687A1 (en) * | 2006-12-29 | 2008-07-03 | Suri Jasjit S | Repeat biopsy system |
US20080159606A1 (en) * | 2006-10-30 | 2008-07-03 | Suri Jasit S | Object Recognition System for Medical Imaging |
US20080240526A1 (en) * | 2007-03-28 | 2008-10-02 | Suri Jasjit S | Object recognition system for medical imaging |
US20080306379A1 (en) * | 2007-06-06 | 2008-12-11 | Olympus Medical Systems Corp. | Medical guiding system |
US20080319491A1 (en) * | 2007-06-19 | 2008-12-25 | Ryan Schoenefeld | Patient-matched surgical component and methods of use |
US20090003528A1 (en) * | 2007-06-19 | 2009-01-01 | Sankaralingam Ramraj | Target location by tracking of imaging device |
US20090118640A1 (en) * | 2007-11-06 | 2009-05-07 | Steven Dean Miller | Biopsy planning and display apparatus |
US20090183740A1 (en) * | 2008-01-21 | 2009-07-23 | Garrett Sheffer | Patella tracking method and apparatus for use in surgical navigation |
US20100045783A1 (en) * | 2001-10-19 | 2010-02-25 | Andrei State | Methods and systems for dynamic virtual convergence and head mountable display using same |
US20100121316A1 (en) * | 2007-04-26 | 2010-05-13 | Koninklijke Philips Electronics N.V. | Risk indication for surgical procedures |
US20100130858A1 (en) * | 2005-10-06 | 2010-05-27 | Osamu Arai | Puncture Treatment Supporting Apparatus |
US7728868B2 (en) | 2006-08-02 | 2010-06-01 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US20100168556A1 (en) * | 2006-03-31 | 2010-07-01 | Koninklijke Philips Electronics N.V. | System for local error compensation in electromagnetic tracking systems |
US20100268067A1 (en) * | 2009-02-17 | 2010-10-21 | Inneroptic Technology Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US7840256B2 (en) | 2005-06-27 | 2010-11-23 | Biomet Manufacturing Corporation | Image guided tracking array and method |
US20110043612A1 (en) * | 2009-07-31 | 2011-02-24 | Inneroptic Technology Inc. | Dual-tube stereoscope |
US20110046483A1 (en) * | 2008-01-24 | 2011-02-24 | Henry Fuchs | Methods, systems, and computer readable media for image guided ablation |
EP2289578A1 (en) * | 2008-06-16 | 2011-03-02 | Nory Co., Ltd. | Syringe needle guiding apparatus |
US20110057930A1 (en) * | 2006-07-26 | 2011-03-10 | Inneroptic Technology Inc. | System and method of using high-speed, high-resolution depth extraction to provide three-dimensional imagery for endoscopy |
US20110082351A1 (en) * | 2009-10-07 | 2011-04-07 | Inneroptic Technology, Inc. | Representing measurement information during a medical procedure |
US20110137156A1 (en) * | 2009-02-17 | 2011-06-09 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US20110151608A1 (en) * | 2004-09-15 | 2011-06-23 | Lemmerhirt David F | Capacitive micromachined ultrasonic transducer and manufacturing method |
US20110184284A1 (en) * | 2010-01-28 | 2011-07-28 | Warsaw Orthopedic, Inc. | Non-invasive devices and methods to diagnose pain generators |
US8165659B2 (en) | 2006-03-22 | 2012-04-24 | Garrett Sheffer | Modeling method and apparatus for use in surgical navigation |
US8175350B2 (en) | 2007-01-15 | 2012-05-08 | Eigen, Inc. | Method for tissue culture extraction |
US20120245458A1 (en) * | 2009-12-09 | 2012-09-27 | Koninklijke Philips Electronics N.V. | Combination of ultrasound and x-ray systems |
US8340379B2 (en) | 2008-03-07 | 2012-12-25 | Inneroptic Technology, Inc. | Systems and methods for displaying guidance data based on updated deformable imaging data |
CN102846337A (en) * | 2011-06-29 | 2013-01-02 | 清华大学 | Three-dimensional ultrasound system, method and device for positioning target point of three-dimensional ultrasound system |
DE102011114146A1 (en) * | 2011-09-23 | 2013-03-28 | Scopis Gmbh | Method for representing e.g. head region of human for controlling operation to remove tumor, involves producing point set in coordinate system, and imaging coordinates of points of set in another coordinate system using determined image |
US20130182901A1 (en) * | 2012-01-16 | 2013-07-18 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium |
US8554307B2 (en) | 2010-04-12 | 2013-10-08 | Inneroptic Technology, Inc. | Image annotation in image-guided medical procedures |
WO2013156893A1 (en) * | 2012-04-19 | 2013-10-24 | Koninklijke Philips N.V. | Guidance tools to manually steer endoscope using pre-operative and intra-operative 3d images |
US8571277B2 (en) | 2007-10-18 | 2013-10-29 | Eigen, Llc | Image interpolation for medical imaging |
US8670816B2 (en) | 2012-01-30 | 2014-03-11 | Inneroptic Technology, Inc. | Multiple medical device guidance |
US8934961B2 (en) | 2007-05-18 | 2015-01-13 | Biomet Manufacturing, Llc | Trackable diagnostic scope apparatus and methods of use |
US20150016704A1 (en) * | 2012-02-03 | 2015-01-15 | Koninklijke Philips N.V. | Imaging apparatus for imaging an object |
US20150278623A1 (en) * | 2014-03-27 | 2015-10-01 | Blue Belt Technologies, Inc. | Systems and methods for preventing wrong-level spinal surgery |
US20150366624A1 (en) * | 2014-06-19 | 2015-12-24 | KB Medical SA | Systems and methods for performing minimally invasive surgery |
US9282947B2 (en) | 2009-12-01 | 2016-03-15 | Inneroptic Technology, Inc. | Imager focusing based on intraoperative data |
US9345552B2 (en) | 2011-09-02 | 2016-05-24 | Stryker Corporation | Method of performing a minimally invasive procedure on a hip joint of a patient to relieve femoral acetabular impingement |
US9375196B2 (en) | 2012-07-12 | 2016-06-28 | Covidien Lp | System and method for detecting critical structures using ultrasound |
US9510771B1 (en) | 2011-10-28 | 2016-12-06 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
DK178899B1 (en) * | 2015-10-09 | 2017-05-08 | 3Dintegrated Aps | A depiction system |
US9675319B1 (en) | 2016-02-17 | 2017-06-13 | Inneroptic Technology, Inc. | Loupe display |
US20170196643A1 (en) * | 2014-07-15 | 2017-07-13 | Koninklijke Philips N.V. | Image ntegration and robotic endoscope control in x-ray suite |
US9848922B2 (en) | 2013-10-09 | 2017-12-26 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
US9883818B2 (en) | 2007-06-19 | 2018-02-06 | Accuray Incorporated | Fiducial localization |
US9901406B2 (en) | 2014-10-02 | 2018-02-27 | Inneroptic Technology, Inc. | Affected region display associated with a medical device |
US9949700B2 (en) | 2015-07-22 | 2018-04-24 | Inneroptic Technology, Inc. | Medical device approaches |
US10188467B2 (en) | 2014-12-12 | 2019-01-29 | Inneroptic Technology, Inc. | Surgical guidance intersection display |
US10278778B2 (en) | 2016-10-27 | 2019-05-07 | Inneroptic Technology, Inc. | Medical device navigation using a virtual 3D space |
US10314559B2 (en) | 2013-03-14 | 2019-06-11 | Inneroptic Technology, Inc. | Medical device guidance |
WO2019168935A1 (en) * | 2018-02-27 | 2019-09-06 | Steven Aaron Ross | Video patient tracking for medical imaging guidance |
US10413272B2 (en) | 2016-03-08 | 2019-09-17 | Covidien Lp | Surgical tool with flex circuit ultrasound sensor |
US10521069B2 (en) * | 2015-10-08 | 2019-12-31 | Samsung Medison Co., Ltd. | Ultrasonic apparatus and method for controlling the same |
US10631838B2 (en) | 2016-05-03 | 2020-04-28 | Covidien Lp | Devices, systems, and methods for locating pressure sensitive critical structures |
US10716544B2 (en) | 2015-10-08 | 2020-07-21 | Zmk Medical Technologies Inc. | System for 3D multi-parametric ultrasound imaging |
CN112155727A (en) * | 2020-08-31 | 2021-01-01 | 上海市第一人民医院 | Surgical navigation systems, methods, devices, and media based on three-dimensional models |
US11020144B2 (en) | 2015-07-21 | 2021-06-01 | 3Dintegrated Aps | Minimally invasive surgery system |
US11033182B2 (en) | 2014-02-21 | 2021-06-15 | 3Dintegrated Aps | Set comprising a surgical instrument |
US11036025B2 (en) * | 2019-03-25 | 2021-06-15 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus and medical observation system |
CN112971993A (en) * | 2016-08-16 | 2021-06-18 | 株式会社高迎科技 | Surgical robot system for positioning operation and control method thereof |
US11147531B2 (en) | 2015-08-12 | 2021-10-19 | Sonetics Ultrasound, Inc. | Method and system for measuring blood pressure using ultrasound by emitting push pulse to a blood vessel |
US11259879B2 (en) | 2017-08-01 | 2022-03-01 | Inneroptic Technology, Inc. | Selective transparency to assist medical device navigation |
US11331120B2 (en) | 2015-07-21 | 2022-05-17 | 3Dintegrated Aps | Cannula assembly kit |
US11436697B2 (en) * | 2019-03-27 | 2022-09-06 | Fujifilm Corporation | Positional information display device, positional information display method, positional information display program, and radiography apparatus |
US11464578B2 (en) | 2009-02-17 | 2022-10-11 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US11484365B2 (en) | 2018-01-23 | 2022-11-01 | Inneroptic Technology, Inc. | Medical image guidance |
US11711596B2 (en) | 2020-01-23 | 2023-07-25 | Covidien Lp | System and methods for determining proximity relative to an anatomical structure |
US11801019B2 (en) * | 2019-03-27 | 2023-10-31 | Fujifilm Corporation | Positional information display device, positional information display method, positional information display program, and radiography apparatus |
US20240081784A1 (en) * | 2019-10-08 | 2024-03-14 | Smith & Nephew, Inc. | Methods for improved ultrasound imaging to emphasize structures of interest and devices thereof |
US12094061B2 (en) | 2020-03-16 | 2024-09-17 | Covidien Lp | System and methods for updating an anatomical 3D model |
Patent Citations (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US30397A (en) * | 1860-10-16 | Window-blind fastener | ||
USRE30397E (en) * | 1976-04-27 | 1980-09-09 | Three-dimensional ultrasonic imaging of animal soft tissue | |
US4583538A (en) * | 1984-05-04 | 1986-04-22 | Onik Gary M | Method and apparatus for stereotaxic placement of probes in the body utilizing CT scanner localization |
US5078140A (en) * | 1986-05-08 | 1992-01-07 | Kwoh Yik S | Imaging device - aided robotic stereotaxis system |
US4770182A (en) * | 1986-11-26 | 1988-09-13 | Fonar Corporation | NMR screening method |
US4945478A (en) * | 1987-11-06 | 1990-07-31 | Center For Innovative Technology | Noninvasive medical imaging system and method for the identification and 3-D display of atherosclerosis and the like |
US5230338A (en) * | 1987-11-10 | 1993-07-27 | Allen George S | Interactive image-guided surgical system for displaying images corresponding to the placement of a surgical tool or the like |
US4977505A (en) * | 1988-05-24 | 1990-12-11 | Arch Development Corporation | Means to correlate images from scans taken at different times including means to determine the minimum distances between a patient anatomical contour and a correlating surface |
US5363475A (en) * | 1988-12-05 | 1994-11-08 | Rediffusion Simulation Limited | Image generator for generating perspective views from data defining a model having opaque and translucent features |
US5222499A (en) * | 1989-11-15 | 1993-06-29 | Allen George S | Method and apparatus for imaging the anatomy |
US5070401A (en) * | 1990-04-09 | 1991-12-03 | Welch Allyn, Inc. | Video measurement system with automatic calibration and distortion correction |
US5891034A (en) * | 1990-10-19 | 1999-04-06 | St. Louis University | System for indicating the position of a surgical probe within a head on an image of the head |
US5622170A (en) * | 1990-10-19 | 1997-04-22 | Image Guided Technologies, Inc. | Apparatus for determining the position and orientation of an invasive portion of a probe inside a three-dimensional body |
US5419320A (en) * | 1990-10-26 | 1995-05-30 | Hitachi, Ltd. | Method and apparatus for obtaining an image indicating metabolism in a body |
US5313306A (en) * | 1991-05-13 | 1994-05-17 | Telerobotics International, Inc. | Omniview motionless camera endoscopy system |
US5261404A (en) * | 1991-07-08 | 1993-11-16 | Mick Peter R | Three-dimensional mammal anatomy imaging system and method |
US5608849A (en) * | 1991-08-27 | 1997-03-04 | King, Jr.; Donald | Method of visual guidance for positioning images or data in three-dimensional space |
US5299253A (en) * | 1992-04-10 | 1994-03-29 | Akzo N.V. | Alignment system to overlay abdominal computer aided tomography and magnetic resonance anatomy with single photon emission tomography |
US5836954A (en) * | 1992-04-21 | 1998-11-17 | University Of Utah Research Foundation | Apparatus and method for photogrammetric surgical localization |
US5389101A (en) * | 1992-04-21 | 1995-02-14 | University Of Utah | Apparatus and method for photogrammetric surgical localization |
US5417210A (en) * | 1992-05-27 | 1995-05-23 | International Business Machines Corporation | System and method for augmentation of endoscopic surgery |
US5572999A (en) * | 1992-05-27 | 1996-11-12 | International Business Machines Corporation | Robotic system for positioning a surgical instrument relative to a patient's body |
US5704897A (en) * | 1992-07-31 | 1998-01-06 | Truppe; Michael J. | Apparatus and method for registration of points of a data field with respective points of an optical image |
US5337732A (en) * | 1992-09-16 | 1994-08-16 | Cedars-Sinai Medical Center | Robotic endoscopy |
US5585813A (en) * | 1992-10-05 | 1996-12-17 | Rockwell International Corporation | All aspect head aiming display |
US5562095A (en) * | 1992-12-24 | 1996-10-08 | Victoria Hospital Corporation | Three dimensional ultrasound imaging system |
US5671381A (en) * | 1993-03-23 | 1997-09-23 | Silicon Graphics, Inc. | Method and apparatus for displaying data within a three-dimensional information landscape |
US5740802A (en) * | 1993-04-20 | 1998-04-21 | General Electric Company | Computer graphic and live video system for enhancing visualization of body structures during surgery |
US5540229A (en) * | 1993-09-29 | 1996-07-30 | U.S. Philips Corporation | System and method for viewing three-dimensional echographic data |
US5833608A (en) * | 1993-10-06 | 1998-11-10 | Biosense, Inc. | Magnetic determination of position and orientation |
US5548807A (en) * | 1993-10-07 | 1996-08-20 | Nec Corporation | Mobile communication system comprising base stations each having omnidirectional antenna for reception of interference wave |
US5815126A (en) * | 1993-10-22 | 1998-09-29 | Kopin Corporation | Monocular portable communication and display system |
US5454371A (en) * | 1993-11-29 | 1995-10-03 | London Health Association | Method and system for constructing and displaying three-dimensional images |
US5842473A (en) * | 1993-11-29 | 1998-12-01 | Life Imaging Systems | Three-dimensional imaging system |
US5491510A (en) * | 1993-12-03 | 1996-02-13 | Texas Instruments Incorporated | System and method for simultaneously viewing a scene and an obscured object |
US5458126A (en) * | 1994-02-24 | 1995-10-17 | General Electric Company | Cardiac functional analysis system employing gradient image segmentation |
US5604848A (en) * | 1994-03-18 | 1997-02-18 | Fujitsu Limited | Viewpoint setting apparatus for a computer graphics system for displaying three-dimensional models |
US5531520A (en) * | 1994-09-01 | 1996-07-02 | Massachusetts Institute Of Technology | System and method of registration of three-dimensional data sets including anatomical body data |
US5800352A (en) * | 1994-09-15 | 1998-09-01 | Visualization Technology, Inc. | Registration system for use with position tracking and imaging system for use in medical applications |
US6272366B1 (en) * | 1994-10-27 | 2001-08-07 | Wake Forest University | Method and system for producing interactive three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen |
US5611025A (en) * | 1994-11-23 | 1997-03-11 | General Electric Company | Virtual internal cavity inspection system |
US5546807A (en) * | 1994-12-02 | 1996-08-20 | Oxaal; John T. | High speed volumetric ultrasound imaging system |
US5855553A (en) * | 1995-02-16 | 1999-01-05 | Hitachi, Ltd. | Remote surgery support system and method thereof |
US5797849A (en) * | 1995-03-28 | 1998-08-25 | Sonometrics Corporation | Method for carrying out a medical procedure using a three-dimensional tracking and imaging system |
US5868673A (en) * | 1995-03-28 | 1999-02-09 | Sonometrics Corporation | System for carrying out surgery, biopsy and ablation of a tumor or other physical anomaly |
US5882206A (en) * | 1995-03-29 | 1999-03-16 | Gillio; Robert G. | Virtual surgery system |
US5531277A (en) * | 1995-03-30 | 1996-07-02 | Deere & Company | Bent wing sweep |
US5833627A (en) * | 1995-04-13 | 1998-11-10 | United States Surgical Corporation | Image-guided biopsy apparatus and methods of use |
US5887121A (en) * | 1995-04-21 | 1999-03-23 | International Business Machines Corporation | Method of constrained Cartesian control of robotic mechanisms with active and passive joints |
US5892538A (en) * | 1995-06-30 | 1999-04-06 | Ericsson Inc. | True three-dimensional imaging and display system |
US5776050A (en) * | 1995-07-24 | 1998-07-07 | Medical Media Systems | Anatomical visualization system |
US5772594A (en) * | 1995-10-17 | 1998-06-30 | Barrick; Earl F. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
US5871018A (en) * | 1995-12-26 | 1999-02-16 | Delp; Scott L. | Computer-assisted surgical method |
US5682886A (en) * | 1995-12-26 | 1997-11-04 | Musculographics Inc | Computer-assisted surgical system |
US5781195A (en) * | 1996-04-16 | 1998-07-14 | Microsoft Corporation | Method and system for rendering two-dimensional views of a three-dimensional surface |
US6167296A (en) * | 1996-06-28 | 2000-12-26 | The Board Of Trustees Of The Leland Stanford Junior University | Method for volumetric image navigation |
US6016439A (en) * | 1996-10-15 | 2000-01-18 | Biosense, Inc. | Method and apparatus for synthetic viewpoint imaging |
US20030073901A1 (en) * | 1999-03-23 | 2003-04-17 | Simon David A. | Navigational guidance via computer-assisted fluoroscopic imaging |
US6895268B1 (en) * | 1999-06-28 | 2005-05-17 | Siemens Aktiengesellschaft | Medical workstation, imaging system, and method for mixing two images |
US20040097806A1 (en) * | 2002-11-19 | 2004-05-20 | Mark Hunter | Navigation system for cardiac therapies |
US20040171924A1 (en) * | 2003-01-30 | 2004-09-02 | Mire David A. | Method and apparatus for preplanning a surgical procedure |
US20050085718A1 (en) * | 2003-10-21 | 2005-04-21 | Ramin Shahidi | Systems and methods for intraoperative targetting |
US20050085717A1 (en) * | 2003-10-21 | 2005-04-21 | Ramin Shahidi | Systems and methods for intraoperative targetting |
Cited By (164)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100045783A1 (en) * | 2001-10-19 | 2010-02-25 | Andrei State | Methods and systems for dynamic virtual convergence and head mountable display using same |
US20070038223A1 (en) * | 2003-02-04 | 2007-02-15 | Joel Marquart | Computer-assisted knee replacement apparatus and method |
US20060173293A1 (en) * | 2003-02-04 | 2006-08-03 | Joel Marquart | Method and apparatus for computer assistance with intramedullary nail procedure |
US20060241416A1 (en) * | 2003-02-04 | 2006-10-26 | Joel Marquart | Method and apparatus for computer assistance with intramedullary nail procedure |
US20050267353A1 (en) * | 2004-02-04 | 2005-12-01 | Joel Marquart | Computer-assisted knee replacement apparatus and method |
US20070073306A1 (en) * | 2004-03-08 | 2007-03-29 | Ryan Lakin | Cutting block for surgical navigation |
US20070167811A1 (en) * | 2004-09-15 | 2007-07-19 | Lemmerhirt David F | Capacitive Micromachined Ultrasonic Transducer |
US8309428B2 (en) | 2004-09-15 | 2012-11-13 | Sonetics Ultrasound, Inc. | Capacitive micromachined ultrasonic transducer |
US8658453B2 (en) | 2004-09-15 | 2014-02-25 | Sonetics Ultrasound, Inc. | Capacitive micromachined ultrasonic transducer |
US8399278B2 (en) | 2004-09-15 | 2013-03-19 | Sonetics Ultrasound, Inc. | Capacitive micromachined ultrasonic transducer and manufacturing method |
US20070167812A1 (en) * | 2004-09-15 | 2007-07-19 | Lemmerhirt David F | Capacitive Micromachined Ultrasonic Transducer |
US20110151608A1 (en) * | 2004-09-15 | 2011-06-23 | Lemmerhirt David F | Capacitive micromachined ultrasonic transducer and manufacturing method |
US7517314B2 (en) * | 2004-10-14 | 2009-04-14 | Karl Storz Development Corp. | Endoscopic imaging with indication of gravity direction |
US20060084840A1 (en) * | 2004-10-14 | 2006-04-20 | Hoeg Hans D | Endoscopic imaging with indication of gravity direction |
US7433760B2 (en) | 2004-10-28 | 2008-10-07 | Accelerated Pictures, Inc. | Camera and animation controller, systems and methods |
WO2006050197A3 (en) * | 2004-10-28 | 2007-12-21 | Accelerated Pictures Llc | Camera and animation controller, systems and methods |
US20060106494A1 (en) * | 2004-10-28 | 2006-05-18 | Accelerated Pictures, Llc | Camera and animation controller, systems and methods |
US20060109274A1 (en) * | 2004-10-28 | 2006-05-25 | Accelerated Pictures, Llc | Client/server-based animation software, systems and methods |
US20070016008A1 (en) * | 2005-06-23 | 2007-01-18 | Ryan Schoenefeld | Selective gesturing input to a surgical navigation system |
US7840256B2 (en) | 2005-06-27 | 2010-11-23 | Biomet Manufacturing Corporation | Image guided tracking array and method |
US20070073137A1 (en) * | 2005-09-15 | 2007-03-29 | Ryan Schoenefeld | Virtual mouse for use in surgical navigation |
US7643862B2 (en) | 2005-09-15 | 2010-01-05 | Biomet Manufacturing Corporation | Virtual mouse for use in surgical navigation |
US20100130858A1 (en) * | 2005-10-06 | 2010-05-27 | Osamu Arai | Puncture Treatment Supporting Apparatus |
US8165659B2 (en) | 2006-03-22 | 2012-04-24 | Garrett Sheffer | Modeling method and apparatus for use in surgical navigation |
US9733336B2 (en) * | 2006-03-31 | 2017-08-15 | Koninklijke Philips N.V. | System for local error compensation in electromagnetic tracking systems |
US20100168556A1 (en) * | 2006-03-31 | 2010-07-01 | Koninklijke Philips Electronics N.V. | System for local error compensation in electromagnetic tracking systems |
US8425418B2 (en) | 2006-05-18 | 2013-04-23 | Eigen, Llc | Method of ultrasonic imaging and biopsy of the prostate |
US20080039723A1 (en) * | 2006-05-18 | 2008-02-14 | Suri Jasjit S | System and method for 3-d biopsy |
US20110057930A1 (en) * | 2006-07-26 | 2011-03-10 | Inneroptic Technology Inc. | System and method of using high-speed, high-resolution depth extraction to provide three-dimensional imagery for endoscopy |
US20080024615A1 (en) * | 2006-07-28 | 2008-01-31 | Accelerated Pictures, Inc. | Camera control |
US7880770B2 (en) | 2006-07-28 | 2011-02-01 | Accelerated Pictures, Inc. | Camera control |
US20080028312A1 (en) * | 2006-07-28 | 2008-01-31 | Accelerated Pictures, Inc. | Scene organization in computer-assisted filmmaking |
US8350902B2 (en) | 2006-08-02 | 2013-01-08 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US10127629B2 (en) | 2006-08-02 | 2018-11-13 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US10733700B2 (en) | 2006-08-02 | 2020-08-04 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US20100198045A1 (en) * | 2006-08-02 | 2010-08-05 | Inneroptic Technology Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US8482606B2 (en) | 2006-08-02 | 2013-07-09 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US7728868B2 (en) | 2006-08-02 | 2010-06-01 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US9659345B2 (en) | 2006-08-02 | 2017-05-23 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US11481868B2 (en) | 2006-08-02 | 2022-10-25 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US20080071292A1 (en) * | 2006-09-20 | 2008-03-20 | Rich Collin A | System and method for displaying the trajectory of an instrument and the position of a body within a volume |
US8064664B2 (en) | 2006-10-18 | 2011-11-22 | Eigen, Inc. | Alignment method for registering medical images |
US20080095422A1 (en) * | 2006-10-18 | 2008-04-24 | Suri Jasjit S | Alignment method for registering medical images |
US20080146915A1 (en) * | 2006-10-19 | 2008-06-19 | Mcmorrow Gerald | Systems and methods for visualizing a cannula trajectory |
US20080159606A1 (en) * | 2006-10-30 | 2008-07-03 | Suri Jasjit S | Object Recognition System for Medical Imaging |
US7804989B2 (en) | 2006-10-30 | 2010-09-28 | Eigen, Inc. | Object recognition system for medical imaging |
US20080161687A1 (en) * | 2006-12-29 | 2008-07-03 | Suri Jasjit S | Repeat biopsy system |
US8175350B2 (en) | 2007-01-15 | 2012-05-08 | Eigen, Inc. | Method for tissue culture extraction |
US7856130B2 (en) | 2007-03-28 | 2010-12-21 | Eigen, Inc. | Object recognition system for medical imaging |
US20080240526A1 (en) * | 2007-03-28 | 2008-10-02 | Suri Jasjit S | Object recognition system for medical imaging |
US20100121316A1 (en) * | 2007-04-26 | 2010-05-13 | Koninklijke Philips Electronics N.V. | Risk indication for surgical procedures |
US10111726B2 (en) | 2007-04-26 | 2018-10-30 | Koninklijke Philips N.V. | Risk indication for surgical procedures |
US8934961B2 (en) | 2007-05-18 | 2015-01-13 | Biomet Manufacturing, Llc | Trackable diagnostic scope apparatus and methods of use |
US8204576B2 (en) * | 2007-06-06 | 2012-06-19 | Olympus Medical Systems Corp. | Medical guiding system |
US20080306379A1 (en) * | 2007-06-06 | 2008-12-11 | Olympus Medical Systems Corp. | Medical guiding system |
US10136950B2 (en) | 2007-06-19 | 2018-11-27 | Biomet Manufacturing, Llc | Patient-matched surgical component and methods of use |
US11331000B2 (en) | 2007-06-19 | 2022-05-17 | Accuray Incorporated | Treatment couch with localization array |
US9883818B2 (en) | 2007-06-19 | 2018-02-06 | Accuray Incorporated | Fiducial localization |
US20080319491A1 (en) * | 2007-06-19 | 2008-12-25 | Ryan Schoenefeld | Patient-matched surgical component and methods of use |
US10786307B2 (en) | 2007-06-19 | 2020-09-29 | Biomet Manufacturing, Llc | Patient-matched surgical component and methods of use |
US9289268B2 (en) | 2007-06-19 | 2016-03-22 | Accuray Incorporated | Target location by tracking of imaging device |
US11304620B2 (en) | 2007-06-19 | 2022-04-19 | Accuray Incorporated | Localization array position in treatment room coordinate system |
US20090003528A1 (en) * | 2007-06-19 | 2009-01-01 | Sankaralingam Ramraj | Target location by tracking of imaging device |
US9775625B2 (en) | 2007-06-19 | 2017-10-03 | Biomet Manufacturing, Llc | Patient-matched surgical component and methods of use |
US8571277B2 (en) | 2007-10-18 | 2013-10-29 | Eigen, Llc | Image interpolation for medical imaging |
US20120087557A1 (en) * | 2007-11-06 | 2012-04-12 | Eigen, Inc. | Biopsy planning and display apparatus |
US7942829B2 (en) | 2007-11-06 | 2011-05-17 | Eigen, Inc. | Biopsy planning and display apparatus |
US20090118640A1 (en) * | 2007-11-06 | 2009-05-07 | Steven Dean Miller | Biopsy planning and display apparatus |
US8571637B2 (en) | 2008-01-21 | 2013-10-29 | Biomet Manufacturing, Llc | Patella tracking method and apparatus for use in surgical navigation |
US20090183740A1 (en) * | 2008-01-21 | 2009-07-23 | Garrett Sheffer | Patella tracking method and apparatus for use in surgical navigation |
US20110046483A1 (en) * | 2008-01-24 | 2011-02-24 | Henry Fuchs | Methods, systems, and computer readable media for image guided ablation |
US9265572B2 (en) | 2008-01-24 | 2016-02-23 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for image guided ablation |
US8831310B2 (en) | 2008-03-07 | 2014-09-09 | Inneroptic Technology, Inc. | Systems and methods for displaying guidance data based on updated deformable imaging data |
US8340379B2 (en) | 2008-03-07 | 2012-12-25 | Inneroptic Technology, Inc. | Systems and methods for displaying guidance data based on updated deformable imaging data |
EP2289578A4 (en) * | 2008-06-16 | 2011-06-01 | Nory Co Ltd | Syringe needle guiding apparatus |
EP2289578A1 (en) * | 2008-06-16 | 2011-03-02 | Nory Co., Ltd. | Syringe needle guiding apparatus |
US10398513B2 (en) | 2009-02-17 | 2019-09-03 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US20110137156A1 (en) * | 2009-02-17 | 2011-06-09 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US10136951B2 (en) | 2009-02-17 | 2018-11-27 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US8585598B2 (en) | 2009-02-17 | 2013-11-19 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US8641621B2 (en) | 2009-02-17 | 2014-02-04 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US8690776B2 (en) | 2009-02-17 | 2014-04-08 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US11464578B2 (en) | 2009-02-17 | 2022-10-11 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US20100268067A1 (en) * | 2009-02-17 | 2010-10-21 | Inneroptic Technology Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US9398936B2 (en) | 2009-02-17 | 2016-07-26 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US9364294B2 (en) | 2009-02-17 | 2016-06-14 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US11464575B2 (en) | 2009-02-17 | 2022-10-11 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US20110043612A1 (en) * | 2009-07-31 | 2011-02-24 | Inneroptic Technology Inc. | Dual-tube stereoscope |
US20110082351A1 (en) * | 2009-10-07 | 2011-04-07 | Inneroptic Technology, Inc. | Representing measurement information during a medical procedure |
US9282947B2 (en) | 2009-12-01 | 2016-03-15 | Inneroptic Technology, Inc. | Imager focusing based on intraoperative data |
US10238361B2 (en) * | 2009-12-09 | 2019-03-26 | Koninklijke Philips N.V. | Combination of ultrasound and x-ray systems |
US20120245458A1 (en) * | 2009-12-09 | 2012-09-27 | Koninklijke Philips Electronics N.V. | Combination of ultrasound and x-ray systems |
US20110184284A1 (en) * | 2010-01-28 | 2011-07-28 | Warsaw Orthopedic, Inc. | Non-invasive devices and methods to diagnose pain generators |
US9107698B2 (en) | 2010-04-12 | 2015-08-18 | Inneroptic Technology, Inc. | Image annotation in image-guided medical procedures |
US8554307B2 (en) | 2010-04-12 | 2013-10-08 | Inneroptic Technology, Inc. | Image annotation in image-guided medical procedures |
CN102846337A (en) * | 2011-06-29 | 2013-01-02 | 清华大学 | Three-dimensional ultrasound system, and method and device for positioning a target point of the three-dimensional ultrasound system |
US9707043B2 (en) | 2011-09-02 | 2017-07-18 | Stryker Corporation | Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing |
US11135014B2 (en) | 2011-09-02 | 2021-10-05 | Stryker Corporation | Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing |
US11896314B2 (en) | 2011-09-02 | 2024-02-13 | Stryker Corporation | Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing |
US12279830B2 (en) | 2011-09-02 | 2025-04-22 | Stryker Corporation | Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing |
US10813697B2 (en) | 2011-09-02 | 2020-10-27 | Stryker Corporation | Methods of preparing tissue of a patient to receive an implant |
US9622823B2 (en) | 2011-09-02 | 2017-04-18 | Stryker Corporation | Method for repairing focal defects in tissue of a patient |
US9345552B2 (en) | 2011-09-02 | 2016-05-24 | Stryker Corporation | Method of performing a minimally invasive procedure on a hip joint of a patient to relieve femoral acetabular impingement |
DE102011114146A1 (en) * | 2011-09-23 | 2013-03-28 | Scopis Gmbh | Method for representing, e.g., the head region of a human for planning a tumor-removal operation, involving producing a point set in one coordinate system and mapping the coordinates of the set's points into another coordinate system using a determined image |
USRE49094E1 (en) | 2011-10-28 | 2022-06-07 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
US9510771B1 (en) | 2011-10-28 | 2016-12-06 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
US20130182901A1 (en) * | 2012-01-16 | 2013-07-18 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium |
US9058647B2 (en) * | 2012-01-16 | 2015-06-16 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium |
US8670816B2 (en) | 2012-01-30 | 2014-03-11 | Inneroptic Technology, Inc. | Multiple medical device guidance |
US9684972B2 (en) * | 2012-02-03 | 2017-06-20 | Koninklijke Philips N.V. | Imaging apparatus for imaging an object |
US20150016704A1 (en) * | 2012-02-03 | 2015-01-15 | Koninklijke Philips N.V. | Imaging apparatus for imaging an object |
WO2013156893A1 (en) * | 2012-04-19 | 2013-10-24 | Koninklijke Philips N.V. | Guidance tools to manually steer endoscope using pre-operative and intra-operative 3d images |
CN104244800A (en) * | 2012-04-19 | 2014-12-24 | 皇家飞利浦有限公司 | Guidance tools to manually steer endoscope using pre-operative and intra-operative 3D images |
JP2015514492A (en) * | 2012-04-19 | 2015-05-21 | コーニンクレッカ フィリップス エヌ ヴェ | Guidance tool to manually operate the endoscope using pre- and intra-operative 3D images |
US11452464B2 (en) | 2012-04-19 | 2022-09-27 | Koninklijke Philips N.V. | Guidance tools to manually steer endoscope using pre-operative and intra-operative 3D images |
US9375196B2 (en) | 2012-07-12 | 2016-06-28 | Covidien Lp | System and method for detecting critical structures using ultrasound |
US9730672B2 (en) | 2012-07-12 | 2017-08-15 | Covidien Lp | System and method for detecting critical structures using ultrasound |
US10314559B2 (en) | 2013-03-14 | 2019-06-11 | Inneroptic Technology, Inc. | Medical device guidance |
US9848922B2 (en) | 2013-10-09 | 2017-12-26 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
US11033182B2 (en) | 2014-02-21 | 2021-06-15 | 3Dintegrated Aps | Set comprising a surgical instrument |
US12075981B2 (en) | 2014-02-21 | 2024-09-03 | Cilag Gmbh International | Set comprising a surgical instrument |
US20150278623A1 (en) * | 2014-03-27 | 2015-10-01 | Blue Belt Technologies, Inc. | Systems and methods for preventing wrong-level spinal surgery |
US20150366624A1 (en) * | 2014-06-19 | 2015-12-24 | KB Medical SA | Systems and methods for performing minimally invasive surgery |
CN106999248A (en) * | 2014-06-19 | 2017-08-01 | Kb医疗公司 | System and method for performing micro-wound surgical operation |
US10828120B2 (en) * | 2014-06-19 | 2020-11-10 | Kb Medical, Sa | Systems and methods for performing minimally invasive surgery |
US10702346B2 (en) * | 2014-07-15 | 2020-07-07 | Koninklijke Philips N.V. | Image integration and robotic endoscope control in X-ray suite |
US20170196643A1 (en) * | 2014-07-15 | 2017-07-13 | Koninklijke Philips N.V. | Image integration and robotic endoscope control in x-ray suite |
US11684429B2 (en) | 2014-10-02 | 2023-06-27 | Inneroptic Technology, Inc. | Affected region display associated with a medical device |
US12262960B2 (en) | 2014-10-02 | 2025-04-01 | Inneroptic Technology, Inc. | Affected region display associated with a medical device |
US9901406B2 (en) | 2014-10-02 | 2018-02-27 | Inneroptic Technology, Inc. | Affected region display associated with a medical device |
US10820944B2 (en) | 2014-10-02 | 2020-11-03 | Inneroptic Technology, Inc. | Affected region display based on a variance parameter associated with a medical device |
US11534245B2 (en) | 2014-12-12 | 2022-12-27 | Inneroptic Technology, Inc. | Surgical guidance intersection display |
US11931117B2 (en) | 2014-12-12 | 2024-03-19 | Inneroptic Technology, Inc. | Surgical guidance intersection display |
US10820946B2 (en) | 2014-12-12 | 2020-11-03 | Inneroptic Technology, Inc. | Surgical guidance intersection display |
US10188467B2 (en) | 2014-12-12 | 2019-01-29 | Inneroptic Technology, Inc. | Surgical guidance intersection display |
US11020144B2 (en) | 2015-07-21 | 2021-06-01 | 3Dintegrated Aps | Minimally invasive surgery system |
US11331120B2 (en) | 2015-07-21 | 2022-05-17 | 3Dintegrated Aps | Cannula assembly kit |
US9949700B2 (en) | 2015-07-22 | 2018-04-24 | Inneroptic Technology, Inc. | Medical device approaches |
US11103200B2 (en) | 2015-07-22 | 2021-08-31 | Inneroptic Technology, Inc. | Medical device approaches |
US11147531B2 (en) | 2015-08-12 | 2021-10-19 | Sonetics Ultrasound, Inc. | Method and system for measuring blood pressure using ultrasound by emitting push pulse to a blood vessel |
US10716544B2 (en) | 2015-10-08 | 2020-07-21 | Zmk Medical Technologies Inc. | System for 3D multi-parametric ultrasound imaging |
US10521069B2 (en) * | 2015-10-08 | 2019-12-31 | Samsung Medison Co., Ltd. | Ultrasonic apparatus and method for controlling the same |
DK178899B1 (en) * | 2015-10-09 | 2017-05-08 | 3Dintegrated Aps | A depiction system |
US11039734B2 (en) | 2015-10-09 | 2021-06-22 | 3Dintegrated Aps | Real time correlated depiction system of surgical tool |
US10433814B2 (en) | 2016-02-17 | 2019-10-08 | Inneroptic Technology, Inc. | Loupe display |
US11179136B2 (en) | 2016-02-17 | 2021-11-23 | Inneroptic Technology, Inc. | Loupe display |
US9675319B1 (en) | 2016-02-17 | 2017-06-13 | Inneroptic Technology, Inc. | Loupe display |
US11484285B2 (en) | 2016-03-08 | 2022-11-01 | Covidien Lp | Surgical tool with flex circuit ultrasound sensor |
US10413272B2 (en) | 2016-03-08 | 2019-09-17 | Covidien Lp | Surgical tool with flex circuit ultrasound sensor |
US10631838B2 (en) | 2016-05-03 | 2020-04-28 | Covidien Lp | Devices, systems, and methods for locating pressure sensitive critical structures |
CN112971993A (en) * | 2016-08-16 | 2021-06-18 | 株式会社高迎科技 | Surgical robot system for positioning operation and control method thereof |
US11369439B2 (en) | 2016-10-27 | 2022-06-28 | Inneroptic Technology, Inc. | Medical device navigation using a virtual 3D space |
US10278778B2 (en) | 2016-10-27 | 2019-05-07 | Inneroptic Technology, Inc. | Medical device navigation using a virtual 3D space |
US10772686B2 (en) | 2016-10-27 | 2020-09-15 | Inneroptic Technology, Inc. | Medical device navigation using a virtual 3D space |
US11259879B2 (en) | 2017-08-01 | 2022-03-01 | Inneroptic Technology, Inc. | Selective transparency to assist medical device navigation |
US11484365B2 (en) | 2018-01-23 | 2022-11-01 | Inneroptic Technology, Inc. | Medical image guidance |
WO2019168935A1 (en) * | 2018-02-27 | 2019-09-06 | Steven Aaron Ross | Video patient tracking for medical imaging guidance |
US11036025B2 (en) * | 2019-03-25 | 2021-06-15 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus and medical observation system |
US11801019B2 (en) * | 2019-03-27 | 2023-10-31 | Fujifilm Corporation | Positional information display device, positional information display method, positional information display program, and radiography apparatus |
US11436697B2 (en) * | 2019-03-27 | 2022-09-06 | Fujifilm Corporation | Positional information display device, positional information display method, positional information display program, and radiography apparatus |
US20240081784A1 (en) * | 2019-10-08 | 2024-03-14 | Smith & Nephew, Inc. | Methods for improved ultrasound imaging to emphasize structures of interest and devices thereof |
US11711596B2 (en) | 2020-01-23 | 2023-07-25 | Covidien Lp | System and methods for determining proximity relative to an anatomical structure |
US12094061B2 (en) | 2020-03-16 | 2024-09-17 | Covidien Lp | System and methods for updating an anatomical 3D model |
CN112155727A (en) * | 2020-08-31 | 2021-01-01 | 上海市第一人民医院 | Surgical navigation systems, methods, devices, and media based on three-dimensional models |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20060036162A1 (en) | Method and apparatus for guiding a medical instrument to a subsurface target site in a patient | |
US6379302B1 (en) | Navigation information overlay onto ultrasound imagery | |
US6442417B1 (en) | Method and apparatus for transforming view orientations in image-guided surgery | |
EP2769689B1 (en) | Computer-implemented technique for calculating a position of a surgical device | |
EP2153794B1 (en) | System for and method of visualizing an interior of a body | |
EP1103229B1 (en) | System and method for use with imaging devices to facilitate planning of interventional procedures | |
JP5662638B2 (en) | System and method of alignment between fluoroscope and computed tomography for paranasal sinus navigation | |
US20080123910A1 (en) | Method and system for providing accuracy evaluation of image guided surgery | |
US9782147B2 (en) | Apparatus and methods for localization and relative positioning of a surgical instrument | |
US7853305B2 (en) | Trajectory storage apparatus and method for surgical navigation systems | |
US8108072B2 (en) | Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information | |
US5823958A (en) | System and method for displaying a structural data image in real-time correlation with moveable body | |
US8147503B2 (en) | Methods of locating and tracking robotic instruments in robotic surgical systems | |
US8248413B2 (en) | Visual navigation system for endoscopic surgery | |
US5930329A (en) | Apparatus and method for detection and localization of a biopsy needle or similar surgical tool in a radiographic image | |
US20010037064A1 (en) | Method and apparatuses for maintaining a trajectory in sterotaxi for tracking a target inside a body | |
CN100591282C (en) | System for guiding a medical device inside a patient | |
US20020077543A1 (en) | Method and apparatus for tracking a medical instrument based on image registration | |
US20090088634A1 (en) | Tool tracking systems and methods for image guided surgery | |
US20080071143A1 (en) | Multi-dimensional navigation of endoscopic video | |
JP2008126075A (en) | System and method for visual verification of CT registration and feedback |
WO2001012057A1 (en) | Method and system for displaying cross-sectional images of a body | |
US20240144497A1 (en) | 3D Spatial Mapping in a 3D Coordinate System of an AR Headset Using 2D Images | |
WO2008035271A2 (en) | Device for registering a 3d model | |
US20240386682A1 (en) | 3D Alignment in a 3D Coordinate System of an AR Headset Using 2D Reference Images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GENERAL ELECTRIC COMPANY, NEW YORK | Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WEST, JAY;REEL/FRAME:016846/0200 | Effective date: 20050422 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |