US12137989B2 - Systems and methods for intelligent ultrasound probe guidance - Google Patents

Systems and methods for intelligent ultrasound probe guidance

Info

Publication number
US12137989B2
US12137989B2 (application No. US 17/861,031)
Authority
US
United States
Prior art keywords
ultrasound
ultrasound probe
optical fiber
needle
probe
Prior art date
Legal status
Active
Application number
US17/861,031
Other versions
US20240008929A1
Inventor
Anthony K. Misener
Steffan Sowards
William Robert McLaughlin
Current Assignee
Bard Access Systems Inc
Original Assignee
Bard Access Systems Inc
Priority date
Filing date
Publication date
Application filed by Bard Access Systems Inc filed Critical Bard Access Systems Inc
Priority to US 17/861,031 (US12137989B2)
Priority to CN 202321786916.5U (CN220655593U)
Priority to CN 202310834727.9A (CN117357158A)
Priority to EP 23758430.5A (EP4543303A1)
Priority to PCT/US2023/027147 (WO2024010940A1)
Publication of US20240008929A1
Assigned to BARD ACCESS SYSTEMS, INC. (assignment of assignors' interest). Assignors: MCLAUGHLIN, William Robert; MISENER, Anthony K.; SOWARDS, Steffan
Application granted
Publication of US12137989B2
Legal status: Active

Classifications

    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g., for frameless stereotaxis
    • A61B 17/3403: Needle locating or guiding means
    • A61B 5/065: Determining position of the probe employing exclusively positioning means located on or in the probe, e.g., using position sensors arranged on the probe
    • A61B 8/0841: Clinical applications involving detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B 8/085: Clinical applications involving detecting or locating foreign bodies or organic structures, for locating body or organic structures, e.g., tumours, calculi, blood vessels, nodules
    • A61B 8/4218: Details of probe positioning or probe attachment to the patient by using holders, e.g., positioning frames, characterised by articulated arms
    • A61B 8/4254: Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g., with respect to an external reference frame or to the patient, using sensors mounted on the probe
    • A61B 8/4444: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B 8/4488: Constructional features characterised by features of the ultrasound transducer, the transducer being a phased array
    • A61B 8/4494: Constructional features characterised by features of the ultrasound transducer, characterised by the arrangement of the transducer elements
    • A61B 8/467: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient, characterised by special input means
    • A61B 8/54: Control of the diagnostic device
    • A61B 8/58: Testing, adjusting or calibrating the diagnostic device
    • A61B 2017/3413: Needle locating or guiding means guided by ultrasound
    • A61B 2034/2055: Tracking techniques; optical tracking systems
    • A61B 2034/2063: Tracking techniques; acoustic tracking systems, e.g., using ultrasound
    • A61B 2562/0233: Special features of optical sensors or probes classified in A61B 5/00

Definitions

  • Ultrasound imaging is a widely accepted tool for guiding interventional instruments such as needles to targets such as blood vessels or organs in the human body.
  • the needle is monitored in real-time both immediately before and after a percutaneous puncture in order to enable a clinician to determine the distance and the orientation of the needle to the blood vessel and ensure successful access thereto.
  • the clinician can lose both the blood vessel and the needle, which can be difficult and time consuming to find again.
  • it is often easier to monitor the distance and orientation of the needle immediately before the percutaneous puncture with a needle plane including the needle perpendicular to an image plane of the ultrasound probe.
  • Accordingly, there is a need for ultrasound imaging systems and methods thereof that can dynamically adjust the image plane to facilitate guiding interventional instruments to targets in at least the human body.
  • Doppler ultrasound is a noninvasive approach to estimating blood flow through blood vessels by bouncing high-frequency sound waves (ultrasound) off circulating red blood cells.
  • A Doppler ultrasound can estimate how fast blood flows by measuring the rate of change in the pitch (frequency) of the reflected sound.
  • Doppler ultrasound may be performed as an alternative to more-invasive procedures, such as angiography, which involves injecting dye into the blood vessels so that they show up clearly on X-ray images.
  • Doppler ultrasound may help diagnose many conditions, including blood clots, poorly functioning valves in the leg veins, which can cause blood or other fluids to pool in the legs (venous insufficiency), heart valve defects and congenital heart disease, a blocked artery (arterial occlusion), decreased blood circulation in the legs (peripheral artery disease), bulging arteries (aneurysms), and narrowing of an artery, such as in the neck (carotid artery stenosis). Doppler ultrasound may also detect a direction of blood flow within a blood vessel.
  • Disclosed herein is an ultrasound imaging system including an ultrasound probe including an array of ultrasonic transducers and an orientation system, wherein the ultrasonic transducers are configured to emit generated ultrasound signals into a patient, receive reflected ultrasound signals from the patient, and convert the reflected ultrasound signals into corresponding electrical signals for processing into ultrasound images, and wherein the orientation system is configured to obtain orientation information of the ultrasound probe; and a console configured to communicate with the ultrasound probe, the console including one or more processors and a non-transitory computer-readable medium having stored thereon logic that, when executed by the one or more processors, causes operations including: obtaining the orientation information; performing an identification process on the ultrasound signals to identify an anatomical target (e.g., a target vessel); determining, based on the orientation information, a direction of movement required by the ultrasound probe to place the ultrasound probe at a predetermined position relative to the anatomical target (e.g., to center the ultrasound probe over the anatomical target); and initiating provision of feedback to a user of the ultrasound probe indicating the direction of movement.
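  • Purely as an illustration of the operations just listed, the following sketch uses hypothetical helper names (none of them from the patent) to show how an identified target's offset from the image center could be turned into a movement direction and a feedback event:

```python
# A hypothetical sketch of the guidance operations above; all names are
# illustrative assumptions, not the patent's implementation.

def direction_to_center(target_x_px: float, image_width_px: int,
                        tolerance_px: float = 5.0) -> str:
    """Given the horizontal pixel location of the identified anatomical target,
    return the direction the probe would need to move to center it."""
    offset = target_x_px - image_width_px / 2.0
    if abs(offset) <= tolerance_px:
        return "centered"
    return "move_right" if offset > 0 else "move_left"

def provide_feedback(direction: str) -> None:
    """Stand-in for initiating haptic or visual feedback on the probe."""
    print(f"feedback: {direction}")

# Example: a target detected at pixel 410 in a 512-pixel-wide frame.
provide_feedback(direction_to_center(410, 512))  # -> feedback: move_right
```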
  • the orientation information indicates positioning of the ultrasound probe on a Cartesian coordinate system relative to a skin surface of the patient.
  • the ultrasound probe includes an inertial measurement unit configured to obtain the orientation information.
  • the ultrasound probe includes an optical fiber having one or more core fibers, wherein each of the one or more core fibers includes a plurality of sensors distributed along a longitudinal length of a corresponding core fiber and each sensor of the plurality of sensors is configured to (i) reflect a light signal of a different spectral width based on received incident light, and (ii) change a characteristic of the reflected light signal for use in determining a physical state of the optical fiber.
  • the operations further include: providing a broadband incident light signal to the optical fiber, receiving a reflected light signal of the broadband incident light, wherein the reflected light signal is reflected from red blood cells within the patient body, and processing the reflected light signal to determine the orientation information.
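  • The following is a hedged sketch of how reflected FBG wavelength shifts might be turned into per-core strain for determining the physical state of the optical fiber; the gauge factor, nominal wavelength, and core names are illustrative assumptions rather than values from the disclosure:

```python
# Hedged sketch: converting FBG wavelength shifts into per-core strain.
# The gauge factor, nominal wavelength, and core names are assumed values.

def fbg_strain(reflected_nm: float, nominal_nm: float,
               gauge_factor: float = 0.78) -> float:
    """Approximate strain from a Bragg wavelength shift:
    delta_lambda / lambda ~= gauge_factor * strain."""
    return (reflected_nm - nominal_nm) / (nominal_nm * gauge_factor)

def most_stretched_core(strains_by_core: dict) -> str:
    """With outer cores arranged around a central core, opposing tension and
    compression indicate the bend plane; report the most stretched core."""
    return max(strains_by_core, key=strains_by_core.get)

# Example with three hypothetical outer cores having 1550 nm nominal gratings.
strains = {
    "core_a": fbg_strain(1550.12, 1550.0),
    "core_b": fbg_strain(1549.95, 1550.0),
    "core_c": fbg_strain(1550.02, 1550.0),
}
print(most_stretched_core(strains))  # -> core_a
```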
  • the identification process includes applying a trained machine learning model configured to detect anatomical features within the ultrasound images and provide a bounding box around the anatomical target.
  • the provision of the feedback includes providing haptic feedback from a first side of the ultrasound probe, where the first side corresponds to the direction of movement required by the ultrasound probe to center the ultrasound probe over the anatomical target.
  • the system includes a needle including a second orientation system configured to obtain needle orientation information, and wherein the operations further include: determining, based on the needle orientation information, an orientation of the needle relative to the ultrasound probe, determining a trajectory of the needle, and generating a display screen illustrating the trajectory of the needle.
  • Also disclosed herein is a method of providing the ultrasound imaging system discussed above and providing instructions to cause performance of the operations also discussed above. Additionally, disclosed herein is a non-transitory, computer-readable medium having logic stored thereon that, when executed by a processor, causes performance of the operations discussed above.
  • FIG. 1 illustrates an ultrasound imaging system and a patient in accordance with some embodiments
  • FIG. 2 illustrates a block diagram of a console of the ultrasound imaging system of FIG. 1 in accordance with some embodiments
  • FIG. 3 A illustrates the ultrasound probe 106 of the ultrasound imaging system 100 imaging a blood vessel of the patient P in an unsterile environment 300 prior to accessing the blood vessel in accordance with some embodiments;
  • FIG. 3 B illustrates an ultrasound image of the blood vessel of FIG. 3 A on a display screen of the ultrasound imaging system in accordance with some embodiments
  • FIG. 4 A illustrates the ultrasound probe of the ultrasound imaging system imaging a blood vessel of the patient P in a sterile environment prior to accessing and/or while accessing the blood vessel in accordance with some embodiments;
  • FIG. 4 B illustrates an ultrasound image of the blood vessel of FIG. 4 A on a display screen of the ultrasound imaging system in accordance with some embodiments
  • FIG. 5 illustrates the ultrasound probe 106 as illustrated in FIG. 3 A that further includes an inertial measurement unit (“IMU”) 158 in accordance with some embodiments;
  • FIG. 6 A illustrates the ultrasound probe 106 modified to include a multi-core optical fiber in accordance with some embodiments
  • FIG. 6 B illustrates the ultrasound probe 106 of FIG. 6 A , a needle 606 , and an exemplary needle trajectory in accordance with some embodiments;
  • FIG. 7 illustrates the ultrasound probe 106 fixedly coupled to a mechanical arm 700 having a series of known location points along an arm length, and the patient P, in accordance with some embodiments;
  • FIG. 8 illustrates an ultrasound imaging system 800 that includes alternative reality functionality in accordance with some embodiments
  • FIG. 9A illustrates the ultrasound probe 106 imaging a target vessel of the patient P and configured to provide feedback to a clinician that directs movement of the probe 106 to center the probe 106 over the target vessel in accordance with some embodiments;
  • FIG. 9 B is a first embodiment of a display screen illustrating the ultrasound imaging in real-time including a center line of the ultrasound probe 106 and a visual indication of a direction to move the probe 106 to center the probe 106 over the target vessel in accordance with some embodiments;
  • FIG. 9 C is a second embodiment of a display screen illustrating the ultrasound imaging in real-time including a center line of the ultrasound probe 106 and a visual indication of a direction to move the probe 106 to center the probe 106 over the target vessel in accordance with some embodiments;
  • FIG. 9 D illustrates the ultrasound probe 106 imaging a target vessel of the patient P and configured to provide feedback to a clinician when the probe 106 is centered over the target vessel in accordance with some embodiments;
  • FIGS. 10 A and 10 B are simplified views of the ultrasound probe of the guidance system being used to guide a needle toward a vessel within the body of a patient in accordance with some embodiments;
  • FIGS. 11 A and 11 B show possible screenshots for depiction on the display of the guidance system, showing the position and orientation of a needle in accordance with some embodiments.
  • FIG. 12 is a flow diagram illustrating various stages of a method for guiding a needle to a desired target within the body of a patient in accordance with some embodiments.
  • A "proximal portion" or a "proximal-end portion" of, for example, a catheter disclosed herein includes a portion of the catheter intended to be near a clinician when the catheter is used on a patient.
  • A "proximal length" of, for example, the catheter includes a length of the catheter intended to be near the clinician when the catheter is used on the patient.
  • A "proximal end" of, for example, the catheter includes an end of the catheter intended to be near the clinician when the catheter is used on the patient.
  • the proximal portion, the proximal-end portion, or the proximal length of the catheter can include the proximal end of the catheter; however, the proximal portion, the proximal-end portion, or the proximal length of the catheter need not include the proximal end of the catheter. That is, unless context suggests otherwise, the proximal portion, the proximal-end portion, or the proximal length of the catheter is not a terminal portion or terminal length of the catheter.
  • a “distal portion” or a “distal-end portion” of, for example, a catheter disclosed herein includes a portion of the catheter intended to be near or in a patient when the catheter is used on the patient.
  • a “distal length” of, for example, the catheter includes a length of the catheter intended to be near or in the patient when the catheter is used on the patient.
  • a “distal end” of, for example, the catheter includes an end of the catheter intended to be near or in the patient when the catheter is used on the patient.
  • the distal portion, the distal-end portion, or the distal length of the catheter can include the distal end of the catheter; however, the distal portion, the distal-end portion, or the distal length of the catheter need not include the distal end of the catheter. That is, unless context suggests otherwise, the distal portion, the distal-end portion, or the distal length of the catheter is not a terminal portion or terminal length of the catheter.
  • ultrasound imaging systems and methods thereof are needed that can dynamically adjust the image plane to facilitate guiding interventional instruments to targets in at least the human body.
  • dynamically adjusting ultrasound imaging systems and methods thereof are disclosed herein.
  • Referring to FIG. 1, an ultrasound imaging system 100, a needle 112, and a patient P are shown in accordance with some embodiments.
  • FIG. 2 illustrates a block diagram of the ultrasound imaging system 100 in accordance with some embodiments. The discussion below may be made with reference to both FIGS. 1 - 2 .
  • the ultrasound imaging system 100 includes a console 102 , the display screen 104 , and the ultrasound probe 106 .
  • the ultrasound imaging system 100 is useful for imaging a target such as a blood vessel or an organ within a body of the patient P prior to a percutaneous puncture with the needle 112 for inserting the needle 112 or another medical device into the target and accessing the target, as well as for imaging the target during the insertion process to provide confirmation of placement of the needle 112.
  • the ultrasound imaging system 100 is shown in FIG. 1 in a general relationship to the patient P during an ultrasound-based medical procedure to place a catheter 108 into the vasculature of the patient P through a skin insertion site S created by a percutaneous puncture with the needle 112.
  • the ultrasound imaging system 100 can be useful in a variety of ultrasound-based medical procedures other than catheterization.
  • the percutaneous puncture with the needle 112 can be performed to biopsy tissue of an organ of the patient P.
  • the console 102 houses a variety of components of the ultrasound imaging system 100 , and it is appreciated the console 102 can take any of a variety of forms.
  • a processor 116 and memory 118 such as random-access memory (“RAM”) or non-volatile memory (e.g., electrically erasable programmable read-only memory (“EEPROM”)) are included in the console 102 for controlling functions of the ultrasound imaging system 100 .
  • the processor may execute various logic operations or algorithms during operation of the ultrasound imaging system 100 in accordance with executable logic (“instructions”) 120 stored in the memory 118 for execution by the processor 116 .
  • the console 102 is configured to instantiate by way of the logic 120 one or more processes for dynamically adjusting a distance of activated ultrasonic transducers 148 from a predefined target (e.g., blood vessel) or area, an orientation of the activated ultrasonic transducers 148 to the predefined target or area, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the predefined target or area, as well as process electrical signals from the ultrasound probe 106 into ultrasound images.
  • Dynamically adjusting the activated ultrasonic transducers 148 uses ultrasound imaging data, magnetic-field data, shape-sensing data, or a combination thereof received by the console 102 for activating certain ultrasonic transducers of a 2-D array of the ultrasonic transducers 148 or moving those already activated in a linear array of the ultrasonic transducers 148 .
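  • As a simplified illustration of activating certain transducers of an array (not the disclosed control logic), the sketch below shifts a fixed-size active aperture of a linear array toward a target element index; the array dimensions are assumed:

```python
# Simplified illustration (not the disclosed control logic): shift a fixed-size
# active aperture of a linear transducer array toward a target element index.

def shift_active_window(num_elements: int, window: int, target_index: int) -> list:
    """Return indices of the elements to activate so the aperture is centered,
    as nearly as possible, on the target element."""
    start = min(max(target_index - window // 2, 0), num_elements - window)
    return list(range(start, start + window))

# Example: a 128-element array, a 32-element aperture, target under element 90.
print(shift_active_window(128, 32, 90))  # -> elements 74..105
```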
  • a digital controller/analog interface 122 is also included with the console 102 and is in communication with both the processor 116 and other system components to govern interfacing between the ultrasound probe 106 and other system components set forth herein.
  • the ultrasound imaging system 100 further includes ports 124 for connection with additional components such as optional components 126 including a printer, storage media, keyboard, etc.
  • the ports 124 can be universal serial bus (“USB”) ports, though other types of ports can be used for this connection or any other connections shown or described herein.
  • a power connection 128 is included with the console 102 to enable operable connection to an external power supply 130 .
  • An internal power supply 132 (e.g., a battery) can also be included with the console 102.
  • Power management circuitry 134 is included with the digital controller/analog interface 122 of the console 102 to regulate power use and distribution.
  • the display screen 104 is integrated into the console 102 to provide a GUI and display information for a clinician during use, such as one or more ultrasound images of the target or the patient P attained by the ultrasound probe 106.
  • the ultrasound imaging system 100 enables the distance and orientation of a magnetized medical device such as the needle 112 to be superimposed in real-time atop an ultrasound image of the target, thus enabling a clinician to accurately guide the magnetized medical device to the intended target.
  • the display screen 104 can alternatively be separate from the console 102 and communicatively coupled thereto.
  • a console button interface 136 and control buttons 110 (see FIG. 1 ) included on the ultrasound probe 106 can be used to immediately call up a desired mode to the display screen 104 by the clinician for assistance in an ultrasound-based medical procedure.
  • the display screen 104 is an LCD device.
  • the ultrasound probe 106 is employed in connection with ultrasound-based visualization of a target such as a blood vessel (see FIG. 3 A ) in preparation for inserting the needle 112 or another medical device into the target.
  • Such visualization gives real-time ultrasound guidance and assists in reducing complications typically associated with such insertion, including inadvertent arterial puncture, hematoma, pneumothorax, etc.
  • the ultrasound probe 106 is configured to provide to the console 102 electrical signals corresponding to both the ultrasound imaging data, the magnetic-field data, the shape-sensing data, or a combination thereof for the real-time ultrasound guidance.
  • a stand-alone optical interrogator 154 can be communicatively coupled to the console 102 by way of one of the ports 124 .
  • the console 102 can include an integrated optical interrogator integrated into the console 102 .
  • Such an optical interrogator is configured to emit input optical signals into a companion optical-fiber stylet 156 for shape sensing with the ultrasound imaging system 100 , which optical-fiber stylet 156 , in turn, is configured to be inserted into a lumen of a medical device such as the needle 112 and convey the input optical signals from the optical interrogator 154 to a number of FBG sensors along a length of the optical-fiber stylet 156 .
  • the optical interrogator 154 is also configured to receive reflected optical signals conveyed by the optical-fiber stylet 156 reflected from the number of FBG sensors, the reflected optical signals indicative of a shape of the optical-fiber stylet 156 .
  • the optical interrogator 154 is also configured to convert the reflected optical signals into corresponding electrical signals for processing by the console 102 into distance and orientation information with respect to the target for dynamically adjusting a distance of the activated ultrasonic transducers 148 , an orientation of the activated ultrasonic transducers 148 , or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target or the medical device when it is brought into proximity of the target.
  • the distance and orientation of the activated ultrasonic transducers 148 can be adjusted with respect to a blood vessel as the target. Indeed, an image plane can be established by the activated ultrasonic transducers 148 being perpendicular or parallel to the blood vessel in accordance with an orientation of the blood vessel.
  • orientation information may refer to the positioning of the probe 106 (or other medical instrument) in three dimensions relative to a fixed axis.
  • the fixed axis may refer to a perpendicular axis extending distally from a surface of a patient P (e.g., which may be representative of the Z-axis of a Cartesian coordinate system).
  • orientation information of the probe 106 provides a geometric view of an angle of the ultrasound probe relative to the skin surface of patient P. Additionally, orientation information may provide an indication as to whether the ultrasound probe 106 is being held in a transverse or longitudinal orientation relative to a target vessel of the patient P.
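  • A minimal sketch, assuming gravity is read from the probe's accelerometer, of estimating the probe's tilt relative to an axis perpendicular to the skin surface; this is illustrative only, not the disclosed algorithm:

```python
# Illustrative only: estimate the probe's tilt relative to a fixed axis
# perpendicular to the skin surface from an IMU accelerometer gravity reading.
import math

def tilt_from_vertical(ax: float, ay: float, az: float) -> float:
    """Angle (degrees) between the probe's long axis and the vertical axis,
    assuming the accelerometer reports gravity in the probe frame."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

# Example: probe held nearly upright, tipped slightly toward +X.
print(round(tilt_from_vertical(0.17, 0.0, 0.985), 1))  # ~9.8 degrees
```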
  • FIG. 2 shows that the ultrasound probe 106 further includes a button and memory controller 138 for governing button and ultrasound probe 106 operation.
  • the button and memory controller 138 can include non-volatile memory (e.g., EEPROM).
  • the button and memory controller 138 is in operable communication with a probe interface 140 of the console 102 , which includes an input/output (“I/O”) component 142 for interfacing with the ultrasonic transducers 148 and a button and memory I/O component 144 for interfacing with the button and memory controller 138 .
  • the ultrasound probe 106 can include a magnetic-sensor array 146 for detecting a magnetized medical device such as the needle 112 during ultrasound-based medical procedures.
  • the magnetic-sensor array 146 includes a number of magnetic sensors 150 embedded within or included on a housing of the ultrasound probe 106 .
  • the magnetic sensors 150 are configured to detect a magnetic field or a disturbance in a magnetic field as magnetic signals associated with the magnetized medical device when it is in proximity to the magnetic-sensor array 146 .
  • the magnetic sensors 150 are also configured to convert the magnetic signals from the magnetized medical device (e.g., the needle 112 ) into electrical signals for the console 102 to process into distance and orientation information for the magnetized medical device with respect to the predefined target, as well as for display of an iconographic representation of the magnetized medical device on the display screen 104 .
  • the magnetic-sensor array 146 enables the ultrasound imaging system 100 to track the needle 112 or the like.
  • the magnetic sensors 150 can be sensors of other types and configurations. Also, though they are described herein as included with the ultrasound probe 106 , the magnetic sensors 150 of the magnetic-sensor array 146 can be included in a component separate from the ultrasound probe 106 such as a sleeve into which the ultrasound probe 106 is inserted or even a separate handheld device. The magnetic sensors 150 can be disposed in an annular configuration about the probe head 114 of the ultrasound probe 106 , though it is appreciated that the magnetic sensors 150 can be arranged in other configurations, such as in an arched, planar, or semi-circular arrangement.
  • Each magnetic sensor of the magnetic sensors 150 includes three orthogonal sensor coils for enabling detection of a magnetic field in three spatial dimensions.
  • 3-dimensional (“3-D”) magnetic sensors can be purchased, for example, from Honeywell Sensing and Control of Morristown, NJ. Further, the magnetic sensors 150 are configured as Hall-effect sensors, though other types of magnetic sensors could be employed. Further, instead of 3-D sensors, a plurality of 1-dimensional (“1-D”) magnetic sensors can be included and arranged as desired to achieve 1-, 2-, or 3-D detection capability.
  • the ultrasound probe 106 can further include an inertial measurement unit (“IMU”) 158 or any one or more components thereof for inertial measurement selected from an accelerometer 160 , a gyroscope 162 , and a magnetometer 164 configured to provide positional-tracking data of the ultrasound probe 106 to the console 102 for stabilization of an image plane.
  • the processor 116 is further configured to execute the logic 120 for processing the positional-tracking data for adjusting the distance of the activated ultrasonic transducers 148 from the target, the orientation of the activated ultrasonic transducers 148 to the target, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target to maintain the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target when the ultrasound probe 106 is inadvertently moved with respect to the target.
  • a medical device of a magnetizable material enables the medical device (e.g., the needle 112 ) to be magnetized by a magnetizer, if not already magnetized, and tracked by the ultrasound imaging system 100 when the magnetized medical device is brought into proximity of the magnetic sensors 150 of the magnetic-sensor array 146 or inserted into the body of the patient P during an ultrasound-based medical procedure.
  • Such magnetic-based tracking of the magnetized medical device assists the clinician in placing a distal tip thereof in a desired location, such as in a lumen of a blood vessel, by superimposing a simulated needle image representing the real-time distance and orientation of the needle 112 over an ultrasound image of the body of the patient P being accessed by the magnetized medical device.
  • Such a medical device can be stainless steel such as SS 304 stainless steel; however, other suitable needle materials that are capable of being magnetized can be employed. So configured, the needle 112 or the like can produce a magnetic field or create a magnetic disturbance in a magnetic field detectable as magnetic signals by the magnetic-sensor array 146 of the ultrasound probe 106 so as to enable the distance and orientation of the magnetized medical device to be tracked by the ultrasound imaging system 100 for dynamically adjusting the distance of the activated ultrasonic transducers 148 , an orientation of the activated ultrasonic transducers 148 , or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the magnetized medical device.
  • the needle 112 can be tracked using the teachings of one or more patents of U.S. Pat. Nos. 5,775,322; 5,879,297; 6,129,668; 6,216,028; and 6,263,230, each of which is incorporated by reference in its entirety into this application.
  • the distance and orientation information determined by the ultrasound imaging system 100 together with an entire length of the magnetized medical device, as known by or input into the ultrasound imaging system 100 , enables the ultrasound imaging system 100 to accurately determine the distance and orientation of the entire length of the magnetized medical device, including a distal tip thereof, with respect to the magnetic-sensor array 146 . This, in turn, enables the ultrasound imaging system 100 to superimpose an image of the needle 112 on an ultrasound image produced by the ultrasound beam 152 of the ultrasound probe 106 on the display screen 104 .
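  • For illustration, the vector math below extends a sensed magnetic-element pose by a known offset along the needle axis to locate the distal tip; the coordinates and offset are assumed values:

```python
# Illustrative vector math: extend the sensed element pose by the known offset
# along the needle axis to locate the distal tip. Numbers are assumed.
import math

def needle_tip(element_xyz, unit_direction, tip_offset_mm):
    """Tip position = sensed element position + offset along the needle axis."""
    return tuple(p + u * tip_offset_mm for p, u in zip(element_xyz, unit_direction))

# Example: element sensed 40 mm up the shaft, needle angled 45 degrees downward.
direction = (math.cos(math.radians(45)), 0.0, -math.sin(math.radians(45)))
print(needle_tip((10.0, 0.0, 30.0), direction, 40.0))  # ~(38.3, 0.0, 1.7)
```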
  • the ultrasound image depicted on the display screen 104 can include depiction of the surface of the skin of the patient P and a subcutaneous blood vessel thereunder to be accessed by the needle 112 , as well as a depiction of the magnetized medical device as detected by the ultrasound imaging system 100 and its orientation to the vessel.
  • the ultrasound image corresponds to an image acquired by the ultrasound beam 152 of the ultrasound probe 106 . It should be appreciated that only a portion of an entire length of the magnetized medical device is magnetized and, thus, tracked by the ultrasound imaging system 100 .
  • the probe head 114 of the ultrasound probe 106 is placed against skin of the patient P.
  • An ultrasound beam 152 is produced so as to ultrasonically image a portion of a target such as a blood vessel beneath a surface of the skin of the patient P. (See FIGS. 3 A, 4 A .)
  • the ultrasonic image of the blood vessel can be depicted and stabilized on the display screen 104 of the ultrasound imaging system 100 as shown in FIGS. 3 B, 4 B despite inadvertent movements of the ultrasound probe 106 .
  • FIG. 3 A illustrates the ultrasound probe 106 of the ultrasound imaging system 100 imaging a blood vessel of the patient P in an unsterile environment 300 prior to accessing the blood vessel in accordance with some embodiments.
  • the imaging performed in FIG. 3 A may be referred to as pre-scan imaging.
  • FIG. 3 B illustrates an ultrasound image of the blood vessel of FIG. 3 A (a “pre-scan image”) 306 on a display screen 104 of the ultrasound imaging system 100 in accordance with some embodiments.
  • the pre-scan image 306 may be obtained at a first time that is prior to preparing the patient P and the surrounding area for sterilization, where the pre-scan image 306 may be stored in the memory 118 of the console 102.
  • the intended purpose of obtaining the pre-scan image 306 is to allow a clinician to obtain an image of the target vessel 302 using the ultrasound probe 106 without any constraints that may be imposed in order to maintain a sterile environment.
  • the pre-scan image may then be used as a reference image to compare to the live scan image taken in a sterile field thereby allowing the clinician to confirm proper placement and orientation of the ultrasound probe 106 .
  • vessel identification logic 200 may be executed by the processor 116 causing performance of operations to identify a visual representation of the target vessel 302 , such as the target vessel image 308 of FIG. 3 B , within the pre-scan image 306 and/or detect other features of the pre-scan image 306 .
  • Other features detected may include those anatomical features typically visualized in an ultrasound image such as blood vessels, bones, muscles, tendons, ligaments, nerves, joints, etc.
  • the vessel identification logic 200 may be configured, upon execution by the processor 116 , to cause performance of operations including computerized, automated analysis of the pre-scan image 306 to identify the target vessel image 308 through machine learning operations (e.g., application of a trained machine learning model).
  • computerized, automated analysis may include operations comprising object recognition such as object detection methods, where the vessel identification logic 200 parses the pre-scan image 306 to locate a presence of one or more objects (e.g., the target vessel 302 ) with a bounding box and classify (label) the object within the bounding box.
  • the vessel identification logic 200 may include a machine learning model trained through supervised machine learning using a labeled data set.
  • a labeled data set may include ultrasound images that were previously captured ("historical data") that have also been labeled, e.g., by another trained machine learning model and/or by a subject matter expert.
  • the machine learning model is then trained on the labeled historical data so that, upon completion of the training, the machine learning model may detect objects within a new image (e.g., the pre-scan image 306 and a live scan image discussed below with respect to FIGS. 4A-4B), place bounding boxes around the detected objects, and classify the objects.
  • the classification step may be skipped such that the trained machine learning model is configured to output an image including bounding boxes around detected objects within the image.
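  • A hedged sketch of the inference step: a trained detector, treated here as opaque stubbed output, returns labeled bounding boxes for an ultrasound frame, and the highest-confidence target-vessel box is selected. The output format and label names are assumptions, not the patent's model:

```python
# Hedged sketch of selecting the target-vessel bounding box from a detector's
# output; the detector and its output format are assumptions (stubbed below).
from typing import Optional, Sequence, Tuple

Box = Tuple[float, float, float, float]        # (x_min, y_min, x_max, y_max)
Detection = Tuple[Box, str, float]             # (box, label, confidence)

def pick_target_vessel(detections: Sequence[Detection],
                       wanted_label: str = "target_vessel") -> Optional[Box]:
    candidates = [d for d in detections if d[1] == wanted_label]
    if not candidates:
        return None
    return max(candidates, key=lambda d: d[2])[0]  # highest-confidence box

# Example with stubbed detector output for one ultrasound frame.
stub_output = [((120, 80, 180, 140), "target_vessel", 0.93),
               ((300, 90, 350, 150), "secondary_vessel", 0.88)]
print(pick_target_vessel(stub_output))  # -> (120, 80, 180, 140)
```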
  • FIG. 4 A illustrates the ultrasound probe 106 of the ultrasound imaging system 100 imaging a blood vessel of the patient P in a sterile environment 400 prior to accessing and/or while accessing the blood vessel in accordance with some embodiments.
  • the imaging performed in FIG. 4 A may be referred to as live scan imaging.
  • FIG. 4 B illustrates an ultrasound image of the blood vessel of FIG. 4 A (a “live scan image”) on a display screen 104 of the ultrasound imaging system 100 in accordance with some embodiments.
  • the live scan image 406 may be obtained at a second time that is subsequent to creating a sterilized area 402 around an insertion site on the patient P (or generally an area on the patient P).
  • the live scan image 406 may also be stored in the memory 118 of the console 102 .
  • systems and methods disclosed herein may include obtaining a pre-scan image 306 with the intended purpose of allowing a clinician to use the pre-scan image 306 as a reference image to compare to the live scan image 406 (which is taken in a sterile field) thereby allowing the clinician to confirm proper placement and orientation of the ultrasound probe 106 during the live scan process, which may correspond to insertion of a medical device such as the needle 112 .
  • the vessel identification logic 200 may be executed by the processor 116 causing performance of operations to identify a visual representation of the target vessel 302 , such as the target vessel image 308 , within the live scan image 406 and/or detect other features of the live scan image 406 .
  • Other features detected may include those anatomical features typically visualized in an ultrasound image such as blood vessels, bones, muscles, tendons, ligaments, nerves, joints, etc.
  • Referring to FIG. 5, the ultrasound probe 106 as illustrated in FIG. 3A, further including an inertial measurement unit ("IMU") 158, is shown in accordance with some embodiments.
  • the IMU 158 is configured to obtain inertial measurement from any of one or more components selected from an accelerometer 160 , a gyroscope 162 , and a magnetometer 164 . Based on the obtained inertial measurements, the IMU 158 is configured to provide positional-tracking data of the ultrasound probe 106 to the console 102 thereby enabling spatial awareness of the probe 106 .
  • the processor 116 is further configured to execute the logic 120 for processing the positional-tracking data for adjusting the distance of the activated ultrasonic transducers 148 from the target, the orientation of the activated ultrasonic transducers 148 to the target, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target to maintain the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target when the ultrasound probe 106 is inadvertently moved with respect to the target.
  • Referring to FIG. 6A, the ultrasound probe 106 modified to include a multi-core optical fiber is shown in accordance with some embodiments.
  • the ultrasound probe 106 of FIG. 6 A includes a multi-core optical fiber 600 that extends the length of a tether from the console 102 to the probe 106 .
  • the multi-core optical fiber 600 may be configured in a predetermined geometry 602 at the distal end 604 of the probe 106 .
  • the predetermined geometry enables logic of the console 102 to perform shape sensing operations enabling detection of an orientation of the probe 106 .
  • the ultrasound image displayed on the display 104 may be augmented with certain information, e.g., mirror coordination correction information, color-coding information, highlighting of a target vessel (see FIGS. 9 A- 9 D ), and the probe 106 may provide directional instructions via haptic feedback. Without knowing the orientation of the probe 106 , such information cannot be provided.
  • Referring to FIG. 6B, the ultrasound probe 106 of FIG. 6A, a needle 606, and an exemplary needle trajectory are shown in accordance with some embodiments.
  • FIG. 6 B illustrates the probe 106 positioned on the skin surface of patient P in order to image the vessel 302 .
  • FIG. 6 B further illustrates a needle 606 immediately adjacent an insertion site S intended to enter the target vessel 302 at a target site 608 .
  • logic of the console 102 may estimate a trajectory 610 of the needle 606 .
  • the trajectory 610, along with the ultrasound image (and/or a three-dimensional rendering of the vessel 302), may be displayed on the display 104.
  • the needle may also include a multi-core optical fiber 612 that extends the length of the needle 606 from either the console 102 or the probe 106 .
  • the orientation of the needle 606 may be determined based on shape sensing of the multi-core optical fiber 612 extending through a tether to the probe 106 (optional multi-core optical fiber 612′) or through a tether to the console 102 (optional multi-core optical fiber 612″). From the orientation of the probe 106 and the needle 606, a rendering of imaging captured by the probe 106, the target site 608, and the needle trajectory 610 may be generated and displayed on the display 104. Further, in addition to such information, knowledge of the human anatomy enables generation of a three-dimensional graphic for display on the display 104 (e.g., similar to the theoretical illustration of FIG. 6B).
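  • As an illustrative sketch only (not the disclosed trajectory logic), a straight-line needle trajectory defined by its entry point and insertion angle can be extended to the imaged vessel depth to estimate where it crosses the target site; the geometry below is planar and the numbers are assumed:

```python
# Illustrative planar geometry only: extend a straight needle trajectory from
# its entry point to the imaged vessel depth. Angles and depths are assumed.
import math

def trajectory_point_at_depth(entry_xy, insertion_angle_deg, vessel_depth_mm):
    """Horizontal position (relative to the entry point) at which a straight
    trajectory reaches the given depth below the skin surface."""
    run = vessel_depth_mm / math.tan(math.radians(insertion_angle_deg))
    return (entry_xy[0] + run, entry_xy[1], -vessel_depth_mm)

# Example: needle entering at 45 degrees to the skin, vessel 12 mm deep.
print(trajectory_point_at_depth((0.0, 0.0), 45.0, 12.0))  # -> (12.0, 0.0, -12.0)
```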
  • feedback may be provided to the clinician by the probe 106 in certain situations.
  • the probe 106 may be configured to provide haptic feedback to the clinician indicating a direction to move the probe 106 in order to center the probe 106 over the target vessel 302 (and optionally over the insertion site 608 ).
  • Certain feedback may also be provided by the probe 106 to instruct movement of the needle 606 (e.g., in any direction including yaw and/or pitch).
  • the orientation of the needle may also be determined via the methodology discussed with respect to FIGS. 10 A- 12 .
  • the feedback may additionally or alternatively be provided via light emitting diodes (LEDs) on the probe (see FIGS. 9A, 9D).
  • Referring to FIG. 7, the ultrasound probe 106 fixedly coupled to a mechanical arm 700 having a series of known location points along an arm length, and the patient P, are shown in accordance with some embodiments.
  • the mechanical arm 700 may be comprised of a series of arm components 701 and joints that hingedly couple the arm components 701 together with an ultrasound probe (e.g., the probe 106 ) fixed to the distal end of the most distal arm component.
  • a series of location points 704 A- 704 D may be known, e.g., at each joint, at a proximal end of the probe 106 , and at a distal end of the probe 106 , such that an orientation and positioning of the mechanical arm 700 and the probe 106 may be determined relative to a fixed point, e.g., the console 102 .
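  • A hedged, planar sketch of how known segment lengths and joint angles of such an arm could be accumulated into the probe's position relative to a fixed point such as the console; the two-segment geometry is an assumption for illustration:

```python
# Hedged, planar sketch of forward kinematics for a jointed arm: accumulate
# each segment's contribution to locate the distal end (and thus the probe)
# relative to a fixed base such as the console. Geometry is assumed.
import math

def planar_arm_end_point(segment_lengths_mm, joint_angles_deg):
    x = y = 0.0
    heading = 0.0
    for length, angle in zip(segment_lengths_mm, joint_angles_deg):
        heading += math.radians(angle)   # each joint rotates the next segment
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return (x, y)

# Example: two 300 mm arm components bent at 30 and 45 degrees.
print(planar_arm_end_point([300.0, 300.0], [30.0, 45.0]))
```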
  • In some embodiments, an optical fiber 706 (e.g., having one or more core fibers) may also be included.
  • the optical fiber 706 may include the predetermined geometry 602 at the distal end of the probe 106 as illustrated in FIG. 6 B in order to determine an orientation of the probe 106 .
  • the orientation, positioning, and configuration information obtained through the use of the mechanical arm 700 may be utilized in the same manner as that data obtained through the deployment of the IMU 158 within the probe 106 as discussed above.
  • Referring to FIG. 8, an ultrasound imaging system 800 that includes alternative reality functionality is shown in accordance with some embodiments.
  • the ultrasound imaging system 800 includes many of the components of the ultrasound imaging system 100 of FIG. 1 , where components that are held in common between the systems 100 , 800 will not be discussed in detail.
  • the system 800 includes the ability to perform augmented reality (AR) functionalities and components that provide AR data to a clinician.
  • the term “augmented reality” may refer to augmented reality (e.g., an enhancement of real-world images overlaid with computer-generated information) and virtual reality (e.g., replacement of a user's view with immersion within a computer-generated virtual environment).
  • the console may render an AR display screen 802 on the display 104 .
  • the AR display screen 802 may include certain visualizations as overlays on the ultrasound image obtained by the ultrasound probe 106 .
  • the AR display screen 802 may include overlays that highlight certain anatomical features detected within the ultrasound image (e.g., vessels).
  • the target vessel may be distinguished visually from all detected anatomical features (e.g., the target vessel appears in a particular color, appears within a bounding box, etc.).
  • Additional AR data that may be displayed includes directional indicators (e.g., “R”/“L” or “Right”/“Left”) that assist the clinician in properly characterizing a mirror coordination of the ultrasound probe 106 , when applicable.
  • a center line may be overlaid on the ultrasound image as well as an arrow that instructs the clinician as to a direction to move the ultrasound probe 106 in order to center the ultrasound probe 106 over a target vessel, which places the target vessel in the center of the ultrasound image displayed on the display 104 .
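  • For illustration only, the overlay described above (center line, target-vessel highlight, and a directional arrow) could be rendered roughly as follows; the frame, box coordinates, and plotting choices are assumptions, not the console's rendering code:

```python
# Illustrative rendering only (not the console's display code): overlay a
# center line, a target-vessel box, and a directional arrow on a frame.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

frame = np.zeros((256, 384), dtype=np.uint8)   # stand-in ultrasound frame
box = (220, 100, 60, 50)                        # assumed x, y, width, height

fig, ax = plt.subplots()
ax.imshow(frame, cmap="gray")
ax.axvline(frame.shape[1] / 2, linestyle="--")                 # probe center line
ax.add_patch(Rectangle(box[:2], box[2], box[3], fill=False))   # highlight target
ax.annotate("", xy=(box[0], 40), xytext=(frame.shape[1] / 2, 40),
            arrowprops=dict(arrowstyle="->"))                  # "move right" arrow
plt.show()
```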
  • the disclosure is also intended to disclose positionings of the ultrasound probe 106 that are alternatives to centering the probe over a target vessel (or other anatomical target, e.g., an organ, a vessel blockage, a chamber within a heart, or a position within an organ). For instance, it may be advantageous to place the ultrasound probe at a particular distance from the center of the target vessel in order to allow a needle to properly access an insertion site.
  • the system 800 includes an AR device 804 that provides secondary AR data as an alternative to the AR display screen 802 .
  • the AR device 804 provides a second option (e.g., modality) for viewing AR data, while the first and secondary AR data may be the same or substantially the same.
  • the AR device 804 is represented by a pair of AR glasses 804 to be worn by a clinician performing the ultrasound procedure.
  • the AR glasses 804 may be configured to display secondary AR data 806 on a display screen of the AR glasses 804 .
  • the first and secondary AR data is substantially the same (e.g., substantially the same display). However, the secondary AR data 806 may be seen by the clinician as an overlay directly on the patient body.
  • the clinician may view the insertion site S through the AR glasses 804 such that the ultrasound image, including any highlighting of detected anatomical features and/or directional or orientation markers, also appears in an augmented manner.
  • the clinician views the augmented ultrasound image directly on the patient body when viewing the patient body through the AR glasses 804.
  • the AR glasses 804 enable the clinician to maintain eye contact with the imaging area of the patient body, and the augmented ultrasound image may correct any mirrored coordination that would otherwise be present when viewing the ultrasound image on the display 104.
  • Referring to FIG. 9A, the ultrasound probe 106 imaging a target vessel of the patient P and configured to provide feedback to a clinician that directs movement of the probe 106 to center the probe 106 over the target vessel is shown in accordance with some embodiments.
  • the probe 106 may be configured with vibration technology (e.g., actuators 901 A- 901 B; linear resonant actuators (LRAs) or Piezoelectric actuators) that provide haptic feedback 900 .
  • the probe 106 may be configured with visual indicators (e.g., lights) configured to provide feedback similar to that provided by the actuators 901A-901B.
  • the actuators 901A-901B may be activated to provide haptic feedback that instructs a clinician as to the direction to move the probe 106 to center the probe 106 over the target vessel 904.
  • FIG. 9 A illustrates a plurality of vessels: the target vessel 904 ; and secondary vessels 905 (e.g., non-target vessels).
  • the console 102 may obtain an ultrasound image from the probe 106 and vessel identification logic 200 may perform a vessel identification process as discussed above to identify the target vessel 904 as well as detect the location of the target vessel 904 within the ultrasound image (e.g., relative to the ultrasound probe 106 ).
  • the console 102 may then activate an actuator to provide haptic feedback instructing the clinician to move the probe 106 in a particular direction to center the probe 106 over the target vessel 904 (e.g., vibration on a right side of the probe 106 indicates the clinician move the probe 106 to the right).
  • the clinician need not take his or her eyes off of the patient body and probe 106 to view the ultrasound image on the display 104 of the console 102 and determine which direction to move the probe 106 .
  • the probe 106 may be configured with lights 902 A, 902 B that operate in the same manner as the actuators 901 A, 901 B (e.g., light up on a right side of the probe 106 indicates the clinician move the probe 106 to the right).
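  • A minimal sketch mapping a computed movement direction to a probe-side actuator or LED, mirroring the behavior described above; the hardware interface is stubbed and all names are assumptions:

```python
# Minimal mapping from a movement direction to a probe-side actuator or LED;
# the hardware call is stubbed and all names are assumptions.

def actuate(side: str) -> None:
    print(f"activate {side} actuator/LED")  # stand-in for hardware I/O

def feedback_for_direction(direction: str) -> None:
    if direction == "move_right":
        actuate("right")   # right-side cue -> move the probe to the right
    elif direction == "move_left":
        actuate("left")
    elif direction == "centered":
        actuate("both")    # both sides signal that the probe is centered

feedback_for_direction("move_right")
```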
  • Referring to FIG. 9B, a first embodiment of a display screen illustrating the ultrasound imaging in real-time, including a center line of the ultrasound probe 106 and a visual indication of a direction to move the probe 106 to center the probe 106 over the target vessel, is shown in accordance with some embodiments.
  • FIG. 9 B illustrates a display screen 906 that may accompany, or be an alternative to, the feedback capabilities of the probe 106 discussed above.
  • the display screen 906 may be rendered on the display 104 of the console 102 and illustrate an ultrasound image (or a portion) captured by the probe 106 .
  • the display screen 906 may include a visual indication of identified anatomical features (e.g., a target vessel image 904 ′ and secondary vessel images 905 ′) as well as a center line 908 and a directional arrow indicator 910 , where the directional arrow indicator 910 instructs the clinician as to the direction to move the probe 106 in order to center the probe 106 over the target vessel 904 .
  • Referring to FIG. 9C, a second embodiment of a display screen illustrating the ultrasound imaging in real-time, including a center line of the ultrasound probe 106 and a visual indication of a direction to move the probe 106 to center the probe 106 over the target vessel, is shown in accordance with some embodiments.
  • FIG. 9 C provides an alternative to FIG. 9 B where visual indicators provide explicit mirror coordination correction that may occur with ultrasound imaging.
  • the console 102 may render the display screen 906 that includes a target vessel image 904′ and a secondary vessel image 905′ as well as a center line 908.
  • mirror coordination correction indicators 912 ("Right", "Left", and corresponding arrows) may be displayed, which indicate a direction to move the probe 106 in order to center the probe 106 over the target vessel 904 (or the secondary vessel 905). It is noted that the features illustrated in FIG. 9C may be combined with those of FIG. 9B.
  • Referring to FIG. 9D, the ultrasound probe 106 imaging a target vessel of the patient P and configured to provide feedback to a clinician when the probe 106 is centered over the target vessel is shown in accordance with some embodiments.
  • FIG. 9 D illustrates the probe 106 as discussed with respect to FIG. 9 A while showing an embodiment of possible feedback when the probe 106 is centered over the target vessel 904 .
  • the center line 908 is shown in a dotted format merely to illustrate the center of the probe 106 .
  • feedback may be provided to the clinician that includes haptic feedback from both sides of the probe 106 and/or the lighting of both lights (e.g., light emitting diodes, LEDs) 902 A, 902 B.
  • the haptic feedback may differ from that provided when not centered over the target vessel 904 (e.g., when not centered, short pulses may be provided from a single side but when centered, one long pulse from both sides may be provided).
  • the lights 902 A, 902 B may blink in one situation and hold steady in another.
  • the feedback provided by the probe 106 may be customizable and/or dynamically adjusted prior to each use.
  • any of the systems disclosed herein may be used within a medical facility (e.g., a hospital, a clinic, an urgent care facility, etc.) such that a plurality of clinicians may routinely utilize the console 102 and probe 106 .
  • the console 102 may include the functionality for a clinician to sign-in to a particular profile, where each clinician profile stores a customized (or default) set of feedback.
  • FIGS. 10A and 10B are simplified views of the ultrasound probe of the guidance system being used to guide a needle toward a vessel within the body of a patient, shown in accordance with some embodiments.
  • FIGS. 10 A- 10 B illustrate the ultrasound probe 106 of the system 100 and a needle 1020 (which may be included in system 100 ) in position and ready for insertion thereof through a skin surface of patient P to access a targeted internal body portion.
  • the probe 106 is shown with its head 1004 placed against the patient skin and producing an ultrasound beam 1006 so as to ultrasonically image a portion of a vessel 1008 beneath the skin surface of patient P.
  • the ultrasonic image of the vessel 1008 can be depicted on the display 104 of the console 102 .
  • the system 100 is configured to detect the position, orientation, and movement of the needle 1020 .
  • the sensor array 1000 of the probe 106 is configured to detect a magnetic field of the magnetic element 1024 included with the needle 1020 .
  • Each of the sensors 1002 of the sensor array 1000 is configured to spatially detect the magnetic element 1024 in three-dimensional space.
  • Magnetic field strength data of the needle's magnetic element 1024 sensed by each of the sensors 1002 is forwarded to a processor, such as the processor 116 of the console 102 (FIG. 2).
  • the position of the magnetic element 1024 in X, Y, and Z coordinate space with respect to the sensor array 1000 can be determined by the system 100 using the magnetic field strength data sensed by the sensors 1002 .
  • FIG. 10A shows that the pitch of the magnetic element 1024 can also be determined.
  • FIG. 10 B shows that the yaw of the magnetic element can be determined.
  • Suitable circuitry of the probe 106, the console 102, or another component of the system can provide the calculations necessary for such position/orientation determination.
  • The magnetic element 1024 can be tracked using the teachings of one or more of the U.S. patents incorporated by reference herein.
  • The above position and orientation information determined by the system 100, together with the length of the cannula 1022 and the position of the magnetic element 1024 with respect to the distal needle tip as known by or input into the system, enables the system 100 to accurately determine the location and orientation of the entire length of the needle 1020 with respect to the sensor array 1000.
  • the distance between the magnetic element 1024 and the distal needle tip is known by or input into the system 100 . This in turn enables the system 100 to superimpose an image of the needle 1020 on to an image produced by the ultrasound beam 1006 of the probe 106 .
  • FIGS. 11A and 11B show possible screenshots for depiction on the display of the guidance system, showing the position and orientation of a needle, in accordance with some embodiments.
  • FIGS. 11 A and 11 B show examples of a superimposition of the needle onto an ultrasound image.
  • FIGS. 11 A and 11 B each show a screenshot 1030 that can be depicted on the display 104 of the console 102 , for instance.
  • an ultrasound image 1032 is shown, including depiction of the skin surface of patient P, and the subcutaneous vessel 1008 (area 1039 ).
  • The ultrasound image 1032 corresponds to an image acquired by the ultrasound beam 1006 shown in FIGS. 10A and 10B, for instance.
  • the screenshot 1030 further shows a needle image 1034 representing the position and orientation of the actual needle 1020 as determined by the system 100 as described above. Because the system is able to determine the location and orientation of the needle 1020 with respect to the sensor array 1000 , the system is able to accurately determine the position and orientation of the needle 1020 with respect to the ultrasound image 1032 and superimpose it thereon for depiction as the needle image 1034 on the display 104 . Coordination of the positioning of the needle image 1034 on the ultrasound image 1032 is performed by suitable algorithms executed by the processor 116 or other suitable component of the system 100 .
  • FIG. 11 A shows that in one embodiment the system 100 can depict a projected path 1036 based on the current position and orientation of the needle 1020 as depicted by the needle image 1034 .
  • the projected path 1036 assists a clinician in determining whether the current orientation of the needle 1020 , as depicted by the needle image 1034 on the display 104 , will result in arriving at the desired internal body portion target, such as the vessel 1008 .
  • As the needle 1020 is moved, the projected path 1036 is correspondingly modified by the system 100.
  • FIG. 11 B shows that, in one embodiment, the screenshot 1030 can be configured such that the ultrasound image 1032 and the needle image 1034 are oriented so as to be displayed in a three-dimensional aspect.
  • the screenshots 1030 are merely examples of possible depictions produced by the system 100 for display.
  • Aural information, such as beeps, tones, etc., may additionally be provided to the clinician by the system 100.
  • haptic feedback may be provided to the clinician via the probe 106 in a similar manner as discussed above with respect to at least FIGS. 9 A- 9 D .
  • The following describes operation of the system 100 in guiding a needle or other medical device in connection with ultrasonic imaging of a targeted internal body portion (“target”) of a patient, according to one embodiment.
  • With the magnetic element-equipped needle 1020 positioned a suitable distance (e.g., two or more feet) away from the ultrasound probe 106 including the sensor array 1000, the probe is employed to ultrasonically image, for depiction on the display 104 of the system 100, the target within the patient that the needle is intended to intersect via percutaneous insertion.
  • the needle 1020 is then brought into the range of the sensors 1002 of the sensor array 1000 of the probe 106 .
  • Each of the sensors 1002 detects the magnetic field strength associated with the magnetic element 1024 of the needle 1020 , which data is forwarded to the processor 116 .
  • Algorithms are performed by the processor 116 to calculate a magnetic field strength of the magnetic element 1024 of the needle 1020 at predicted points in space in relation to the probe 106.
  • The processor 116 compares the actual magnetic field strength data detected by the sensors 1002 to the calculated field strength values (detail of this process is further described by the U.S. patents identified above). This process can be iteratively performed until the calculated value for a predicted point matches the measured data; a simplified sketch of this iterative matching appears after this list. Once this match occurs, the magnetic element 1024 has been positionally located in three-dimensional space. Using the magnetic field strength data as detected by the sensors 1002, the pitch and yaw (i.e., orientation) of the magnetic element 1024 can also be determined.
  • FIG. 12 is a flow diagram illustrating various stages of a method for guiding a needle to a desired target within the body of a patient, in accordance with some embodiments.
  • Each block illustrated in FIG. 12 represents an operation performed in the method 1200 , which begins at stage 1202 where a targeted internal body portion of a patient is imaged by an imaging system, such as an ultrasound imaging device for instance.
  • a detectable characteristic of a medical component such as a needle is sensed by one or more sensors included with the imaging system.
  • the detectable characteristic of the needle is a magnetic field of the magnetic element 1024 included with the needle 1020 and the sensors are magnetic sensors included in the sensor array 1000 included with the ultrasound probe 106 .
  • A position of the medical component with respect to the targeted internal body portion is determined in at least two spatial dimensions via sensing of the detectable characteristic. As described above, such determination is made in the present embodiment by the processor 116 of the console 102.
  • an image representing the position of the medical component is combined with the image of the targeted internal body portion for depiction on a display.
  • directional feedback is provided to the clinician directing movement (or confirming location) of an ultrasound probe utilized in capturing the image of the internal body portion. The directional feedback may be any as discussed above.
  • Stage 1212 shows that stages 1204 - 1208 can be iteratively repeated to depict advancement or other movement of the medical component with respect to the imaged target, such as percutaneous insertion of the needle 1020 toward the vessel 1008 ( FIGS. 11 A, 11 B ), for instance.
  • the processor 116 or other suitable component can calculate additional aspects, including the area of image 1039 and the target 1038 ( FIGS. 11 A, 11 B ) for depiction on the display 104 .
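The iterative matching of measured and calculated field strengths summarized above can be illustrated with a brief sketch. The following Python fragment is illustrative only and is not the claimed method: a point-dipole field model and a coarse grid search stand in for the algorithms described in the incorporated patents, and the sensor layout, magnetic moment, and grid bounds are assumed values.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_field(m, r):
    """Magnetic flux density of a point dipole with moment m (A*m^2) at offset r (m)."""
    dist = np.linalg.norm(r)
    r_hat = r / dist
    return MU0 / (4 * np.pi * dist**3) * (3 * np.dot(m, r_hat) * r_hat - m)

def locate_element(sensor_positions, measured_fields, moment, search_grid):
    """Pick the candidate position whose predicted sensor readings best match the measurements."""
    best_pos, best_err = None, np.inf
    for candidate in search_grid:
        predicted = np.array([dipole_field(moment, s - candidate) for s in sensor_positions])
        err = np.sum((predicted - measured_fields) ** 2)
        if err < best_err:
            best_pos, best_err = candidate, err
    return best_pos

# Example: four sensors in a square array on the probe head, element 30 mm below the center.
sensors = np.array([[0.01, 0.01, 0], [-0.01, 0.01, 0], [-0.01, -0.01, 0], [0.01, -0.01, 0]])
true_pos = np.array([0.0, 0.0, -0.03])
moment = np.array([0.0, 0.0, 0.05])
measured = np.array([dipole_field(moment, s - true_pos) for s in sensors])

grid = [np.array([x, y, z]) for x in np.linspace(-0.02, 0.02, 9)
        for y in np.linspace(-0.02, 0.02, 9) for z in np.linspace(-0.05, -0.01, 9)]
print(locate_element(sensors, measured, moment, grid))  # approximately [0, 0, -0.03]
```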

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Gynecology & Obstetrics (AREA)
  • Vascular Medicine (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

Disclosed is an ultrasound probe including an array of ultrasonic transducers and an orientation system, wherein the orientation system obtains orientation information of the ultrasound probe. Also disclosed is a console for communicating with the ultrasound probe, the console including one or more processors and a non-transitory computer-readable medium having stored thereon logic that, when executed by the one or more processors, causes operations including: obtaining the orientation information, performing an identification process on the ultrasound signals to identify an anatomical target (target vessel), determining, based on the orientation information, a direction of movement required by the ultrasound probe to place the ultrasound probe at a position relative to the anatomical target (e.g., over the anatomical target), and initiating provision of feedback to a user of the ultrasound probe indicating the direction of movement required by the ultrasound probe to place the ultrasound probe at the position relative to the anatomical target.

Description

BACKGROUND
Ultrasound imaging is a widely accepted tool for guiding interventional instruments such as needles to targets such as blood vessels or organs in the human body. In order to successfully guide, for example, a needle to a blood vessel using ultrasound imaging, the needle is monitored in real-time both immediately before and after a percutaneous puncture in order to enable a clinician to determine the distance and the orientation of the needle to the blood vessel and ensure successful access thereto. However, through inadvertent movement of an ultrasound probe during the ultrasound imaging, the clinician can lose both the blood vessel and the needle, which can be difficult and time consuming to find again. In addition, it is often easier to monitor the distance and orientation of the needle immediately before the percutaneous puncture with a needle plane including the needle perpendicular to an image plane of the ultrasound probe. And it is often easier to monitor the distance and orientation of the needle immediately after the percutaneous puncture with the needle plane parallel to the image plane. As with inadvertently moving the ultrasound probe, the clinician can lose both the blood vessel and the needle when adjusting the image plane before and after the percutaneous puncture, which can be difficult and time consuming to find again. What is needed are ultrasound imaging systems and methods thereof that can dynamically adjust the image plane to facilitate guiding interventional instruments to targets in at least the human body.
Doppler ultrasound is a noninvasive approach to estimating the blood flow through your blood vessels by bouncing high-frequency sound waves (ultrasound) off circulating red blood cells. A doppler ultrasound can estimate how fast blood flows by measuring the rate of change in its pitch (frequency). Doppler ultrasound may be performed as an alternative to more-invasive procedures, such as angiography, which involves injecting dye into the blood vessels so that they show up clearly on X-ray images. Doppler ultrasound may help diagnose many conditions, including blood clots, poorly functioning valves in your leg veins, which can cause blood or other fluids to pool in your legs (venous insufficiency), heart valve defects and congenital heart disease, a blocked artery (arterial occlusion), decreased blood circulation into your legs (peripheral artery disease), bulging arteries (aneurysms), and narrowing of an artery, such as in your neck (carotid artery stenosis). Doppler ultrasound may also detect a direction of blood flow within a blood vessel.
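For reference, the relationship between a measured Doppler frequency shift and the estimated flow velocity can be expressed in a few lines; the transmit frequency, insonation angle, and speed of sound below are illustrative assumptions only.

```python
import math

def doppler_velocity(f_shift_hz, f_transmit_hz, angle_deg, c_tissue=1540.0):
    """Estimate blood flow speed (m/s) from the Doppler shift.

    v = (c * f_shift) / (2 * f_transmit * cos(theta)), with c the speed of sound
    in soft tissue (~1540 m/s) and theta the beam-to-flow angle.
    """
    return (c_tissue * f_shift_hz) / (2.0 * f_transmit_hz * math.cos(math.radians(angle_deg)))

# Illustrative example: a 1.3 kHz shift at 5 MHz with a 60-degree insonation angle.
print(round(doppler_velocity(1300, 5e6, 60), 3), "m/s")  # ~0.4 m/s
```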
SUMMARY
Disclosed herein is an ultrasound imaging system including an ultrasound probe including an array of ultrasonic transducers and an orientation system, wherein the ultrasonic transducers are configured to emit generated ultrasound signals into a patient, receive reflected ultrasound signals from the patient, and convert the reflected ultrasound signals into corresponding electrical signals of the ultrasound signals for processing into ultrasound images, and wherein the orientation system is configured to obtain orientation information of the ultrasound probe, and a console configured to communicate with the ultrasound probe, the console including one or more processors and a non-transitory computer-readable medium having stored thereon logic that, when executed by the one or more processors, causes operations including: obtaining the orientation information; performing an identification process on the ultrasound signals to identify an anatomical target (e.g., a target vessel); determining, based on the orientation information, a direction of movement required by the ultrasound probe to place the ultrasound probe at a predetermined position relative to the anatomical target (e.g., to center the ultrasound probe over the anatomical target); and initiating provision of feedback to a user of the ultrasound probe indicating the direction of movement required by the ultrasound probe to center the ultrasound probe over the anatomical target.
In some embodiments, the orientation information indicates positioning of the ultrasound probe on a Cartesian coordinate system relative to a skin surface of the patient. In some embodiments, the ultrasound probe includes an inertial measurement unit configured to obtain the orientation information. In some embodiments, the ultrasound probe includes an optical fiber having one or more core fibers, wherein each of the one or more core fibers includes a plurality of sensors distributed along a longitudinal length of a corresponding core fiber and each sensor of the plurality of sensors is configured to (i) reflect a light signal of a different spectral width based on received incident light, and (ii) change a characteristic of the reflected light signal for use in determining a physical state of the optical fiber.
In some embodiments, the operations further include: providing a broadband incident light signal to the optical fiber, receiving a reflected light signal of the broadband incident light, wherein the reflected light signal is reflected from red blood cells within the patient body, and processing the reflected light signal to determine the orientation information. In some embodiments, the identification process includes applying a trained machine learning model configured to detect anatomical features within the ultrasound images and provide a bounding box around the anatomical target. In some embodiments, the provision of the feedback includes providing haptic feedback from a first side of the ultrasound probe, where the first side corresponds to the direction of movement required by the ultrasound probe to center the ultrasound probe over the anatomical target. In some embodiments, the system includes a needle including a second orientation system configured to obtain needle orientation information, and wherein the operations further include: determining, based on the needle orientation information, an orientation of the needle relative to the ultrasound probe, determining a trajectory of the needle, and generating a display screen illustrating the trajectory of the needle.
Also disclosed herein is a method of providing the ultrasound imaging system discussed above and providing instructions to cause performance of the operations also discussed above. Additionally, disclosed herein is a non-transitory, computer-readable medium having logic stored thereon that, when executed by a processor causes performance of the operations discussed above.
These and other features of the concepts provided herein will become more apparent to those of skill in the art in view of the accompanying drawings and following description, which describe particular embodiments of such concepts in greater detail.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the disclosure are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
FIG. 1 illustrates an ultrasound imaging system and a patient in accordance with some embodiments;
FIG. 2 illustrates a block diagram of a console of the ultrasound imaging system of FIG. 1 in accordance with some embodiments;
FIG. 3A illustrates the ultrasound probe 106 of the ultrasound imaging system 100 imaging a blood vessel of the patient P in an unsterile environment 300 prior to accessing the blood vessel in accordance with some embodiments;
FIG. 3B illustrates an ultrasound image of the blood vessel of FIG. 3A on a display screen of the ultrasound imaging system in accordance with some embodiments;
FIG. 4A illustrates the ultrasound probe of the ultrasound imaging system imaging a blood vessel of the patient P in a sterile environment prior to accessing and/or while accessing the blood vessel in accordance with some embodiments;
FIG. 4B illustrates an ultrasound image of the blood vessel of FIG. 4A on a display screen of the ultrasound imaging system in accordance with some embodiments;
FIG. 5 illustrates the ultrasound probe 106 as illustrated in FIG. 3A that further includes an inertial measurement unit (“IMU”) 158 in accordance with some embodiments;
FIG. 6A illustrates the ultrasound probe 106 modified to include a multi-core optical fiber in accordance with some embodiments;
FIG. 6B illustrates the ultrasound probe 106 of FIG. 6A, a needle 606, and an exemplary needle trajectory in accordance with some embodiments;
FIG. 7 illustrates the ultrasound probe 106 fixedly coupled to a mechanical arm 700 having a series of known location points along an arm length, positioned adjacent the patient P, in accordance with some embodiments;
FIG. 8 illustrates an ultrasound imaging system 800 that includes alternative reality functionality in accordance with some embodiments;
FIG. 9A illustrates the ultrasound probe 106 imaging a target vessel of the patient P and configured to provide feedback to a clinician that directs movement of the probe 106 to center the probe 106 over the target vessel in accordance with some embodiments;
FIG. 9B is a first embodiment of a display screen illustrating the ultrasound imaging in real-time including a center line of the ultrasound probe 106 and a visual indication of a direction to move the probe 106 to center the probe 106 over the target vessel in accordance with some embodiments;
FIG. 9C is a second embodiment of a display screen illustrating the ultrasound imaging in real-time including a center line of the ultrasound probe 106 and a visual indication of a direction to move the probe 106 to center the probe 106 over the target vessel in accordance with some embodiments;
FIG. 9D illustrates the ultrasound probe 106 imaging a target vessel of the patient P and configured to provide feedback to a clinician when the probe 106 is centered over the target vessel in accordance with some embodiments;
FIGS. 10A and 10B are simplified views of the ultrasound probe of the guidance system being used to guide a needle toward a vessel within the body of a patient in accordance with some embodiments;
FIGS. 11A and 11B show possible screenshots for depiction on the display of the guidance system, showing the position and orientation of a needle in accordance with some embodiments; and
FIG. 12 is a flow diagram illustrating various stages of a method for guiding a needle to a desired target within the body of a patient in accordance with some embodiments.
DESCRIPTION
Before some particular embodiments are disclosed in greater detail, it should be understood that the particular embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment disclosed herein can have features that can be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments disclosed herein.
Regarding terms used herein, it should also be understood the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. Labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
With respect to “proximal,” a “proximal portion” or a “proximal-end portion” of, for example, a catheter disclosed herein includes a portion of the catheter intended to be near a clinician when the catheter is used on a patient. Likewise, a “proximal length” of, for example, the catheter includes a length of the catheter intended to be near the clinician when the catheter is used on the patient. A “proximal end” of, for example, the catheter includes an end of the catheter intended to be near the clinician when the catheter is used on the patient. The proximal portion, the proximal-end portion, or the proximal length of the catheter can include the proximal end of the catheter; however, the proximal portion, the proximal-end portion, or the proximal length of the catheter need not include the proximal end of the catheter. That is, unless context suggests otherwise, the proximal portion, the proximal-end portion, or the proximal length of the catheter is not a terminal portion or terminal length of the catheter.
With respect to “distal,” a “distal portion” or a “distal-end portion” of, for example, a catheter disclosed herein includes a portion of the catheter intended to be near or in a patient when the catheter is used on the patient. Likewise, a “distal length” of, for example, the catheter includes a length of the catheter intended to be near or in the patient when the catheter is used on the patient. A “distal end” of, for example, the catheter includes an end of the catheter intended to be near or in the patient when the catheter is used on the patient. The distal portion, the distal-end portion, or the distal length of the catheter can include the distal end of the catheter; however, the distal portion, the distal-end portion, or the distal length of the catheter need not include the distal end of the catheter. That is, unless context suggests otherwise, the distal portion, the distal-end portion, or the distal length of the catheter is not a terminal portion or terminal length of the catheter.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.
As set forth above, ultrasound imaging systems and methods thereof are needed that can dynamically adjust the image plane to facilitate guiding interventional instruments to targets in at least the human body. Disclosed herein are dynamically adjusting ultrasound imaging systems and methods thereof.
Referring now to FIG. 1, an ultrasound imaging system 100, a needle 112, and a patient P are shown in accordance with some embodiments. FIG. 2 illustrates a block diagram of the ultrasound imaging system 100 in accordance with some embodiments. The discussion below may be made with reference to both FIGS. 1-2. As shown, the ultrasound imaging system 100 includes a console 102, the display screen 104, and the ultrasound probe 106. The ultrasound imaging system 100 is useful for imaging a target such as a blood vessel or an organ within a body of the patient P prior to a percutaneous puncture with the needle 112 for inserting the needle 112 or another medical device into the target and accessing the target, as well as imaging a target during the insertion process to provide confirmation of the needle 112. Indeed, the ultrasound imaging system 100 is shown in FIG. 1 in a general relationship to the patient P during an ultrasound-based medical procedure to place a catheter 108 into the vasculature of the patient P through a skin insertion site S created by a percutaneous puncture with the needle 112. It should be appreciated that the ultrasound imaging system 100 can be useful in a variety of ultrasound-based medical procedures other than catheterization. For example, the percutaneous puncture with the needle 112 can be performed to biopsy tissue of an organ of the patient P.
The console 102 houses a variety of components of the ultrasound imaging system 100, and it is appreciated the console 102 can take any of a variety of forms. A processor 116 and memory 118 such as random-access memory (“RAM”) or non-volatile memory (e.g., electrically erasable programmable read-only memory (“EEPROM”)) are included in the console 102 for controlling functions of the ultrasound imaging system 100. The processor may execute various logic operations or algorithms during operation of the ultrasound imaging system 100 in accordance with executable logic (“instructions”) 120 stored in the memory 118 for execution by the processor 116. For example, the console 102 is configured to instantiate by way of the logic 120 one or more processes for dynamically adjusting a distance of activated ultrasonic transducers 148 from a predefined target (e.g., blood vessel) or area, an orientation of the activated ultrasonic transducers 148 to the predefined target or area, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the predefined target or area, as well as process electrical signals from the ultrasound probe 106 into ultrasound images. Dynamically adjusting the activated ultrasonic transducers 148 uses ultrasound imaging data, magnetic-field data, shape-sensing data, or a combination thereof received by the console 102 for activating certain ultrasonic transducers of a 2-D array of the ultrasonic transducers 148 or moving those already activated in a linear array of the ultrasonic transducers 148. A digital controller/analog interface 122 is also included with the console 102 and is in communication with both the processor 116 and other system components to govern interfacing between the ultrasound probe 106 and other system components set forth herein.
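As one non-limiting illustration of the kind of selection the logic 120 might perform when dynamically adjusting the activated ultrasonic transducers 148, the short sketch below chooses a window of elements centered over a target's lateral position. The array size, element pitch, and aperture are assumed values, and the function name is hypothetical.

```python
import numpy as np

def select_active_elements(target_x_mm, n_elements=128, pitch_mm=0.3, aperture=32):
    """Return indices of the transducer elements centered over a target's lateral position.

    Assumes a linear array (or one row of a 2-D array) centered at x = 0 with uniform pitch.
    """
    element_x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch_mm
    center = int(np.argmin(np.abs(element_x - target_x_mm)))
    start = max(0, min(center - aperture // 2, n_elements - aperture))
    return np.arange(start, start + aperture)

print(select_active_elements(target_x_mm=4.5))  # a 32-element window shifted toward +x
```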
The ultrasound imaging system 100 further includes ports 124 for connection with additional components such as optional components 126 including a printer, storage media, keyboard, etc. The ports 124 can be universal serial bus (“USB”) ports, though other types of ports can be used for this connection or any other connections shown or described herein. A power connection 128 is included with the console 102 to enable operable connection to an external power supply 130. An internal power supply 132 (e.g., a battery) can also be employed either with or exclusive of the external power supply 130. Power management circuitry 134 is included with the digital controller/analog interface 122 of the console 102 to regulate power use and distribution.
The display screen 104 is integrated into the console 102 to provide a GUI and display information for a clinician during use, such as one or more ultrasound images of the target or the patient P attained by the ultrasound probe 106. In addition, the ultrasound imaging system 100 enables the distance and orientation of a magnetized medical device such as the needle 112 to be superimposed in real-time atop an ultrasound image of the target, thus enabling a clinician to accurately guide the magnetized medical device to the intended target. Notwithstanding the foregoing, the display screen 104 can alternatively be separate from the console 102 and communicatively coupled thereto. A console button interface 136 and control buttons 110 (see FIG. 1) included on the ultrasound probe 106 can be used to immediately call up a desired mode to the display screen 104 by the clinician for assistance in an ultrasound-based medical procedure. In some embodiments, the display screen 104 is an LCD device.
The ultrasound probe 106 is employed in connection with ultrasound-based visualization of a target such as a blood vessel (see FIG. 3A) in preparation for inserting the needle 112 or another medical device into the target. Such visualization gives real-time ultrasound guidance and assists in reducing complications typically associated with such insertion, including inadvertent arterial puncture, hematoma, pneumothorax, etc. As described in more detail below, the ultrasound probe 106 is configured to provide to the console 102 electrical signals corresponding to both the ultrasound imaging data, the magnetic-field data, the shape-sensing data, or a combination thereof for the real-time ultrasound guidance.
Optionally, a stand-alone optical interrogator 154 can be communicatively coupled to the console 102 by way of one of the ports 124. Alternatively, the console 102 can include an integrated optical interrogator integrated into the console 102. Such an optical interrogator is configured to emit input optical signals into a companion optical-fiber stylet 156 for shape sensing with the ultrasound imaging system 100, which optical-fiber stylet 156, in turn, is configured to be inserted into a lumen of a medical device such as the needle 112 and convey the input optical signals from the optical interrogator 154 to a number of FBG sensors along a length of the optical-fiber stylet 156. The optical interrogator 154 is also configured to receive reflected optical signals conveyed by the optical-fiber stylet 156 reflected from the number of FBG sensors, the reflected optical signals indicative of a shape of the optical-fiber stylet 156. The optical interrogator 154 is also configured to convert the reflected optical signals into corresponding electrical signals for processing by the console 102 into distance and orientation information with respect to the target for dynamically adjusting a distance of the activated ultrasonic transducers 148, an orientation of the activated ultrasonic transducers 148, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target or the medical device when it is brought into proximity of the target.
For example, the distance and orientation of the activated ultrasonic transducers 148 can be adjusted with respect to a blood vessel as the target. Indeed, an image plane can be established by the activated ultrasonic transducers 148 being perpendicular or parallel to the blood vessel in accordance with an orientation of the blood vessel. As used herein, the term “orientation information” may refer to the positioning of the probe 106 (or other medical instrument) in three dimensions relative to a fixed axis. In some embodiments, the fixed axis may refer to a perpendicular axis extending distally from a surface of a patient P (e.g., which may be representative of the Z-axis of a Cartesian coordinate system). Thus, orientation information of the probe 106 provides a geometric view of an angle of the ultrasound probe relative to the skin surface of patient P. Additionally, orientation information may provide an indication as to whether the ultrasound probe 106 is being held in a transverse or longitudinal orientation relative to a target vessel of the patient P.
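A minimal sketch of how such orientation information could be reduced to a probe tilt angle relative to the skin-normal axis is shown below; it assumes the orientation system reports a unit vector along the probe's long axis in patient-referenced coordinates.

```python
import numpy as np

def probe_tilt_deg(probe_axis, skin_normal=(0.0, 0.0, 1.0)):
    """Angle between the probe's long axis and the axis perpendicular to the skin surface."""
    a = np.asarray(probe_axis, dtype=float)
    n = np.asarray(skin_normal, dtype=float)
    cos_angle = np.dot(a, n) / (np.linalg.norm(a) * np.linalg.norm(n))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

print(probe_tilt_deg([0.17, 0.0, 0.98]))  # roughly 10 degrees off perpendicular
```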
FIG. 2 shows that the ultrasound probe 106 further includes a button and memory controller 138 for governing button and ultrasound probe 106 operation. The button and memory controller 138 can include non-volatile memory (e.g., EEPROM). The button and memory controller 138 is in operable communication with a probe interface 140 of the console 102, which includes an input/output (“I/O”) component 142 for interfacing with the ultrasonic transducers 148 and a button and memory I/O component 144 for interfacing with the button and memory controller 138.
Also as seen in FIG. 2 , the ultrasound probe 106 can include a magnetic-sensor array 146 for detecting a magnetized medical device such as the needle 112 during ultrasound-based medical procedures. The magnetic-sensor array 146 includes a number of magnetic sensors 150 embedded within or included on a housing of the ultrasound probe 106. The magnetic sensors 150 are configured to detect a magnetic field or a disturbance in a magnetic field as magnetic signals associated with the magnetized medical device when it is in proximity to the magnetic-sensor array 146. The magnetic sensors 150 are also configured to convert the magnetic signals from the magnetized medical device (e.g., the needle 112) into electrical signals for the console 102 to process into distance and orientation information for the magnetized medical device with respect to the predefined target, as well as for display of an iconographic representation of the magnetized medical device on the display screen 104. Thus, the magnetic-sensor array 146 enables the ultrasound imaging system 100 to track the needle 112 or the like.
Though configured here as magnetic sensors, it is appreciated that the magnetic sensors 150 can be sensors of other types and configurations. Also, though they are described herein as included with the ultrasound probe 106, the magnetic sensors 150 of the magnetic-sensor array 146 can be included in a component separate from the ultrasound probe 106 such as a sleeve into which the ultrasound probe 106 is inserted or even a separate handheld device. The magnetic sensors 150 can be disposed in an annular configuration about the probe head 114 of the ultrasound probe 106, though it is appreciated that the magnetic sensors 150 can be arranged in other configurations, such as in an arched, planar, or semi-circular arrangement.
Each magnetic sensor of the magnetic sensors 150 includes three orthogonal sensor coils for enabling detection of a magnetic field in three spatial dimensions. Such 3-dimensional (“3-D”) magnetic sensors can be purchased, for example, from Honeywell Sensing and Control of Morristown, NJ. Further, the magnetic sensors 150 are configured as Hall-effect sensors, though other types of magnetic sensors could be employed. Further, instead of 3-D sensors, a plurality of 1-dimensional (“1-D”) magnetic sensors can be included and arranged as desired to achieve 1-, 2-, or 3-D detection capability.
As shown in FIG. 2 , the ultrasound probe 106 can further include an inertial measurement unit (“IMU”) 158 or any one or more components thereof for inertial measurement selected from an accelerometer 160, a gyroscope 162, and a magnetometer 164 configured to provide positional-tracking data of the ultrasound probe 106 to the console 102 for stabilization of an image plane. The processor 116 is further configured to execute the logic 120 for processing the positional-tracking data for adjusting the distance of the activated ultrasonic transducers 148 from the target, the orientation of the activated ultrasonic transducers 148 to the target, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target to maintain the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target when the ultrasound probe 106 is inadvertently moved with respect to the target.
It is appreciated that a medical device of a magnetizable material enables the medical device (e.g., the needle 112) to be magnetized by a magnetizer, if not already magnetized, and tracked by the ultrasound imaging system 100 when the magnetized medical device is brought into proximity of the magnetic sensors 150 of the magnetic-sensor array 146 or inserted into the body of the patient P during an ultrasound-based medical procedure. Such magnetic-based tracking of the magnetized medical device assists the clinician in placing a distal tip thereof in a desired location, such as in a lumen of a blood vessel, by superimposing a simulated needle image representing the real-time distance and orientation of the needle 112 over an ultrasound image of the body of the patient P being accessed by the magnetized medical device. Such a medical device can be stainless steel such as SS 304 stainless steel; however, other suitable needle materials that are capable of being magnetized can be employed. So configured, the needle 112 or the like can produce a magnetic field or create a magnetic disturbance in a magnetic field detectable as magnetic signals by the magnetic-sensor array 146 of the ultrasound probe 106 so as to enable the distance and orientation of the magnetized medical device to be tracked by the ultrasound imaging system 100 for dynamically adjusting the distance of the activated ultrasonic transducers 148, an orientation of the activated ultrasonic transducers 148, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the magnetized medical device. In some embodiments, the needle 112 can be tracked using the teachings of one or more patents of U.S. Pat. Nos. 5,775,322; 5,879,297; 6,129,668; 6,216,028; and 6,263,230, each of which is incorporated by reference in its entirety into this application.
In some embodiments, the distance and orientation information determined by the ultrasound imaging system 100, together with an entire length of the magnetized medical device, as known by or input into the ultrasound imaging system 100, enables the ultrasound imaging system 100 to accurately determine the distance and orientation of the entire length of the magnetized medical device, including a distal tip thereof, with respect to the magnetic-sensor array 146. This, in turn, enables the ultrasound imaging system 100 to superimpose an image of the needle 112 on an ultrasound image produced by the ultrasound beam 152 of the ultrasound probe 106 on the display screen 104. For example, the ultrasound image depicted on the display screen 104 can include depiction of the surface of the skin of the patient P and a subcutaneous blood vessel thereunder to be accessed by the needle 112, as well as a depiction of the magnetized medical device as detected by the ultrasound imaging system 100 and its orientation to the vessel. The ultrasound image corresponds to an image acquired by the ultrasound beam 152 of the ultrasound probe 106. It should be appreciated that only a portion of an entire length of the magnetized medical device is magnetized and, thus, tracked by the ultrasound imaging system 100.
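A simplified sketch of this superimposition follows; it assumes the magnetic element's position and pitch/yaw have already been resolved, and the element-to-tip offset, pixel spacing, and coordinate conventions are illustrative assumptions rather than the system's actual implementation.

```python
import numpy as np

def needle_tip_position(element_pos, pitch_deg, yaw_deg, tip_offset_mm):
    """Locate the distal needle tip from the magnetic element's position and orientation.

    Coordinates: x lateral, z up (away from the patient), millimeters. The
    element-to-tip distance is known by, or input into, the system.
    """
    pitch, yaw = np.radians(pitch_deg), np.radians(yaw_deg)
    direction = np.array([np.cos(pitch) * np.cos(yaw),
                          np.cos(pitch) * np.sin(yaw),
                          -np.sin(pitch)])            # unit vector along the needle shaft
    return np.asarray(element_pos) + tip_offset_mm * direction

def to_image_pixels(point_mm, mm_per_pixel=0.1, image_width_px=640):
    """Map a point in the probe's image plane (lateral x, depth -z) to pixel coordinates."""
    col = int(point_mm[0] / mm_per_pixel + image_width_px / 2)
    row = int(-point_mm[2] / mm_per_pixel)            # depth increases downward on screen
    return row, col

tip = needle_tip_position(element_pos=[-20.0, 0.0, 15.0], pitch_deg=30, yaw_deg=0, tip_offset_mm=40)
print(tip, to_image_pixels(tip))
```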
During operation of the ultrasound imaging system 100, the probe head 114 of the ultrasound probe 106 is placed against skin of the patient P. An ultrasound beam 152 is produced so as to ultrasonically image a portion of a target such as a blood vessel beneath a surface of the skin of the patient P. (See FIGS. 3A, 4A.) The ultrasonic image of the blood vessel can be depicted and stabilized on the display screen 104 of the ultrasound imaging system 100 as shown in FIGS. 3B, 4B despite inadvertent movements of the ultrasound probe 106. Note that further details regarding structure and operation of the ultrasound imaging system 100 can be found in U.S. Pat. No. 9,456,766, titled “Apparatus for Use with Needle Insertion Guidance System,” which is incorporated by reference in its entirety into this application.
FIG. 3A illustrates the ultrasound probe 106 of the ultrasound imaging system 100 imaging a blood vessel of the patient P in an unsterile environment 300 prior to accessing the blood vessel in accordance with some embodiments. The imaging performed in FIG. 3A may be referred to as pre-scan imaging. FIG. 3B illustrates an ultrasound image of the blood vessel of FIG. 3A (a “pre-scan image”) 306 on a display screen 104 of the ultrasound imaging system 100 in accordance with some embodiments.
The pre-scan image 306 may be obtained at a first time that is prior to preparing the patient P and the surrounding area for sterilization, where the pre-scan image 306 may be stored in the memory 118 of the console 102. The intended purpose of obtaining the pre-scan image 306 is to allow a clinician to obtain an image of the target vessel 302 using the ultrasound probe 106 without any constraints that may be imposed in order to maintain a sterile environment. As will be discussed below, the pre-scan image may then be used as a reference image to compare to the live scan image taken in a sterile field, thereby allowing the clinician to confirm proper placement and orientation of the ultrasound probe 106.
In some embodiments, following operations to obtain, capture, and optionally to store, the pre-scan image, vessel identification logic 200 may be executed by the processor 116 causing performance of operations to identify a visual representation of the target vessel 302, such as the target vessel image 308 of FIG. 3B, within the pre-scan image 306 and/or detect other features of the pre-scan image 306. Other features detected may include those anatomical features typically visualized in an ultrasound image such as blood vessels, bones, muscles, tendons, ligaments, nerves, joints, etc.
The vessel identification logic 200 may be configured, upon execution by the processor 116, to cause performance of operations including computerized, automated analysis of the pre-scan image 306 to identify the target vessel image 308 through machine learning operations (e.g., application of a trained machine learning model). For instance, computerized, automated analysis may include operations comprising object recognition such as object detection methods, where the vessel identification logic 200 parses the pre-scan image 306 to locate a presence of one or more objects (e.g., the target vessel 302) with a bounding box and classify (label) the object within the bounding box. In order to perform such operations, the vessel identification logic 200 may include a machine learning model trained through supervised machine learning using a labeled data set. For example, a labeled data set may include ultrasound images that were previously captured (“historical data”) that have also been labeled, e.g., by another trained machine learning model and/or by a subject matter expert. The machine learning model is then trained on the labeled historical data so that, upon completion of the training, the machine learning model may detect objects within a new image (e.g., the pre-scan image 306 and a live scan image discussed below with respect to FIGS. 4A-4B), place bounding boxes around the detected objects, and classify the objects. It is noted that in some embodiments, the classification step may be skipped such that the trained machine learning model is configured to output an image including bounding boxes around detected objects within the image.
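The sketch below illustrates only the selection step that could follow such a detector; the detection output format, the "vein" label, and the confidence threshold are hypothetical assumptions, and the trained model itself is not shown.

```python
def identify_target_vessel(detections, min_confidence=0.5):
    """Pick the most confident detection classified as a vein from a detector's output.

    `detections` is assumed to be a list of dicts with keys 'label', 'confidence',
    and 'box' (x_min, y_min, x_max, y_max in pixels), as produced by whatever
    trained object-detection model the vessel identification logic wraps.
    """
    veins = [d for d in detections if d["label"] == "vein" and d["confidence"] >= min_confidence]
    if not veins:
        return None
    best = max(veins, key=lambda d: d["confidence"])
    x_min, y_min, x_max, y_max = best["box"]
    center = ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)
    return {"box": best["box"], "center_px": center, "confidence": best["confidence"]}

# Illustrative detector output for one pre-scan frame.
frame_detections = [
    {"label": "artery", "confidence": 0.91, "box": (300, 120, 360, 180)},
    {"label": "vein", "confidence": 0.88, "box": (150, 100, 230, 170)},
]
print(identify_target_vessel(frame_detections))
```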
FIG. 4A illustrates the ultrasound probe 106 of the ultrasound imaging system 100 imaging a blood vessel of the patient P in a sterile environment 400 prior to accessing and/or while accessing the blood vessel in accordance with some embodiments. The imaging performed in FIG. 4A may be referred to as live scan imaging. FIG. 4B illustrates an ultrasound image of the blood vessel of FIG. 4A (a “live scan image”) on a display screen 104 of the ultrasound imaging system 100 in accordance with some embodiments.
The live scan image 406 may be obtained at a second time that is subsequent to creating a sterilized area 402 around an insertion site on the patient P (or, generally, an area on the patient P). The live scan image 406 may also be stored in the memory 118 of the console 102. As noted above, systems and methods disclosed herein may include obtaining a pre-scan image 306 with the intended purpose of allowing a clinician to use the pre-scan image 306 as a reference image to compare to the live scan image 406 (which is taken in a sterile field), thereby allowing the clinician to confirm proper placement and orientation of the ultrasound probe 106 during the live scan process, which may correspond to insertion of a medical device such as the needle 112.
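The comparison between the stored pre-scan image and the live scan image is not limited to any particular technique; the sketch below uses normalized cross-correlation purely as one assumed similarity measure and a hypothetical threshold.

```python
import numpy as np

def normalized_cross_correlation(reference, live):
    """Similarity score in [-1, 1] between the stored pre-scan image and a live frame."""
    ref = (reference - reference.mean()) / (reference.std() + 1e-9)
    liv = (live - live.mean()) / (live.std() + 1e-9)
    return float(np.mean(ref * liv))

def placement_confirmed(pre_scan, live_scan, threshold=0.7):
    """Flag whether the live view plausibly matches the saved reference view."""
    return normalized_cross_correlation(pre_scan, live_scan) >= threshold

rng = np.random.default_rng(0)
pre = rng.random((256, 256))
live = pre + 0.05 * rng.random((256, 256))   # nearly the same view
print(placement_confirmed(pre, live))         # True
```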
In some embodiments, following operations to obtain, capture, and optionally to store, the live scan image 406, the vessel identification logic 200 may be executed by the processor 116 causing performance of operations to identify a visual representation of the target vessel 302, such as the target vessel image 308, within the live scan image 406 and/or detect other features of the live scan image 406. Other features detected may include those anatomical features typically visualized in an ultrasound image such as blood vessels, bones, muscles, tendons, ligaments, nerves, joints, etc.
Referring now to FIG. 5 , the ultrasound probe 106 as illustrated in FIG. 3A that further includes an inertial measurement unit (“IMU”) 158 is shown in accordance with some embodiments. As discussed above, the IMU 158 is configured to obtain inertial measurement from any of one or more components selected from an accelerometer 160, a gyroscope 162, and a magnetometer 164. Based on the obtained inertial measurements, the IMU 158 is configured to provide positional-tracking data of the ultrasound probe 106 to the console 102 thereby enabling spatial awareness of the probe 106. The processor 116 is further configured to execute the logic 120 for processing the positional-tracking data for adjusting the distance of the activated ultrasonic transducers 148 from the target, the orientation of the activated ultrasonic transducers 148 to the target, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target to maintain the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target when the ultrasound probe 106 is inadvertently moved with respect to the target.
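As an illustration of how gyroscope and accelerometer readings might be fused into a stable tilt estimate, the following sketch implements a single-axis complementary filter; the sample period and filter gain are assumed values and do not represent the logic 120 itself.

```python
import numpy as np

def complementary_filter(tilt_deg, gyro_rate_dps, accel_g, dt=0.01, alpha=0.98):
    """One fusion step for the probe's tilt about a single axis.

    The integrated gyroscope rate is trusted short-term; the accelerometer's
    gravity reading (accel_g = (a_x, a_z)) corrects long-term drift.
    """
    gyro_estimate = tilt_deg + gyro_rate_dps * dt
    accel_estimate = np.degrees(np.arctan2(accel_g[0], accel_g[1]))
    return alpha * gyro_estimate + (1.0 - alpha) * accel_estimate

tilt = 0.0
for _ in range(100):                      # ~1 s of samples while the probe is tipped ~10 deg
    tilt = complementary_filter(tilt, gyro_rate_dps=0.0, accel_g=(0.17, 0.98))
print(round(tilt, 1))                     # converging toward the ~9.8 degree accelerometer tilt
```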
Referring to FIG. 6A, the ultrasound probe 106 modified to include a multi-core optical fiber is shown in accordance with some embodiments. The ultrasound probe 106 of FIG. 6A includes a multi-core optical fiber 600 that extends the length of a tether from the console 102 to the probe 106. Further, the multi-core optical fiber 600 may be configured in a predetermined geometry 602 at the distal end 604 of the probe 106. The predetermined geometry enables logic of the console 102 to perform shape sensing operations enabling detection of an orientation of the probe 106. More specifically, as the orientation of the predetermined geometry 602 relative to the transducers 148 is known prior to deployment of the probe 106, the ultrasound image displayed on the display 104 may be augmented with certain information, e.g., mirror coordination correction information, color-coding information, highlighting of a target vessel (see FIGS. 9A-9D), and the probe 106 may provide directional instructions via haptic feedback. Without knowing the orientation of the probe 106, such information cannot be provided.
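A highly simplified sketch of shape sensing from FBG wavelength shifts follows; the center wavelength, strain gauge factor, and core geometry are assumed values, and a full reconstruction of the predetermined geometry 602 would integrate such local bend estimates along the fiber length.

```python
import numpy as np

def core_strains(wavelength_shifts_nm, center_wavelength_nm=1550.0, gauge_factor=0.78):
    """Convert FBG wavelength shifts to axial strain for each outer core."""
    return np.asarray(wavelength_shifts_nm) / (gauge_factor * center_wavelength_nm)

def bend_from_strains(strains, core_azimuths_rad, core_radius_m=35e-6):
    """Estimate curvature magnitude (1/m) and bend direction from outer-core strains.

    Differential strain across cores offset from the fiber axis encodes the local bend.
    """
    x = np.sum(strains * np.cos(core_azimuths_rad))
    y = np.sum(strains * np.sin(core_azimuths_rad))
    scale = 2.0 / (len(strains) * core_radius_m)
    curvature = scale * np.hypot(x, y)
    direction = np.arctan2(y, x)
    return curvature, np.degrees(direction)

azimuths = np.radians([0.0, 120.0, 240.0])
shifts = np.array([0.02, -0.01, -0.01])          # nm, one sensing plane along the fiber
print(bend_from_strains(core_strains(shifts), azimuths))
```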
Referring to FIG. 6B, the ultrasound probe 106 of FIG. 6A, a needle 606, and an exemplary needle trajectory are shown in accordance with some embodiments. FIG. 6B illustrates the probe 106 positioned on the skin surface of patient P in order to image the vessel 302. FIG. 6B further illustrates a needle 606 immediately adjacent an insertion site S intended to enter the target vessel 302 at a target site 608. Further, based on the orientation information obtained via the fiber optic data (e.g., reflected light signals returned from gratings disposed along the length of the multi-core optical fiber 600), logic of the console 102 may estimate a trajectory 610 of the needle 606. The trajectory 610 along with the ultrasound image (and/or a three-dimensional rendering of the vessel 302) may be displayed on the display 104.
In some embodiments, the needle may also include a multi-core optical fiber 612 that extends the length of the needle 606 from either the console 102 or the probe 106. In such embodiments, the orientation of the needle 606 may be determined based on a shape sensing of the multi-core optical fiber 612 extending through a tether to the probe 106 (optional multi-core optical fiber 612′) or through a tether to the console 102 (optional multi-core optical fiber 612″). From the orientation of the probe 106 and the needle 606, a rendering of imaging captured by the probe 106, the target site 608, and the needle trajectory 610 may be generated and displayed on the display 104. Further, in addition to such information, knowledge of the human anatomy enables generation of a three-dimensional graphic for display on the display 104 (e.g., similar to the theoretical illustration of FIG. 6B).
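The trajectory estimate can be illustrated with a short sketch that extends the needle's sensed direction to the target depth and reports the miss distance from the target site; the coordinate frame and numbers are illustrative assumptions only.

```python
import numpy as np

def trajectory_point_at_depth(entry_point, direction, target_depth):
    """Point where a straight needle path reaches a given depth below the skin.

    `entry_point` and `direction` are in probe coordinates with +z pointing into tissue.
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    t = (target_depth - entry_point[2]) / d[2]
    return np.asarray(entry_point) + t * d

entry = np.array([25.0, 0.0, 0.0])            # insertion site S, 25 mm lateral of the probe
aim = np.array([-0.6, 0.0, 0.8])              # needle angled back toward the probe
target_site = np.array([0.0, 0.0, 30.0])      # vessel center 30 mm deep

hit = trajectory_point_at_depth(entry, aim, target_depth=30.0)
print(hit, np.linalg.norm(hit - target_site))  # miss distance used to advise the clinician
```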
As will be described below with respect to FIGS. 9A-9D, feedback may be provided to the clinician by the probe 106 in certain situations. For example, as discussed further below, the probe 106 may be configured to provide haptic feedback to the clinician indicating a direction to move the probe 106 in order to center the probe 106 over the target vessel 302 (and optionally over the target site 608). Certain feedback may also be provided by the probe 106 to instruct movement of the needle 606 (e.g., in any direction, including yaw and/or pitch). The orientation of the needle may also be determined via the methodology discussed with respect to FIGS. 10A-12. For example, light emitting diodes (LEDs) on the probe (FIGS. 9A, 9D) may provide an indication of a direction to move the needle. Additional LEDs beyond those illustrated in FIGS. 9A, 9D may be included on the probe 106.
Referring to FIG. 7, the ultrasound probe 106 is shown fixedly coupled to a mechanical arm 700 having a series of known location points along an arm length, positioned adjacent the patient P, in accordance with some embodiments. The mechanical arm 700 may be comprised of a series of arm components 701 and joints that hingedly couple the arm components 701 together, with an ultrasound probe (e.g., the probe 106) fixed to the distal end of the most distal arm component. In some embodiments, a series of location points 704A-704D may be known, e.g., at each joint, at a proximal end of the probe 106, and at a distal end of the probe 106, such that an orientation and positioning of the mechanical arm 700 and the probe 106 may be determined relative to a fixed point, e.g., the console 102. In some embodiments, an optical fiber 706 (e.g., having one or more core fibers) may extend from the console 102 to the probe 106, which provides reflected light signals to the console 102, thereby enabling determination of a positioning, orientation, and shape of the mechanical arm 700 and the probe 106 as detailed above. For instance, the optical fiber 706 may include the predetermined geometry 602 at the distal end of the probe 106 as illustrated in FIG. 6B in order to determine an orientation of the probe 106. The orientation, positioning, and configuration information obtained through the use of the mechanical arm 700 may be utilized in the same manner as the data obtained through the deployment of the IMU 158 within the probe 106 as discussed above.
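The use of known location points can be illustrated with a planar forward-kinematics sketch; the two-segment geometry, joint angles, and link lengths below are assumptions for illustration only.

```python
import numpy as np

def arm_points(joint_angles_deg, link_lengths_mm, base=(0.0, 0.0)):
    """Positions of each joint and the probe mount for a planar articulated arm.

    Angles are cumulative (each joint measured relative to the previous link).
    """
    points = [np.asarray(base, dtype=float)]
    heading = 0.0
    for angle, length in zip(joint_angles_deg, link_lengths_mm):
        heading += np.radians(angle)
        step = length * np.array([np.cos(heading), np.sin(heading)])
        points.append(points[-1] + step)
    return points

# Console at the origin; two 300 mm arm segments; the probe sits at the last point.
locations = arm_points([30.0, -45.0], [300.0, 300.0])
probe_position, probe_heading_deg = locations[-1], 30.0 - 45.0
print(probe_position, probe_heading_deg)
```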
Referring to FIG. 8 , an ultrasound imaging system 800 that includes alternative reality functionality is shown in accordance with some embodiments. The ultrasound imaging system 800 includes many of the components of the ultrasound imaging system 100 of FIG. 1 , where components that are held in common between the systems 100, 800 will not be discussed in detail. The system 800 includes the ability to perform augmented reality (AR) functionalities and components that provide AR data to a clinician. As used herein, the term “augmented reality” may refer to augmented reality (e.g., an enhancement of real-world images overlaid with computer-generated information) and virtual reality (e.g., replacement of a user's view with immersion within a computer-generated virtual environment).
For instance, the console may render an AR display screen 802 on the display 104. The AR display screen 802 may include certain visualizations as overlays on the ultrasound image obtained by the ultrasound probe 106. For example, the AR display screen 802 may include overlays that highlight certain anatomical features detected within the ultrasound image (e.g., vessels). In some embodiments, the target vessel may be distinguished visually from all detected anatomical features (e.g., the target vessel appears in a particular color, appears within a bounding box, etc.). Additional AR data that may be displayed includes directional indicators (e.g., “R”/“L” or “Right”/“Left”) that assist the clinician in properly characterizing a mirror coordination of the ultrasound probe 106, when applicable. Further, a center line may be overlaid on the ultrasound image as well as an arrow that instructs the clinician as to a direction to move the ultrasound probe 106 in order to center the ultrasound probe 106 over a target vessel, which places the target vessel in the center of the ultrasound image displayed on the display 104. The disclosure is also intended to cover positionings of the ultrasound probe 106 that are alternatives to centering over a target vessel (or other anatomical target, e.g., an organ, a vessel blockage, a chamber within a heart, or a position within an organ). For instance, it may be advantageous to place the ultrasound probe at a particular distance from the center of the target vessel in order to allow a needle to properly access an insertion site.
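A minimal sketch of how such overlay geometry might be computed is shown below; it returns drawing primitives for a hypothetical renderer rather than calling any particular graphics library, and the mirrored-label convention is an assumption for illustration.

```python
def build_overlay(image_width_px, target_center_col, mirrored=True):
    """Geometry for the AR overlay: probe center line, guidance arrow, and side labels.

    Returns primitives for whatever renderer draws on the ultrasound frame; when the
    display is mirrored relative to the clinician's hand, the side labels are swapped.
    """
    center_col = image_width_px // 2
    offset = target_center_col - center_col
    arrow = {"from_col": center_col, "to_col": target_center_col,
             "direction": "right" if offset > 0 else "left" if offset < 0 else "centered"}
    left_label, right_label = ("Right", "Left") if mirrored else ("Left", "Right")
    return {"center_line_col": center_col,
            "arrow": arrow,
            "labels": {"left_edge": left_label, "right_edge": right_label}}

print(build_overlay(image_width_px=640, target_center_col=190))
```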
Additionally, the system 800 includes an AR device 804 that provides secondary AR data as an alternative to the AR display screen 802. The AR device 804 provides a second option (e.g., modality) for viewing AR data, where the first and secondary AR data may be the same or substantially the same. As illustrated in FIG. 8, the AR device 804 is represented by a pair of AR glasses 804 to be worn by a clinician performing the ultrasound procedure. The AR glasses 804 may be configured to display secondary AR data 806 on a display screen of the AR glasses 804. In the illustration of FIG. 8, the first and secondary AR data are substantially the same (e.g., substantially the same display). However, the secondary AR data 806 may be seen by the clinician as an overlay directly on the patient body. For example, when imaging the insertion site S, the clinician may view the insertion site S through the AR glasses 804 such that the ultrasound image, including any highlighting of detected anatomical features and/or directional or orientation markers, also appears in an augmented manner. Thus, the clinician views the augmented ultrasound image directly on the patient body when viewing the patient body through the AR glasses 804. Advantageously, the AR glasses 804 enable the clinician to maintain eye contact with the imaging area of the patient body, and the augmented ultrasound image may correct any mirrored coordination that would otherwise be present when viewing the ultrasound image on the display 104.
Referring now to FIG. 9A, the ultrasound probe 106 imaging a target vessel of the patient P and configured to provide feedback to a clinician that directs movement of the probe 106 to center the probe 106 over the target vessel is shown in accordance with some embodiments. The probe 106 may be configured with vibration technology (e.g., actuators 901A-901B, such as linear resonant actuators (LRAs) or piezoelectric actuators) that provides haptic feedback 900. Additionally, or in the alternative, the probe 106 may be configured with visual indicators (e.g., lights) configured to provide feedback similar to that provided by the actuators 901A-901B. The actuators 901A-901B may be activated to provide haptic feedback that instructs a clinician as to the direction to move the probe 106 to center the probe 106 over the target vessel 904.
FIG. 9A illustrates a plurality of vessels: the target vessel 904 and secondary vessels 905 (e.g., non-target vessels). Thus, the console 102 may obtain an ultrasound image from the probe 106, and the vessel identification logic 200 may perform a vessel identification process as discussed above to identify the target vessel 904 as well as detect the location of the target vessel 904 within the ultrasound image (e.g., relative to the ultrasound probe 106). The console 102 may then activate an actuator to provide haptic feedback instructing the clinician to move the probe 106 in a particular direction to center the probe 106 over the target vessel 904 (e.g., vibration on a right side of the probe 106 indicates that the clinician should move the probe 106 to the right). As a result, the clinician need not take his or her eyes off of the patient body and the probe 106 to view the ultrasound image on the display 104 of the console 102 and determine which direction to move the probe 106. Similarly, the probe 106 may be configured with lights 902A, 902B that operate in the same manner as the actuators 901A, 901B (e.g., lighting up a right side of the probe 106 indicates that the clinician should move the probe 106 to the right).
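As a rough illustration of how a console might drive side-specific feedback from the detected offset, consider the following Python sketch. The actuators and lights dictionaries and their pulse/on/blink methods are hypothetical driver objects, assumed for illustration only:

```python
def provide_centering_feedback(offset_px: float, tolerance_px: float,
                               actuators: dict, lights: dict) -> None:
    """Activate the actuator/light on the side toward which the probe should
    be moved; activate both sides when the probe is centered."""
    if abs(offset_px) <= tolerance_px:
        # Centered: one long pulse from both sides, both lights held on.
        actuators["left"].pulse(duration_ms=500)
        actuators["right"].pulse(duration_ms=500)
        lights["left"].on()
        lights["right"].on()
    elif offset_px > 0:
        # Target lies toward the right side of the image.
        actuators["right"].pulse(duration_ms=100)
        lights["right"].blink()
    else:
        actuators["left"].pulse(duration_ms=100)
        lights["left"].blink()
```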
Referring to FIG. 9B, a first embodiment of a display screen illustrating the ultrasound imaging in real-time including a center line of the ultrasound probe 106 and a visual indication of a direction to move the probe 106 to center the probe 106 over the target vessel is shown in accordance with some embodiments. FIG. 9B illustrates a display screen 906 that may accompany, or be an alternative to, the feedback capabilities of the probe 106 discussed above. The display screen 906 may be rendered on the display 104 of the console 102 and illustrate an ultrasound image (or a portion) captured by the probe 106. The display screen 906 may include a visual indication of identified anatomical features (e.g., a target vessel image 904′ and secondary vessel images 905′) as well as a center line 908 and a directional arrow indicator 910, where the directional arrow indicator 910 instructs the clinician as to the direction to move the probe 106 in order to center the probe 106 over the target vessel 904.
Referring to FIG. 9C, a second embodiment of a display screen illustrating the ultrasound imaging in real-time, including a center line of the ultrasound probe 106 and a visual indication of a direction to move the probe 106 to center the probe 106 over the target vessel, is shown in accordance with some embodiments. FIG. 9C provides an alternative to FIG. 9B in which visual indicators explicitly correct for the mirror coordination that may occur with ultrasound imaging. Thus, following identification of anatomical features (e.g., a target vessel 904 and secondary vessels 905), the console 102 may render the display screen 906 that includes a target vessel image 904′ and a secondary vessel image 905′ as well as a center line 908. Further, mirror coordination correction indicators 912 (“Right”, “Left”, and corresponding arrows) may be displayed, which indicate a direction to move the probe 106 in order to center the probe 106 over the target vessel 904 (or the secondary vessel 905). It is noted that the features illustrated in FIG. 9C may be combined with those of FIG. 9B.
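A minimal sketch of the mirror coordination correction, assuming the console can infer from orientation data (e.g., fiber-optic shape sensing or an IMU) whether image left/right is mirrored relative to the clinician's left/right; the function name and the probe_flipped flag are hypothetical, not taken from the disclosure:

```python
def mirror_corrected_direction(image_direction: str, probe_flipped: bool) -> str:
    """Translate an image-space direction ("left"/"right") into the direction
    the clinician should physically move the probe.

    probe_flipped: True when the probe's orientation marker faces the clinician
    such that the image is mirrored relative to the clinician's viewpoint.
    """
    if not probe_flipped:
        return image_direction
    return "left" if image_direction == "right" else "right"
```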
Referring to FIG. 9D, the ultrasound probe 106 imaging a target vessel of the patient P and configured to provide feedback to a clinician when the probe 106 is centered over the target vessel is shown in accordance with some embodiments. FIG. 9D illustrates the probe 106 as discussed with respect to FIG. 9A while showing an embodiment of possible feedback when the probe 106 is centered over the target vessel 904. The center line 908 is shown in a dotted format merely to illustrate the center of the probe 106. In some embodiments, once the probe 106 is centered over the target vessel 904 (e.g., as determined by analysis of the ultrasound image by the vessel identification logic 200, and optionally in view of orientation and positioning data obtained via any of the modalities discussed above (e.g., shape sensing via a fiber optic, an IMU, etc.)), feedback may be provided to the clinician that includes haptic feedback from both sides of the probe 106 and/or the lighting of both lights (e.g., light emitting diodes, LEDs) 902A, 902B. In some instances, the haptic feedback may differ from that provided when the probe is not centered over the target vessel 904 (e.g., when not centered, short pulses may be provided from a single side, but when centered, one long pulse may be provided from both sides). Similarly, the lights 902A, 902B may blink in one situation and hold steady in another. Further, the feedback provided by the probe 106 may be customizable and/or dynamically adjusted prior to each use. For instance, any of the systems disclosed herein may be used within a medical facility (e.g., a hospital, a clinic, an urgent care facility, etc.) such that a plurality of clinicians may routinely utilize the console 102 and probe 106. In some embodiments, the console 102 may include functionality for a clinician to sign in to a particular profile, where each clinician profile stores a customized (or default) set of feedback settings.
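The per-clinician feedback profiles mentioned above might be represented as simple stored preference records. The following Python sketch uses hypothetical field names, assumed purely for illustration:

```python
from dataclasses import dataclass


@dataclass
class FeedbackProfile:
    """Per-clinician feedback preferences stored by the console (hypothetical fields)."""
    haptics_enabled: bool = True
    lights_enabled: bool = True
    off_center_pattern: str = "short_pulse_one_side"
    centered_pattern: str = "long_pulse_both_sides"


DEFAULT_PROFILE = FeedbackProfile()


def profile_for(clinician_id: str, profiles: dict) -> FeedbackProfile:
    """Return the signed-in clinician's stored preferences, or the default set."""
    return profiles.get(clinician_id, DEFAULT_PROFILE)
```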
Referring to FIGS. 10A and 10B, simplified views of the ultrasound probe of the guidance system being used to guide a needle toward a vessel within the body of a patient are shown in accordance with some embodiments. FIGS. 10A-10B illustrate the ultrasound probe 106 of the system 100 and a needle 1020 (which may be included in system 100) in position and ready for insertion thereof through a skin surface of patient P to access a targeted internal body portion. In particular, the probe 106 is shown with its head 1004 placed against the patient skin and producing an ultrasound beam 1006 so as to ultrasonically image a portion of a vessel 1008 beneath the skin surface of patient P. The ultrasonic image of the vessel 1008 can be depicted on the display 104 of the console 102.
In the embodiment of FIGS. 10A-10B, the system 100 is configured to detect the position, orientation, and movement of the needle 1020. In particular, the sensor array 1000 of the probe 106 is configured to detect a magnetic field of the magnetic element 1024 included with the needle 1020. Each of the sensors 1002 of the sensor array 1000 is configured to spatially detect the magnetic element 1024 in three-dimensional space. Thus, during operation of the system 100 in accordance with the embodiment of FIGS. 10A-10B, magnetic field strength data of the needle's magnetic element 1024 sensed by each of the sensors 1002 is forwarded to a processor, such as the processor 116 of the console 102 (FIG. 2), which computes in real-time the position and/or orientation of the magnetic element 1024. Specifically, and as shown in FIGS. 10A-10B, the position of the magnetic element 1024 in X, Y, and Z coordinate space with respect to the sensor array 1000 can be determined by the system 100 using the magnetic field strength data sensed by the sensors 1002. Moreover, FIG. 10A shows that the pitch of the magnetic element 1024 can also be determined, while FIG. 10B shows that the yaw of the magnetic element can be determined. Suitable circuitry of the probe 106, the console 102, or another component of the system can provide the calculations necessary for such position/orientation determination. In one embodiment, the magnetic element 1024 can be tracked using the teachings of one or more of the following U.S. Pat. Nos. 5,775,322; 5,879,297; 6,129,668; 6,216,028; 6,263,230; and 9,456,766. The contents of the aforementioned U.S. patents are incorporated herein by reference in their entireties.
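The incorporated patents detail the tracking computation; one common way to predict the readings an array of magnetic sensors would produce for a candidate magnet location is a point-dipole model. The Python/NumPy sketch below is an assumption for illustration only (the dipole model and all identifiers are not a statement of the patented method); it evaluates the field magnitude expected at each sensor position:

```python
import numpy as np

MU0_OVER_4PI = 1e-7  # magnetic constant / 4*pi, in T*m/A


def dipole_field(sensor_pos: np.ndarray, magnet_pos: np.ndarray,
                 moment: np.ndarray) -> np.ndarray:
    """Flux density (tesla) of a point dipole at magnet_pos with moment
    `moment` (A*m^2), evaluated at sensor_pos (meters)."""
    r = np.asarray(sensor_pos, dtype=float) - np.asarray(magnet_pos, dtype=float)
    d = np.linalg.norm(r)
    r_hat = r / d
    return MU0_OVER_4PI * (3.0 * r_hat * np.dot(moment, r_hat) - moment) / d**3


def predicted_strengths(sensor_positions: np.ndarray, magnet_pos: np.ndarray,
                        moment: np.ndarray) -> np.ndarray:
    """Field magnitude predicted at each sensor of the array."""
    return np.array([np.linalg.norm(dipole_field(p, magnet_pos, moment))
                     for p in sensor_positions])
```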
The above position and orientation information determined by the system 100, together with the length of the cannula 1022 and the position of the magnetic element 1024 with respect to the distal needle tip as known by or input into the system, enable the system 100 to accurately determine the location and orientation of the entire length of the needle 1020 with respect to the sensor array 1000. Optionally, the distance between the magnetic element 1024 and the distal needle tip is known by or input into the system 100. This in turn enables the system 100 to superimpose an image of the needle 1020 onto an image produced by the ultrasound beam 1006 of the probe 106.
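For example, once the magnetic element's position and axis are known, the distal tip location follows from simple vector arithmetic. A minimal sketch, with hypothetical names, assuming positions are expressed in the probe/sensor-array coordinate frame:

```python
import numpy as np


def needle_tip_position(magnet_pos: np.ndarray, unit_axis: np.ndarray,
                        magnet_to_tip_mm: float) -> np.ndarray:
    """Locate the distal needle tip from the tracked magnetic element.

    magnet_pos:       position of the magnetic element (mm, probe frame).
    unit_axis:        unit vector along the cannula pointing toward the tip,
                      derived from the tracked pitch and yaw.
    magnet_to_tip_mm: known or user-entered element-to-tip distance.
    """
    return np.asarray(magnet_pos, dtype=float) + magnet_to_tip_mm * np.asarray(unit_axis, dtype=float)
```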
Referring now to FIGS. 11A and 11B, possible screenshots for depiction on the display of the guidance system, showing the position and orientation of a needle, are shown in accordance with some embodiments. FIGS. 11A and 11B show examples of a superimposition of the needle onto an ultrasound image. Specifically, FIGS. 11A and 11B each show a screenshot 1030 that can be depicted on the display 104 of the console 102, for instance. In FIG. 11A, an ultrasound image 1032 is shown, including depiction of the skin surface of patient P and the subcutaneous vessel 1008 (area 1039). The ultrasound image 1032 corresponds to an image acquired by the ultrasound beam 1006 shown in FIGS. 10A and 10B, for instance. The screenshot 1030 further shows a needle image 1034 representing the position and orientation of the actual needle 1020 as determined by the system 100 as described above. Because the system is able to determine the location and orientation of the needle 1020 with respect to the sensor array 1000, the system is able to accurately determine the position and orientation of the needle 1020 with respect to the ultrasound image 1032 and superimpose it thereon for depiction as the needle image 1034 on the display 104. Coordination of the positioning of the needle image 1034 on the ultrasound image 1032 is performed by suitable algorithms executed by the processor 116 or other suitable component of the system 100.
Specifically, FIG. 11A shows that in one embodiment the system 100 can depict a projected path 1036 based on the current position and orientation of the needle 1020 as depicted by the needle image 1034. The projected path 1036 assists a clinician in determining whether the current orientation of the needle 1020, as depicted by the needle image 1034 on the display 104, will result in arriving at the desired internal body portion target, such as the vessel 1008. Again, as the orientation and/or position of the needle image 1034 changes, the projected path 1036 is correspondingly modified by the system 100. FIG. 11B shows that, in one embodiment, the screenshot 1030 can be configured such that the ultrasound image 1032 and the needle image 1034 are oriented so as to be displayed in a three-dimensional aspect. This enables the angle and orientation of the needle 1020, as depicted by the needle image 1034, to be ascertained and compared with the intended target imaged by the ultrasound image 1032. It should be noted that the screenshots 1030 are merely examples of possible depictions produced by the system 100 for display. Also, it is appreciated that, in addition to the visual display 104, aural information, such as beeps, tones, etc., can also be employed by the system 100 to assist the clinician during positioning and insertion of the needle into the patient. Further, haptic feedback may be provided to the clinician via the probe 106 in a similar manner as discussed above with respect to at least FIGS. 9A-9D.
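The projected path 1036 can be illustrated as an extension of the needle axis to the depth of the imaged target. A minimal sketch, assuming a coordinate frame in which +z points deeper into the patient (all names are hypothetical):

```python
from typing import Optional

import numpy as np


def projected_path_endpoint(tip_pos: np.ndarray, unit_axis: np.ndarray,
                            target_depth_mm: float) -> Optional[np.ndarray]:
    """Point at which the needle's current trajectory reaches the target depth
    (e.g., the depth of the imaged vessel), or None if the trajectory never
    reaches that depth."""
    axis = np.asarray(unit_axis, dtype=float)
    tip = np.asarray(tip_pos, dtype=float)
    if axis[2] <= 1e-9:                 # needle not advancing deeper
        return None
    t = (target_depth_mm - tip[2]) / axis[2]
    if t < 0:                           # target depth lies behind the tip
        return None
    return tip + t * axis
```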
Further details are given here regarding use of the system 100 in guiding a needle or other medical device in connection with ultrasonic imaging of a targeted internal body portion (“target”) of a patient, according to one embodiment. With the magnetic element-equipped needle 1020 positioned a suitable distance (e.g., two or more feet) away from the ultrasound probe 106 including the sensor array 1000, the probe is employed to ultrasonically image, for depiction on the display 104 of the system 100, the target within the patient that the needle is intended to intersect via percutaneous insertion. Following a calibration of the system 100 and obtaining or determining a total length of the needle 1020 and/or a position of the magnetic element with respect to the distal needle tip, such as by user input, automatic detection, or in another suitable manner, the needle 1020 is then brought into the range of the sensors 1002 of the sensor array 1000 of the probe 106. Each of the sensors 1002 detects the magnetic field strength associated with the magnetic element 1024 of the needle 1020, and this data is forwarded to the processor 116. As the sensors 1002 detect the magnetic field, algorithms are performed by the processor 116 to calculate a magnetic field strength of the magnetic element 1024 of the needle 1020 at predicted points in space in relation to the probe. The processor 116 then compares the actual magnetic field strength data detected by the sensors 1002 to the calculated field strength values (this process is further described in the U.S. patents identified above). This process can be iteratively performed until the calculated value for a predicted point matches the measured data. Once this match occurs, the magnetic element 1024 has been positionally located in three-dimensional space. Using the magnetic field strength data as detected by the sensors 1002, the pitch and yaw (i.e., orientation) of the magnetic element 1024 can also be determined. Together with the known length of the needle 1020 and the position of the distal tip of the needle with respect to the magnetic element, this enables an accurate representation of the position and orientation of the needle to be made by the system 100 and depicted as a virtual model, i.e., the needle image 1034, on the display 104. Note that the predicted and actual detected values must match within a predetermined tolerance or confidence level in one embodiment for the system 100 to enable needle depiction to occur. Further detail as to the guidance of a needle toward a vessel within the body of a patient as discussed with respect to FIGS. 10A-12 is provided in U.S. Pat. No. 9,456,766, the entire contents of which are incorporated herein by reference.
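The iterative predicted-versus-measured comparison described above can be caricatured as a coarse search over candidate positions. The sketch below is illustrative only (the actual algorithms are described in the incorporated patents); it accepts a predict callable such as the dipole-model helper sketched earlier, and all names are hypothetical:

```python
import numpy as np


def locate_magnet(measured: np.ndarray, candidates: np.ndarray,
                  predict, tolerance: float):
    """Return the candidate magnet position whose predicted sensor readings
    best match the measured data, or None if no candidate matches within
    `tolerance` (root-mean-square error).

    measured:   field magnitudes actually reported by the sensor array.
    candidates: array of candidate magnet positions to test.
    predict:    callable mapping a candidate position to predicted magnitudes.
    """
    best_pos, best_err = None, np.inf
    for pos in candidates:
        err = float(np.sqrt(np.mean((predict(pos) - measured) ** 2)))
        if err < best_err:
            best_pos, best_err = pos, err
    return best_pos if best_err <= tolerance else None
```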
Referring to FIG. 12, a flow diagram illustrating various stages of a method for guiding a needle to a desired target within the body of a patient is shown in accordance with some embodiments. Each block illustrated in FIG. 12 represents an operation performed in the method 1200, which begins at stage 1202 where a targeted internal body portion of a patient is imaged by an imaging system, such as an ultrasound imaging device for instance. At stage 1204, a detectable characteristic of a medical component such as a needle is sensed by one or more sensors included with the imaging system. In the present embodiment, the detectable characteristic of the needle is a magnetic field of the magnetic element 1024 included with the needle 1020 and the sensors are magnetic sensors included in the sensor array 1000 included with the ultrasound probe 106.
At stage 1206, a position of the medical component with respect to the targeted internal body portion is determined in at least two spatial dimensions via sensing of the detectable characteristic. As described above, such determination is made in the present embodiment by the processor 116 of the console 102. At stage 1208, an image representing the position of the medical component is combined with the image of the targeted internal body portion for depiction on a display. At stage 1210, directional feedback is provided to the clinician directing movement (or confirming the location) of the ultrasound probe utilized in capturing the image of the internal body portion. The directional feedback may be any of the feedback discussed above. Stage 1212 shows that stages 1204-1208 can be iteratively repeated to depict advancement or other movement of the medical component with respect to the imaged target, such as percutaneous insertion of the needle 1020 toward the vessel 1008 (FIGS. 11A, 11B), for instance. It is appreciated that the processor 116 or another suitable component can calculate additional aspects, including the area 1039 and the target 1038 (FIGS. 11A, 11B), for depiction on the display 104.
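Stages 1202-1212 can be summarized as a simple acquisition-and-feedback loop. The following Python sketch uses hypothetical imaging, sensors, display, and feedback objects purely to show the ordering of the stages; it is not drawn from the disclosure:

```python
def run_guidance_method(imaging, sensors, display, feedback,
                        max_iterations: int = 10_000) -> None:
    """Loose sketch of method 1200: image the target, sense the needle,
    fuse and display, provide probe-direction feedback, and repeat."""
    target_image = imaging.acquire()                  # stage 1202
    for _ in range(max_iterations):                   # stage 1212: iterate
        reading = sensors.sense_magnetic_field()      # stage 1204
        needle_pose = sensors.estimate_pose(reading)  # stage 1206
        display.show(target_image, needle_pose)       # stage 1208
        feedback.direct_probe_movement()              # stage 1210
        target_image = imaging.acquire()              # refresh the image
```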
While some particular embodiments have been disclosed herein, and while the particular embodiments have been disclosed in some detail, it is not the intention for the particular embodiments to limit the scope of the concepts provided herein. Additional adaptations and/or modifications can appear to those of ordinary skill in the art, and, in broader aspects, these adaptations and/or modifications are encompassed as well. Accordingly, departures may be made from the particular embodiments disclosed herein without departing from the scope of the concepts provided herein.

Claims (17)

What is claimed is:
1. An ultrasound imaging system, comprising:
an ultrasound probe including an array of ultrasonic transducers and an optical fiber, wherein a portion of the optical fiber extends through at least a portion of the ultrasound probe, wherein the ultrasonic transducers are configured to emit generated ultrasound signals into a patient, receive reflected ultrasound signals from the patient, and convert the reflected ultrasound signals into corresponding electrical signals for processing into ultrasound images, and wherein the optical fiber includes a set of gratings disposed along a length of the optical fiber, and wherein the portion of the optical fiber extending through the portion of the ultrasound probe is configured in a predetermined geometry relative to the ultrasonic transducers; and
a console configured to communicate with the ultrasound probe, the console including one or more processors and a non-transitory computer-readable medium having stored thereon logic that, when executed by the one or more processors, causes operations including:
obtaining orientation information of the ultrasound probe through analysis of reflected light signals reflected by the set of gratings in view of the predetermined geometry relative to the ultrasonic transducers;
performing an identification process on the reflected ultrasound signals to identify a target vessel;
determining, based on the orientation information, a direction of movement resulting in placement of a center of the ultrasound probe over an anatomical target; and
initiating provision of feedback to a user of the ultrasound probe indicating the direction of movement resulting in the placement of the center of the ultrasound probe over the anatomical target.
2. The ultrasound imaging system of claim 1, wherein the orientation information indicates positioning of the ultrasound probe on a Cartesian coordinate system relative to a skin surface of the patient.
3. The ultrasound imaging system of claim 1, wherein the optical fiber includes one or more core fibers, wherein each of the one or more core fibers includes a plurality of gratings distributed along a longitudinal length of a corresponding core fiber and each sensor of the plurality of gratings is configured to (i) reflect a light signal of a different spectral width based on received incident light, and (ii) change a characteristic of the reflected light signal for use in determining a physical state of the optical fiber.
4. The ultrasound imaging system of claim 3, wherein the operations further include:
providing a broadband incident light signal to the optical fiber,
receiving a reflected light signal of the broadband incident light signal, and
processing the reflected light signal to determine the orientation information.
5. The ultrasound imaging system of claim 1, wherein the identification process includes applying a trained machine learning model configured to detect anatomical features within the ultrasound images and provide a bounding box around the anatomical target.
6. The ultrasound imaging system of claim 1, wherein the provision of the feedback includes providing haptic feedback from a first side of the ultrasound probe, where the first side corresponds to the direction of movement required by the ultrasound probe to place the ultrasound probe at a position relative to the ultrasound probe over the anatomical target.
7. The ultrasound imaging system of claim 1, further comprising:
a needle including a second optical fiber configured to obtain needle orientation information, and wherein the operations further include:
determining, based on the needle orientation information, an orientation of the needle relative to the ultrasound probe,
determining a trajectory of the needle, and
generating a display screen illustrating the trajectory of the needle.
8. A method of performing an ultrasound procedure comprising:
providing an ultrasound probe including an array of ultrasonic transducers and an optical fiber, wherein a portion of the optical fiber extends through at least a portion of the ultrasound probe, wherein the ultrasonic transducers are configured to emit generated ultrasound signals into a patient, receive reflected ultrasound signals from the patient, and convert the reflected ultrasound signals into corresponding electrical signals for processing into ultrasound images, and wherein the optical fiber includes a set of gratings disposed along a length of the optical fiber, and wherein the portion of the optical fiber extending through the portion of the ultrasound probe is configured in a predetermined geometry relative to the ultrasonic transducers;
providing a console configured to communicate with the ultrasound probe, the console including one or more processors and a non-transitory computer-readable medium having stored thereon logic that, when executed by the one or more processors, causes operations; and
instructing use of the ultrasound probe and the console to cause execution of the one or more processors of the console to perform operations including:
obtaining orientation information of the ultrasound probe through analysis of reflected light signals reflected by the set of gratings in view of the predetermined geometry relative to the ultrasonic transducers;
performing an identification process on the reflected ultrasound signals to identify a target vessel;
determining, based on the orientation information, a direction of movement resulting in placement of a center of the ultrasound probe over an anatomical target; and
initiating provision of feedback to a user of the ultrasound probe indicating the direction of movement resulting in the placement of the center of the ultrasound probe over the anatomical target.
9. The method of claim 8, wherein the orientation information indicates positioning of the ultrasound probe on a Cartesian coordinate system relative to a skin surface of the patient.
10. The method of claim 8, wherein the optical fiber includes one or more core fibers, wherein each of the one or more core fibers includes a plurality of gratings distributed along a longitudinal length of a corresponding core fiber and each sensor of the plurality of gratings is configured to (i) reflect a light signal of a different spectral width based on received incident light, and (ii) change a characteristic of the reflected light signal for use in determining a physical state of the optical fiber.
11. The method of claim 10, wherein the operations further include:
providing a broadband incident light signal to the optical fiber,
receiving a reflected light signal of the broadband incident light signal, and
processing the reflected light signal to determine the orientation information.
12. The method of claim 8, wherein the identification process includes applying a trained machine learning model configured to detect anatomical features within the ultrasound images and provide a bounding box around the anatomical target.
13. The method of claim 8, wherein the provision of the feedback includes providing haptic feedback from a first side of the ultrasound probe, where the first side corresponds to the direction of movement required by the ultrasound probe to place the ultrasound probe at a position relative to the ultrasound probe over the anatomical target.
14. The method of claim 8, further comprising:
providing a needle including a second optical fiber configured to obtain needle orientation information, and wherein the operations further include:
determining, based on the needle orientation information, an orientation of the needle relative to the ultrasound probe,
determining a trajectory of the needle, and
generating a display screen illustrating the trajectory of the needle.
15. A non-transitory, computer-readable medium having stored thereon logic that, when executed by one or more processors, causes performance of operations comprising:
obtaining orientation information of an ultrasound probe, wherein the ultrasound probe includes an array of ultrasonic transducers and an optical fiber, wherein a portion of the optical fiber extends through at least a portion of the ultrasound probe, wherein the ultrasonic transducers are configured to emit generated ultrasound signals into a patient, receive reflected ultrasound signals from the patient, and convert the reflected ultrasound signals into corresponding electrical signals for processing into ultrasound images, and wherein the optical fiber includes a set of gratings disposed along a length of the optical fiber, and wherein the portion of the optical fiber extending through the portion of the ultrasound probe is configured in a predetermined geometry relative to the ultrasonic transducers;
performing an identification process on the reflected ultrasound signals to identify an anatomical target;
determining, based on the orientation information, a direction of movement resulting in placement of a center of the ultrasound probe over the anatomical target; and
initiating provision of feedback to a user of the ultrasound probe indicating the direction of movement resulting in the placement of the center of the ultrasound probe over the anatomical target.
16. The non-transitory, computer-readable medium of claim 15, wherein the optical fiber includes one or more core fibers, wherein each of the one or more core fibers includes a plurality of gratings distributed along a longitudinal length of a corresponding core fiber and each sensor of the plurality of gratings is configured to (i) reflect a light signal of a different spectral width based on received incident light, and (ii) change a characteristic of the reflected light signal for use in determining a physical state of the optical fiber, and wherein the operations further include:
providing a broadband incident light signal to the optical fiber,
receiving a reflected light signal of the broadband incident light signal, and
processing the reflected light signal to determine the orientation information.
17. The non-transitory, computer-readable medium of claim 16, wherein the provision of the feedback includes providing haptic feedback from a first side of the ultrasound probe, where the first side corresponds to the direction of movement required by the ultrasound probe to place the ultrasound probe at a position relative to the ultrasound probe over the anatomical target.
US17/861,031 2022-07-08 2022-07-08 Systems and methods for intelligent ultrasound probe guidance Active US12137989B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US17/861,031 US12137989B2 (en) 2022-07-08 2022-07-08 Systems and methods for intelligent ultrasound probe guidance
CN202321786916.5U CN220655593U (en) 2022-07-08 2023-07-07 Ultrasound imaging system
CN202310834727.9A CN117357158A (en) 2022-07-08 2023-07-07 System and method for intelligent ultrasound probe guidance
EP23758430.5A EP4543303A1 (en) 2022-07-08 2023-07-07 Systems and methods for intelligent ultrasound probe guidance
PCT/US2023/027147 WO2024010940A1 (en) 2022-07-08 2023-07-07 Systems and methods for intelligent ultrasound probe guidance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US17/861,031 US12137989B2 (en) 2022-07-08 2022-07-08 Systems and methods for intelligent ultrasound probe guidance

Publications (2)

Publication Number Publication Date
US20240008929A1 US20240008929A1 (en) 2024-01-11
US12137989B2 true US12137989B2 (en) 2024-11-12

Family

ID=87762530

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/861,031 Active US12137989B2 (en) 2022-07-08 2022-07-08 Systems and methods for intelligent ultrasound probe guidance

Country Status (4)

Country Link
US (1) US12137989B2 (en)
EP (1) EP4543303A1 (en)
CN (2) CN220655593U (en)
WO (1) WO2024010940A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113952031A (en) 2020-07-21 2022-01-21 巴德阿克塞斯系统股份有限公司 Magnetic tracking ultrasound probe and system, method and apparatus for generating 3D visualizations thereof
CN114052905A (en) 2020-08-04 2022-02-18 巴德阿克塞斯系统股份有限公司 System and method for optimized medical component insertion monitoring and imaging enhancement
US12150812B2 (en) 2020-08-10 2024-11-26 Bard Access Systems, Inc. System and method for generating virtual blood vessel representations in mixed reality
EP4216825A2 (en) 2020-10-02 2023-08-02 Bard Access Systems, Inc. Ultrasound systems and methods for sustained spatial attention
WO2022081904A1 (en) 2020-10-15 2022-04-21 Bard Access Systems, Inc. Ultrasound imaging system for generation of a three-dimensional ultrasound image
US12102481B2 (en) 2022-06-03 2024-10-01 Bard Access Systems, Inc. Ultrasound probe with smart accessory
US12137989B2 (en) 2022-07-08 2024-11-12 Bard Access Systems, Inc. Systems and methods for intelligent ultrasound probe guidance

Citations (323)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5148809A (en) 1990-02-28 1992-09-22 Asgard Medical Systems, Inc. Method and apparatus for detecting blood vessels and displaying an enhanced video image from an ultrasound scan
US5181513A (en) 1990-05-29 1993-01-26 Pierre-Jean Touboul Method of acquiring ultrasound images
US5325293A (en) 1992-02-18 1994-06-28 Dorne Howard L System and method for correlating medical procedures and medical billing codes
US5441052A (en) 1992-12-28 1995-08-15 Kabushiki Kaisha Toshiba Color doppler-type ultrasonic diagnostic apparatus
US5549554A (en) 1994-04-01 1996-08-27 Advanced Cardiovascular Systems, Inc. Catheters having separable reusable components
US5573529A (en) 1994-10-31 1996-11-12 Haak; Benjamin A. Color coded medical instruments
US5775322A (en) 1996-06-27 1998-07-07 Lucent Medical Systems, Inc. Tracheal tube and methods related thereto
US5879297A (en) 1997-05-08 1999-03-09 Lucent Medical Systems, Inc. System and method to determine the location and orientation of an indwelling medical device
US5908387A (en) 1996-06-21 1999-06-01 Quinton Instrument Company Device and method for improved quantitative coronary artery analysis
EP0933063A1 (en) 1997-12-15 1999-08-04 Medison Co., Ltd. Ultrasonic color doppler imaging system
US5967984A (en) 1995-06-30 1999-10-19 Boston Scientific Corporation Ultrasound imaging catheter with a cutting element
US5970119A (en) 1997-11-18 1999-10-19 Douglas Holtz (Part Interest) Radiological scaling and alignment device
US6004270A (en) 1998-06-24 1999-12-21 Ecton, Inc. Ultrasound system for contrast agent imaging and quantification in echocardiography using template image for image alignment
US6019724A (en) 1995-02-22 2000-02-01 Gronningsaeter; Aage Method for ultrasound guidance during clinical procedures
US6068599A (en) 1997-07-14 2000-05-30 Matsushita Electric Industrial Co., Ltd. Blood vessel puncturing device using ultrasound
US6074367A (en) 1997-10-01 2000-06-13 Scimed Life Systems, Inc. Preinsertion measurement of catheters
JP2000271136A (en) 1999-03-25 2000-10-03 Toshiba Corp Ultrasonic therapeutic apparatus and method for controlling the same
US6129668A (en) 1997-05-08 2000-10-10 Lucent Medical Systems, Inc. System and method to determine the location and orientation of an indwelling medical device
US6132379A (en) 1998-11-04 2000-10-17 Patacsil; Estelito G. Method and apparatus for ultrasound guided intravenous cannulation
US6233476B1 (en) 1999-05-18 2001-05-15 Mediguide Ltd. Medical positioning system
US6263230B1 (en) 1997-05-08 2001-07-17 Lucent Medical Systems, Inc. System and method to determine the location and orientation of an indwelling medical device
US20020038088A1 (en) 1999-08-20 2002-03-28 Novasonics Inc. Miniaturized ultrasound apparatus and method
US6375615B1 (en) 1995-10-13 2002-04-23 Transvascular, Inc. Tissue penetrating catheters having integral imaging transducers and their methods of use
US6436043B2 (en) 1999-12-21 2002-08-20 Koninklijke Phillips Electronics N.V. Ultrasonic image processing method and examination system for displaying an ultrasonic composite image sequence of an artery
US20020148277A1 (en) 2001-04-11 2002-10-17 Manabu Umeda Method of making ultrasonic probe and ultrasonic probe
US6498942B1 (en) 1999-08-06 2002-12-24 The University Of Texas System Optoacoustic monitoring of blood oxygenation
US6503205B2 (en) 1998-11-18 2003-01-07 Cardiosonix Ltd. Dual ultrasonic transducer probe for blood flow measurement, and blood vessel diameter determination method
US6508769B2 (en) 1999-12-28 2003-01-21 Koninklijke Philips Electronics N.V. Ultrasonic image processing method and examination system for displaying an ultrasonic color-coded image sequence of an object having moving parts
US6511458B2 (en) 1998-01-13 2003-01-28 Lumend, Inc. Vascular re-entry catheter
US6524249B2 (en) 1998-11-11 2003-02-25 Spentech, Inc. Doppler ultrasound method and apparatus for monitoring blood flow and detecting emboli
US20030047126A1 (en) 2001-09-12 2003-03-13 Tomaschko Daniel K. System for identifying medical devices
US20030060714A1 (en) 2001-09-24 2003-03-27 Henderson Richard W. Medical ultrasound transducer with interchangeable handle
US6543642B1 (en) 2001-09-21 2003-04-08 Daydots International, Inc. Disposable glove dispenser system
US20030073900A1 (en) 2001-10-12 2003-04-17 Pranitha Senarith System and method for monitoring the movement of an interventional device within an anatomical site
US6554771B1 (en) 2001-12-18 2003-04-29 Koninklijke Philips Electronics N.V. Position sensor in ultrasound transducer probe
US20030093001A1 (en) 2001-11-09 2003-05-15 Antti Martikainen Method and assembly for identifying a measuring cuff
US20030106825A1 (en) 2001-12-07 2003-06-12 The Procter & Gamble Company Package containing a window and performance characteristic indicator
US20030120154A1 (en) 2001-11-28 2003-06-26 Frank Sauer Method and apparatus for ultrasound guidance of needle biopsies
US6592520B1 (en) 2001-07-31 2003-07-15 Koninklijke Philips Electronics N.V. Intravascular ultrasound imaging apparatus and method
US6592565B2 (en) 2001-04-26 2003-07-15 Zbylut J. Twardowski Patient-tailored, central-vein catheters
US6612992B1 (en) 2000-03-02 2003-09-02 Acuson Corp Medical diagnostic ultrasound catheter and method for position determination
US6613002B1 (en) 1999-06-05 2003-09-02 Wilson-Cook Medical Incorporated System of indicia for a medical device
US6623431B1 (en) 2002-02-25 2003-09-23 Ichiro Sakuma Examination method of vascular endothelium function
US6641538B2 (en) 2001-11-22 2003-11-04 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and method of controlling a ultrasonic diagnostic apparatus
US6647135B2 (en) 1999-12-07 2003-11-11 Koninklijke Philips Electronics N.V. Ultrasonic image processing method and system for displaying a composite image sequence of an artery segment
US6687386B1 (en) 1999-06-15 2004-02-03 Hitachi Denshi Kabushiki Kaisha Object tracking method and object tracking apparatus
US20040055925A1 (en) 2000-06-13 2004-03-25 Judith Franks-Farah Male clean intermittent catheter system
US6749569B1 (en) 2003-01-07 2004-06-15 Esaote S.P.A. Method and apparatus for ultrasound imaging
US6754608B2 (en) 2001-05-23 2004-06-22 Radi Medical Systems Ab Interactive measurement system
US6755789B2 (en) 2002-02-05 2004-06-29 Inceptio Medical Technologies, Llc Ultrasonic vascular imaging system and method of blood vessel cannulation
US20050000975A1 (en) 2003-05-28 2005-01-06 Carco Darlene Marie Sterile surgical glove dispenser
EP1504713A1 (en) 2003-07-14 2005-02-09 Surgical Navigation Technologies, Inc. Navigation system for cardiac therapies
US6857196B2 (en) 2003-04-30 2005-02-22 Robert Dalrymple Method and apparatus for measuring a intracorporal passage image
US20050049504A1 (en) 2003-08-27 2005-03-03 Meng-Tsung Lo Ultrasonic vein detector and relating method
US20050165299A1 (en) 2004-01-23 2005-07-28 Traxyz Medical, Inc. Methods and apparatus for performing procedures on target locations in the body
US20050251030A1 (en) 2004-04-21 2005-11-10 Azar Fred S Method for augmented reality instrument placement using an image based navigation system
US20050267365A1 (en) 2004-06-01 2005-12-01 Alexander Sokulin Method and apparatus for measuring anatomic structures
US6979294B1 (en) 2002-12-13 2005-12-27 California Institute Of Technology Split-screen display system and standardized methods for ultrasound image acquisition and processing for improved measurements of vascular structures
US20060013523A1 (en) 2004-07-16 2006-01-19 Luna Innovations Incorporated Fiber optic position and shape sensing device and method relating thereto
US20060015039A1 (en) 2004-07-19 2006-01-19 Cassidy Kenneth T Guidewire bearing markings simplifying catheter selection
US20060020204A1 (en) 2004-07-01 2006-01-26 Bracco Imaging, S.P.A. System and method for three-dimensional space management and visualization of ultrasound data ("SonoDEX")
US20060079781A1 (en) 2002-12-18 2006-04-13 Koninklijke Philips Electronics N.V. Ultrasonic apparatus for estimating artery parameters
US7074187B2 (en) 2002-12-13 2006-07-11 Selzer Robert H System and method for improving ultrasound image acquisition and replication for repeatable measurements of vascular structures
US20060184029A1 (en) 2005-01-13 2006-08-17 Ronen Haim Ultrasound guiding system and method for vascular access and operation mode
US20060210130A1 (en) 2002-12-18 2006-09-21 Laurence Germond-Rouet Ultrasonic doppler system for determining movement of artery walls
AU2006201646A1 (en) 2005-04-26 2006-11-09 Biosense Webster, Inc. Display of catheter tip with beam direction for ultrasound system
US20070043341A1 (en) 2001-05-30 2007-02-22 Anderson R R Apparatus and method for laser treatment with spectroscopic feedback
US20070049822A1 (en) 2005-08-31 2007-03-01 Sonosite, Inc. Medical device guide locator
US20070073155A1 (en) 2005-09-02 2007-03-29 Ultrasound Ventures, Llc Ultrasound guidance system
US7244234B2 (en) 2003-11-11 2007-07-17 Soma Development Llc Ultrasound guided probe device and method of using same
US20070199848A1 (en) 2006-02-28 2007-08-30 Ellswood Mark R Packaging with color-coded identification
US20070239120A1 (en) 1998-02-24 2007-10-11 Brock David L Flexible instrument
US20070249911A1 (en) 2006-04-21 2007-10-25 Simon David A Method and apparatus for optimizing a therapy
US20080021322A1 (en) 2006-05-24 2008-01-24 Michael Benjamin Stone Ultrasonic imaging apparatus and method
US20080033759A1 (en) 2006-08-02 2008-02-07 Vastrac, Inc. Information manager for a procedure-based medical practice
US20080033293A1 (en) 2006-05-08 2008-02-07 C. R. Bard, Inc. User interface and methods for sonographic display device
US20080051657A1 (en) 2005-02-28 2008-02-28 Rold Michael D Systems And Methods For Estimating The Size And Position Of A Medical Device To Be Applied Within A Patient
US7359554B2 (en) 2002-08-26 2008-04-15 Cleveland Clinic Foundation System and method for identifying a vascular border
EP1591074B1 (en) 2004-04-26 2008-05-21 BrainLAB AG Visualization of procedural guidelines for medical procedures
US20080146915A1 (en) 2006-10-19 2008-06-19 Mcmorrow Gerald Systems and methods for visualizing a cannula trajectory
US20080177186A1 (en) 2007-01-18 2008-07-24 Slater Charles R Methods and Apparatus for Determining a Treatment Volume of a Fluid Treatment Agent for Treating The Interior of a Blood Vessel
US20080221425A1 (en) 2007-03-09 2008-09-11 Olson Eric S System and method for local deformable registration of a catheter navigation system to image data or a model
US20080294037A1 (en) 2007-05-23 2008-11-27 Jacob Richter Apparatus and Method for Guided Chronic Total Occlusion Penetration
US20080300491A1 (en) 2007-06-04 2008-12-04 Medtronic, Inc. Percutaneous needle guide and methods of use
US20090012399A1 (en) 2005-02-07 2009-01-08 Kazuhiro Sunagawa Ultrasonic diagnostic apparatus
US7534209B2 (en) 2000-05-26 2009-05-19 Physiosonics, Inc. Device and method for mapping and tracking blood flow and determining parameters of blood flow
US20090143684A1 (en) 2007-12-04 2009-06-04 Civco Medical Instruments Co., Inc. Needle guide system for use with ultrasound transducers to effect shallow path needle entry and method of use
US20090143672A1 (en) 2007-12-04 2009-06-04 Harms Steven E Method for mapping image reference points to facilitate biopsy using magnetic resonance imaging
US20090156926A1 (en) 2007-11-26 2009-06-18 C.R. Bard, Inc. Integrated System for Intravascular Placement of a Catheter
US7599730B2 (en) 2002-11-19 2009-10-06 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US20090306509A1 (en) 2005-03-30 2009-12-10 Worcester Polytechnic Institute Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors
US20100020926A1 (en) 2008-07-25 2010-01-28 Jan Boese Method for representing interventional instruments in a 3d data set of an anatomy to be examined as well as a reproduction system for performing the method
US7681579B2 (en) 2005-08-02 2010-03-23 Biosense Webster, Inc. Guided procedures for treating atrial fibrillation
US7691061B2 (en) 2004-06-24 2010-04-06 Terumo Kabushiki Kaisha Ultrasonic diagnostic apparatus and method of processing an ultrasound signal
US7699779B2 (en) 2003-05-19 2010-04-20 Hitachi, Ltd. Ultrasonic treatment equipment
US20100106015A1 (en) 2008-10-23 2010-04-29 Norris Perry R Medical device alignment
US7720520B2 (en) 2004-12-01 2010-05-18 Boston Scientific Scimed, Inc. Method and system for registering an image with a navigation reference catheter
US7727153B2 (en) 2003-04-07 2010-06-01 Sonosite, Inc. Ultrasonic blood vessel measurement apparatus and method
US7734326B2 (en) 2002-06-20 2010-06-08 Brainlab Ag Method and device for preparing a drainage
US20100179428A1 (en) 2008-03-17 2010-07-15 Worcester Polytechnic Institute Virtual interactive system for ultrasound training
US20100211026A2 (en) 2005-03-04 2010-08-19 C. R. Bard, Inc. Access port identification systems and methods
US20100277305A1 (en) 2007-06-01 2010-11-04 Koninklijke Philips Electronics N.V. Wireless Ultrasound Probe Asset Tracking
US7831449B2 (en) 2001-02-02 2010-11-09 Thompson Reuters (Healthcare) Inc. Method and system for extracting medical information for presentation to medical providers on mobile terminals
US20100286515A1 (en) 2007-09-28 2010-11-11 Dietrich Gravenstein Novel Methods and Devices for Noninvasive Measurement of Energy Absorbers in Blood
US20100312121A1 (en) 2009-06-09 2010-12-09 Zhonghui Guan Apparatus for a needle director for an ultrasound transducer probe
US20110002518A1 (en) 2009-07-01 2011-01-06 General Electric Company Method and system for processing ultrasound data
US7905837B2 (en) 2006-09-04 2011-03-15 Ge Medical Systems Global Technology Company, Llc Ultrasound diagnostic apparatus
US20110071404A1 (en) 2009-09-23 2011-03-24 Lightlab Imaging, Inc. Lumen Morphology and Vascular Resistance Measurements Data Collection Systems, Apparatus and Methods
US7925327B2 (en) 2002-12-04 2011-04-12 Koninklijke Philips Electronics N.V. Apparatus and method for assisting the navigation of a catheter in a vessel
US7927278B2 (en) 2002-12-13 2011-04-19 California Institute Of Technology Split-screen display system and standardized methods for ultrasound image acquisition and multi-frame data processing
US8014848B2 (en) 2004-04-26 2011-09-06 Brainlab Ag Visualization of procedural guidelines for a medical procedure
US8050523B2 (en) 2007-04-20 2011-11-01 Koninklijke Philips Electronics N.V. Optical fiber shape sensing systems
US8060181B2 (en) 2006-04-07 2011-11-15 Brainlab Ag Risk assessment for planned trajectories
US20110295108A1 (en) 2007-11-26 2011-12-01 C.R. Bard, Inc. Apparatus for use with needle insertion guidance system
US8075488B2 (en) 2005-05-12 2011-12-13 Compumedics Medical Innovation Pty. Ltd. Ultrasound diagnosis and treatment apparatus
US20110313293A1 (en) 2009-10-08 2011-12-22 C. R. Bard, Inc. Support and cover structures for an ultrasound probe head
US8090427B2 (en) 2003-09-04 2012-01-03 Koninklijke Philips Electronics N.V. Methods for ultrasound visualization of a vessel with location and cycle information
US8105239B2 (en) 2006-02-06 2012-01-31 Maui Imaging, Inc. Method and apparatus to visualize the coronary arteries using ultrasound
US8172754B2 (en) 2006-04-18 2012-05-08 Panasonic Corporation Ultrasonograph
US8175368B2 (en) 2005-04-05 2012-05-08 Scimed Life Systems, Inc. Systems and methods for image segmentation with a multi-state classifier
US8200313B1 (en) 2008-10-01 2012-06-12 Bioquantetics, Inc. Application of image-based dynamic ultrasound spectrography in assisting three dimensional intra-body navigation of diagnostic and therapeutic devices
US8211023B2 (en) 2006-08-11 2012-07-03 Koninklijke Philips Electronics N.V. Ultrasound system for cerebral blood flow monitoring
US20120179038A1 (en) 2011-01-07 2012-07-12 General Electric Company Ultrasound based freehand invasive device positioning system and method
US20120197132A1 (en) 2011-01-31 2012-08-02 Analogic Corporation Ultrasound imaging apparatus
US20120209121A1 (en) 2011-02-15 2012-08-16 General Electric Company Ultrasound probe including a securing member
US20120220865A1 (en) 2010-12-31 2012-08-30 Volcano Corporation Pulmonary Embolism Diagnostic Devices and Associated Methods and Systems
US20120238875A1 (en) 2004-11-30 2012-09-20 Eric Savitsky Embedded Motion Sensing Technology for Integration within Commercial Ultrasound Probes
US8298147B2 (en) 2005-06-24 2012-10-30 Volcano Corporation Three dimensional co-registration for intravascular diagnosis and therapy
US20120277576A1 (en) 2011-04-26 2012-11-01 Chun Kee Lui Echogenic infusion port catheter
US8303505B2 (en) 2005-12-02 2012-11-06 Abbott Cardiovascular Systems Inc. Methods and apparatuses for image guided medical procedures
US8323202B2 (en) 2007-11-16 2012-12-04 Pneumrx, Inc. Method and system for measuring pulmonary artery circulation information
US8328727B2 (en) 2000-03-23 2012-12-11 Tensys Medical, Inc. Method and apparatus for assessing hemodynamic parameters within the circulatory system of a living subject
US20130041250A1 (en) 2011-08-09 2013-02-14 Ultrasonix Medical Corporation Methods and apparatus for locating arteries and veins using ultrasound
US8409103B2 (en) 2005-05-06 2013-04-02 Vasonova, Inc. Ultrasound methods of positioning guided vascular access devices in the venous system
US20130102889A1 (en) 2011-10-21 2013-04-25 C. R. Bard, Inc. Systems and Methods for Ultrasound-Based Medical Device Assessment
US20130131499A1 (en) * 2010-02-09 2013-05-23 Koninklijke Philips Electronics N.V. Apparatus, system and method for imaging and treatment using optical position sensing
US20130131502A1 (en) 2011-11-18 2013-05-23 Michael Blaivas Blood vessel access system and device
US8449465B2 (en) 2005-10-14 2013-05-28 Cleveland Clinic Foundation System and method for characterizing vascular tissue
US20130150724A1 (en) 2010-01-07 2013-06-13 Verathon Inc. Blood vessel access devices, systems, and methods
US20130188832A1 (en) 2009-04-14 2013-07-25 Qinglin Ma Systems and methods for adaptive volume imaging
US20130218024A1 (en) 2011-10-09 2013-08-22 Clear Guide Medical, Llc Interventional In-Situ Image-Guidance by Fusing Ultrasound and Video
US8553954B2 (en) 2010-08-24 2013-10-08 Siemens Medical Solutions Usa, Inc. Automated system for anatomical vessel characteristic determination
US8556815B2 (en) 2009-05-20 2013-10-15 Laurent Pelissier Freehand ultrasound imaging systems and methods for guiding fine elongate instruments
US8585600B2 (en) 2010-12-09 2013-11-19 Ge Medical Systems Global Technology Company, Llc Ultrasound volume probe navigation and control method and device
US20130324840A1 (en) 2010-10-08 2013-12-05 Jian Zhongping Detection of blood-vessel wall artifacts
US20140005530A1 (en) 2012-06-29 2014-01-02 General Electric Company Ultrasound imaging method and ultrasound imaging apparatus
US8622913B2 (en) 2010-09-28 2014-01-07 General Electric Company Method and system for non-invasive monitoring of patient parameters
US20140031690A1 (en) 2012-01-10 2014-01-30 Panasonic Corporation Ultrasound diagnostic apparatus and method for identifying blood vessel
US20140036091A1 (en) 2011-11-02 2014-02-06 Seno Medical Instruments, Inc. Interframe energy normalization in an optoacoustic imaging system
US20140073976A1 (en) 2012-09-12 2014-03-13 Heartflow, Inc. Systems and methods for estimating ischemia and blood flow characteristics from vessel geometry and physiology
US20140100440A1 (en) 2012-10-05 2014-04-10 Volcano Corporation System and method for instant and automatic border detection
US8706457B2 (en) 2012-05-14 2014-04-22 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US8734357B2 (en) 2010-08-12 2014-05-27 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8744211B2 (en) 2011-08-31 2014-06-03 Analogic Corporation Multi-modality image acquisition
US20140155737A1 (en) 2011-08-16 2014-06-05 Koninklijke Philips N.V. Curved multi-planar reconstruction using fiber optic shape data
US8754865B2 (en) 2011-11-16 2014-06-17 Volcano Corporation Medical measuring system and method
US8764663B2 (en) 2012-03-14 2014-07-01 Jeffrey Smok Method and apparatus for locating and distinguishing blood vessel
US20140188440A1 (en) 2012-12-31 2014-07-03 Intuitive Surgical Operations, Inc. Systems And Methods For Interventional Procedure Planning
US20140188133A1 (en) 2007-11-26 2014-07-03 C. R. Bard, Inc. Iconic Representations for Guidance of an Indwelling Medical Device
US8781194B2 (en) 2009-04-17 2014-07-15 Tufts Medical Center, Inc. Aneurysm detection
US8781555B2 (en) 2007-11-26 2014-07-15 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US8790263B2 (en) 2007-02-05 2014-07-29 Siemens Medical Solutions Usa, Inc. Automated movement detection with audio and visual information
WO2014115150A1 (en) 2013-01-24 2014-07-31 Tylerton International Holdings Inc. Body structure imaging
JP2014150928A (en) 2013-02-07 2014-08-25 Hitachi Aloka Medical Ltd Ultrasonic diagnostic device
US20140276085A1 (en) 2013-03-13 2014-09-18 Volcano Corporation Coregistered intravascular and angiographic images
US20140276081A1 (en) 2013-03-12 2014-09-18 St. Jude Medical Puerto Rico Llc Ultrasound assisted needle puncture mechanism
US20140276059A1 (en) 2013-03-12 2014-09-18 Volcano Corporation Externally imaging a body structure within a patient
US20140276690A1 (en) 2013-03-14 2014-09-18 The Spectranetics Corporation Controller to select optical channel parameters in a catheter
US8849382B2 (en) 2007-11-26 2014-09-30 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
WO2014174305A2 (en) 2013-04-26 2014-10-30 Ucl Business Plc A method and apparatus for determining the location of a medical instrument with respect to ultrasound imaging, and a medical instrument to facilitate such determination
US20140343431A1 (en) 2011-12-16 2014-11-20 Koninklijke Philips N.V. Automatic blood vessel identification by name
US20150005738A1 (en) 2013-06-26 2015-01-01 Corindus, Inc. System and method for monitoring of guide catheter seating
US20150011887A1 (en) 2013-07-04 2015-01-08 Samsung Medison Co., Ltd. Ultrasound system and method for providing object information
US8939908B2 (en) 2009-07-16 2015-01-27 Unex Corporation Ultrasonic blood vessel inspecting apparatus
WO2015017270A1 (en) 2013-07-29 2015-02-05 Intuitive Surgical Operations, Inc. Shape sensor systems with redundant sensing
US8961420B2 (en) 2010-04-01 2015-02-24 Siemens Medical Solutions Usa, Inc. System for cardiac condition detection and characterization
US20150065916A1 (en) 2013-08-29 2015-03-05 Vasculogic, Llc Fully automated vascular imaging and access system
US20150073279A1 (en) 2013-09-11 2015-03-12 Boston Scientific Scimed, Inc. Systems and methods for selection and displaying of images using an intravascular ultrasound imaging system
US20150112200A1 (en) 2008-12-18 2015-04-23 C. R. Bard, Inc. Needle Guide Including Enhanced Visibility Entrance
US9022940B2 (en) 2008-07-18 2015-05-05 Joseph H. Meier Handheld imaging devices and related methods
US20150209113A1 (en) 2014-01-29 2015-07-30 Becton, Dickinson And Company Wearable Electronic Device for Enhancing Visualization During Insertion of an Invasive Device
US20150209526A1 (en) 2014-01-14 2015-07-30 Volcano Corporation Devices and methods for forming vascular access
US9138290B2 (en) 2007-07-27 2015-09-22 Meridian Cardiovascular Systems, Inc. Method of ablating arterial plaque
US9155517B2 (en) 2007-07-13 2015-10-13 Ezono Ag Opto-electrical ultrasound sensor and system
US20150294497A1 (en) 2012-05-31 2015-10-15 Koninklijke Philips N.V. Ultrasound imaging system and method for image guidance procedure
US20150297097A1 (en) 2014-01-14 2015-10-22 Volcano Corporation Vascular access evaluation and treatment
US20150327841A1 (en) 2014-05-13 2015-11-19 Kabushiki Kaisha Toshiba Tracking in ultrasound for imaging and user interface
US9204858B2 (en) 2010-02-05 2015-12-08 Ultrasonix Medical Corporation Ultrasound pulse-wave doppler measurement of blood flow velocity and/or turbulence
US20150359991A1 (en) 2013-03-05 2015-12-17 Ezono Ag System for image guided procedure
US9220477B2 (en) 2009-12-18 2015-12-29 Konica Minolta, Inc. Ultrasonic diagnostic device, and region-to-be-detected image display method and measurement method using same
US20160029995A1 (en) 2013-03-15 2016-02-04 Nilus Medical Llc Hemodynamic monitoring device and methods of using same
US20160029998A1 (en) 2013-12-04 2016-02-04 Obalon Therapeutics, Inc. Systems and methods for locating and/or characterizing intragastric devices
US9257220B2 (en) 2013-03-05 2016-02-09 Ezono Ag Magnetization device and method
US9295447B2 (en) 2011-08-17 2016-03-29 Volcano Corporation Systems and methods for identifying vascular borders
US20160100970A1 (en) 2014-10-09 2016-04-14 Obalon Therapeutics, Inc. Ultrasonic systems and methods for locating and/or characterizing intragastric devices
US20160101263A1 (en) 2014-10-10 2016-04-14 Intuitive Surgical Operations, Inc. Systems and methods for reducing measurement error using optical fiber shape sensors
US9320493B2 (en) 2014-07-08 2016-04-26 Nadarasa Visveshwara System and method for measuring fluidics in arteries
US20160113699A1 (en) 2013-05-23 2016-04-28 CardioSonic Ltd. Devices and methods for renal denervation and assessment thereof
US20160120607A1 (en) 2014-11-03 2016-05-05 Michael Sorotzkin Ultrasonic imaging device for examining superficial skin structures during surgical and dermatological procedures
US20160143622A1 (en) 2013-06-26 2016-05-26 Koninklijke Philips N.V. System and method for mapping ultrasound shear wave elastography measurements
US9364171B2 (en) 2010-12-22 2016-06-14 Veebot Systems, Inc. Systems and methods for autonomous intravenous needle insertion
US20160166232A1 (en) 2014-12-10 2016-06-16 Volcano Corporation Devices, systems, and methods for in-stent restenosis prediction
US20160202053A1 (en) 2013-03-13 2016-07-14 Hansen Medical, Inc. Reducing incremental measurement sensor error
US20160213398A1 (en) 2015-01-26 2016-07-28 Ming-Wei Liu Ultrasound needle guide apparatus
US9427207B2 (en) 2011-04-05 2016-08-30 Houston Medical Robotics, Inc. Motorized systems and methods for accessing the lumen of a vessel
US9445780B2 (en) 2009-12-04 2016-09-20 University Of Virginia Patent Foundation Tracked ultrasound vessel imaging
US20160278743A1 (en) 2014-06-11 2016-09-29 Olympus Corporation Medical diagnostic apparatus, method for operating medical diagnostic apparatus, and computer-readable recording medium
US20160278869A1 (en) 2015-01-19 2016-09-29 Bard Access Systems, Inc. Device and Method for Vascular Access
US9459087B2 (en) 2013-03-05 2016-10-04 Ezono Ag Magnetic position detection system
US9456804B2 (en) 2013-06-04 2016-10-04 Seiko Epson Corporation Ultrasound measurement apparatus and ultrasound measurement method
US20160296208A1 (en) 2007-02-09 2016-10-13 Board Of Regents, The University Of Texas System Intravascular Photoacoustic and Ultrasound Echo Imaging
US9468413B2 (en) 2008-09-05 2016-10-18 General Electric Company Method and apparatus for catheter guidance using a combination of ultrasound and X-ray imaging
US9492097B2 (en) 2007-11-26 2016-11-15 C. R. Bard, Inc. Needle length determination and calibration for insertion guidance system
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US20160374644A1 (en) 2015-06-25 2016-12-29 Rivanna Medical Llc Ultrasonic Guidance of a Probe with Respect to Anatomical Features
US9554716B2 (en) 2007-11-26 2017-01-31 C. R. Bard, Inc. Insertion guidance system for needles and medical components
US9597008B2 (en) 2011-09-06 2017-03-21 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
US20170086785A1 (en) 2015-09-30 2017-03-30 General Electric Company System and method for providing tactile feedback via a probe of a medical imaging system
US9610061B2 (en) 2011-04-14 2017-04-04 Regents Of The University Of Minnesota Vascular characterization using ultrasound imaging
US9636031B2 (en) 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
US9649037B2 (en) 2009-12-03 2017-05-16 Deltex Medical Limited Method and apparatus for hemodynamic monitoring using combined blood flow and blood pressure measurement
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US20170164923A1 (en) 2015-12-14 2017-06-15 Konica Minolta, Inc. Image Processor, Ultrasound Diagnostic Device Including Same, And Image Processing Method
WO2017096487A1 (en) 2015-12-10 2017-06-15 1929803 Ontario Corp. D/B/A Ke2 Technologies Systems and methods for automated fluid response measurement
EP3181083A1 (en) 2015-12-18 2017-06-21 Biosense Webster (Israel), Ltd. Using force sensor to give angle of ultrasound beam
US20170172424A1 (en) * 2014-12-22 2017-06-22 Eggers & Associates, Inc. Wearable Apparatus, System and Method for Detection of Cardiac Arrest and Alerting Emergency Response
US20170188839A1 (en) 2014-09-25 2017-07-06 Fujifilm Corporation Photoacoustic image generation apparatus
US9702969B2 (en) 2009-05-13 2017-07-11 Koninklijke Philips Electronics N.V. Ultrasonic blood flow doppler audio with pitch shifting
US20170196535A1 (en) 2014-06-04 2017-07-13 Hitachi, Ltd. Medical treatment system
US9717415B2 (en) 2007-03-08 2017-08-01 Sync-Rx, Ltd. Automatic quantitative vessel analysis at the location of an automatically-detected tool
US20170215842A1 (en) 2014-08-28 2017-08-03 Samsung Electronics Co., Ltd. Ultrasound diagnosis apparatus for self-diagnosis and remote-diagnosis, and method of operating the ultrasound diagnosis apparatus
US9731066B2 (en) 2011-09-30 2017-08-15 General Electric Company Device, system and method of automatic vessel access based on real time volumetric ultrasound
US20170259013A1 (en) 2012-10-30 2017-09-14 Elwha Llc Systems and Methods for Generating an Injection Guide
US20170265840A1 (en) * 2014-12-01 2017-09-21 Koninklijke Philips N.V. Registration of optical shape sensing tool
US20170303894A1 (en) 2015-01-08 2017-10-26 The Charlotte Mecklenburg Hospital Authority D/B/A Carolinas Healthcare System Ultrasound probe couplers and related methods
US9814531B2 (en) 2011-08-26 2017-11-14 EBM Corporation System for diagnosing bloodflow characteristics, method thereof, and computer software program
US9814433B2 (en) 2012-10-24 2017-11-14 Cathworks Ltd. Creating a vascular tree model
WO2017214428A1 (en) 2016-06-08 2017-12-14 The United States Of America, As Represented By The Secretary, Department Of Health And Human Services Tissue characterization with acoustic wave tomosynthesis
US20170367678A1 (en) 2016-06-22 2017-12-28 Cesare Sirtori Ultrasound automated method for measuring the thickness of the walls of the left anterior descending, right and circumflex coronary arteries
US9861337B2 (en) 2013-02-04 2018-01-09 General Electric Company Apparatus and method for detecting catheter in three-dimensional ultrasound images
US20180015256A1 (en) 2016-07-14 2018-01-18 C. R. Bard, Inc. Automated Catheter-To-Vessel Size Comparison Tool And Related Methods
WO2018026878A1 (en) 2016-08-02 2018-02-08 Avent, Inc. Motor-assisted needle guide assembly for ultrasound needle placement
US9895138B2 (en) 2011-06-06 2018-02-20 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus
US20180116723A1 (en) 2016-10-28 2018-05-03 Medtronic Ardian Luxembourg S.A.R.L. Methods and Systems for Optimizing Perivascular Neuromodulation Therapy Using Computational Fluid Dynamics
US20180125450A1 (en) 2015-04-24 2018-05-10 U.S. Government, As Represented By The Secretary Of The Army Vascular Targeting System
US20180161502A1 (en) 2015-06-15 2018-06-14 The University Of Sydney Insertion system and method
KR20180070878A (en) 2016-12-19 2018-06-27 Siemens Medical Solutions USA, Inc. Method of providing annotation information of ultrasound probe and ultrasound system
US20180199914A1 (en) 2015-07-22 2018-07-19 Koninklijke Philips N.V. Fiber-optic realshape sensor for enhanced dopper measurement display
WO2018134726A1 (en) 2017-01-20 2018-07-26 Politecnico Di Torino Method and apparatus to characterise non-invasively images containing venous blood vessels
US20180214119A1 (en) 2017-01-27 2018-08-02 Wayne State University Ultrasound and photoacoustic systems and methods for fetal brain assessment during delivery
US10043272B2 (en) 2014-09-16 2018-08-07 Esaote S.P.A. Method and apparatus for acquiring and fusing ultrasound images with pre-acquired images
US20180225993A1 (en) 2017-01-24 2018-08-09 Tietronix Software, Inc. System and method for three-dimensional augmented reality guidance for use of medical equipment
US20180235576A1 (en) 2017-02-22 2018-08-23 Covidien Lp Ultrasound doppler and elastography for ablation prediction and monitoring
US20180250078A1 (en) 2015-09-10 2018-09-06 Xact Robotics Ltd. Systems and methods for guiding the insertion of a medical tool
US20180272108A1 (en) 2017-03-27 2018-09-27 Biosense Webster (Israel) Ltd Catheter with improved loop contraction and greater contraction displacement
US20180286287A1 (en) 2017-03-28 2018-10-04 Covidien Lp System and methods for training physicians to perform ablation procedures
US20180279996A1 (en) 2014-11-18 2018-10-04 C.R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US20180310955A1 (en) 2017-04-27 2018-11-01 Bard Access Systems, Inc. Magnetizing System For Needle Assemblies
US20180317881A1 (en) 2017-05-05 2018-11-08 International Business Machines Corporation Automating ultrasound examination of a vascular system
JP2018175547A (en) 2017-04-17 2018-11-15 Nipro Corporation Puncture guide and ultrasound diagnostic apparatus with puncture guide
WO2018206473A1 (en) 2017-05-11 2018-11-15 Koninklijke Philips N.V. Workflow, system and method for motion compensation in ultrasound procedures
US20180366035A1 (en) 2017-06-20 2018-12-20 Ezono Ag System and method for image-guided procedure analysis and training
KR20190013133A (en) 2017-07-31 2019-02-11 Jesus Hospital Foundation Apparatus for guiding syringe arrangement based on ultrasonic probe
US20190060014A1 (en) 2017-08-30 2019-02-28 Intuitive Surgical Operations, Inc. System and method for providing on-demand functionality during a medical procedure
US20190069923A1 (en) 2015-11-08 2019-03-07 Qin Wang Paracentesis needle frame
US20190076121A1 (en) 2017-09-13 2019-03-14 Bard Access Systems, Inc. Ultrasound Finger Probe
US20190088019A1 (en) 2016-03-16 2019-03-21 Koninklijke Philips N.V. Calculation device for superimposing a laparoscopic image and an ultrasound image
US20190105017A1 (en) 2017-10-11 2019-04-11 Geoffrey Steven Hastings Laser assisted ultrasound guidance
US20190117190A1 (en) * 2016-04-19 2019-04-25 Koninklijke Philips N.V. Ultrasound imaging probe positioning
US20190223757A1 (en) 2017-12-04 2019-07-25 Bard Access Systems, Inc. Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US20190239850A1 (en) 2018-02-06 2019-08-08 Steven Philip Dalvin Augmented/mixed reality system and method for the guidance of a medical exam
US10380919B2 (en) 2013-11-21 2019-08-13 SonoSim, Inc. System and method for extended spectrum ultrasound training using animate and inanimate training objects
US10380920B2 (en) 2013-09-23 2019-08-13 SonoSim, Inc. System and method for augmented ultrasound simulation using flexible touch sensitive surfaces
EP3530221A1 (en) 2018-02-26 2019-08-28 Covidien LP System and method for performing a percutaneous navigation procedure
US20190282324A1 (en) 2018-03-15 2019-09-19 Zoll Medical Corporation Augmented Reality Device for Providing Feedback to an Acute Care Provider
US10424225B2 (en) 2013-09-23 2019-09-24 SonoSim, Inc. Method for ultrasound training with a pressure sensing array
US20190298457A1 (en) 2016-11-08 2019-10-03 Koninklijke Philips N.V. System and method for tracking an interventional instrument with feedback concerning tracking reliability
US20190307516A1 (en) 2018-04-06 2019-10-10 Medtronic, Inc. Image-based navigation system and method of using same
US10449330B2 (en) 2007-11-26 2019-10-22 C. R. Bard, Inc. Magnetic element-equipped needle assemblies
US20190339525A1 (en) 2018-05-07 2019-11-07 The Cleveland Clinic Foundation Live 3d holographic guidance and navigation for performing interventional procedures
US20190355278A1 (en) 2018-05-18 2019-11-21 Marion Surgical Inc. Virtual reality surgical system including a surgical tool assembly with haptic feedback
US20190365348A1 (en) * 2015-06-23 2019-12-05 Hemonitor Medical Ltd. Systems and methods for hand-free continuous ultrasonic monitoring
WO2019232451A1 (en) 2018-05-31 2019-12-05 Matt Mcgrath Design & Co, Llc Method of medical imaging using multiple arrays
WO2020002620A1 (en) 2018-06-29 2020-01-02 Koninklijke Philips N.V. Biopsy prediction and guidance with ultrasound imaging and associated devices, systems, and methods
US10524691B2 (en) 2007-11-26 2020-01-07 C. R. Bard, Inc. Needle assembly including an aligned magnetic element
WO2020016018A1 (en) 2018-07-18 2020-01-23 Koninklijke Philips N.V. Automatic image vetting on a handheld medical scanning device
US20200041261A1 (en) 2017-10-06 2020-02-06 Advanced Scanners, Inc. Generation of one or more edges of luminosity to form three-dimensional models of objects
WO2020044769A1 (en) 2018-08-27 2020-03-05 FUJIFILM Corporation Ultrasound diagnosis device and ultrasound diagnosis device control method
US20200069285A1 (en) 2018-08-31 2020-03-05 General Electric Company System and method for ultrasound navigation
US20200113540A1 (en) 2017-06-07 2020-04-16 Koninklijke Philips N.V. Ultrasound system and method
US20200129136A1 (en) 2018-10-31 2020-04-30 Medtronic, Inc. Real-time rendering and referencing for medical procedures
WO2020102665A1 (en) 2018-11-16 2020-05-22 Lang Philipp K Augmented reality guidance for surgical procedures with adjustment of scale, convergence and focal plane or focal point of virtual data
US20200188028A1 (en) 2017-08-21 2020-06-18 The Trustees Of Columbia University In The City Of New York Systems and methods for augmented reality guidance
US20200230391A1 (en) 2019-01-18 2020-07-23 Becton, Dickinson And Company Intravenous therapy system for blood vessel detection and vascular access device placement
WO2020186198A1 (en) 2019-03-13 2020-09-17 University Of Florida Research Foundation Guidance and tracking system for templated and targeted biopsy and treatment
US20210007710A1 (en) * 2019-07-12 2021-01-14 Verathon Inc. Representation of a target during aiming of an ultrasound probe
US10896628B2 (en) 2017-01-26 2021-01-19 SonoSim, Inc. System and method for multisensory psychomotor skill training
US20210045716A1 (en) 2019-08-13 2021-02-18 GE Precision Healthcare LLC Method and system for providing interaction with a visual artificial intelligence ultrasound image segmentation module
US11120709B2 (en) 2012-12-18 2021-09-14 SonoSim, Inc. System and method for teaching basic ultrasound skills
US20210307838A1 (en) * 2017-12-29 2021-10-07 Weipeng (Suzhou) Co., Ltd. Surgical navigation method and system
US20210353255A1 (en) * 2014-03-31 2021-11-18 Koninklijke Philips N.V. Haptic feedback for ultrasound image acquisition
US20210402144A1 (en) * 2020-06-29 2021-12-30 Bard Access Systems, Inc. Automatic Dimensional Frame Reference for Fiber Optic
US20220022969A1 (en) 2020-07-21 2022-01-27 Bard Access Systems, Inc. System, Method and Apparatus for Magnetic Tracking of Ultrasound Probe and Generation of 3D Visualization Thereof
US20220039777A1 (en) 2020-08-10 2022-02-10 Bard Access Systems, Inc. System and Method for Generating Vessel Representations in Mixed Reality/Virtual Reality
US20220039685A1 (en) 2020-08-04 2022-02-10 Bard Access Systems, Inc. Systemized and Method for Optimized Medical Component Insertion Monitoring and Imaging Enhancement
US20220096797A1 (en) 2020-09-25 2022-03-31 Bard Access Systems, Inc. Minimum Catheter Length Tool
WO2022072727A2 (en) 2020-10-02 2022-04-07 Bard Access Systems, Inc. Ultrasound systems and methods for sustained spatial attention
WO2022081904A1 (en) 2020-10-15 2022-04-21 Bard Access Systems, Inc. Ultrasound imaging system for generation of a three-dimensional ultrasound image
US11311269B2 (en) 2008-04-22 2022-04-26 Ezono Ag Ultrasound imaging system and method for providing assistance in an ultrasound imaging system
US20220160434A1 (en) 2020-11-24 2022-05-26 Bard Access Systems, Inc. Ultrasound System with Target and Medical Instrument Awareness
US20220172354A1 (en) 2020-12-01 2022-06-02 Bard Access Systems, Inc. Ultrasound System with Pressure and Flow Determination Capability
US20220168050A1 (en) 2020-12-01 2022-06-02 Bard Access Systems, Inc. Ultrasound Probe with Target Tracking Capability
US20220211442A1 (en) 2021-01-06 2022-07-07 Bard Access Systems, Inc. Needle Guidance Using Fiber Optic Shape Sensing
CN114129137B (en) * 2021-12-02 2022-09-09 Shenzhen Institute of Advanced Technology Intravascular imaging system, device and imaging method
WO2022203713A2 (en) 2020-09-18 2022-09-29 Bard Access Systems, Inc. Ultrasound probe with pointer remote control capability
WO2022263763A1 (en) * 2021-06-16 2022-12-22 Quantum Surgical Medical robot for placement of medical instruments under ultrasound guidance
US11600201B1 (en) 2015-06-30 2023-03-07 The Regents Of The University Of California System and method for converting handheld diagnostic ultrasound systems into ultrasound training systems
US20230113291A1 (en) 2020-04-02 2023-04-13 Koninklijke Philips N.V. Ultrasound probe, user console, system and method
US20230240643A1 (en) 2020-06-16 2023-08-03 Innovacell Ag Parallel path puncture device guide and method
US20230389893A1 (en) 2022-06-03 2023-12-07 Bard Access Systems, Inc. Ultrasound Probe with Smart Accessory
WO2024010940A1 (en) 2022-07-08 2024-01-11 Bard Access Systems, Inc. Systems and methods for intelligent ultrasound probe guidance
US20240050061A1 (en) 2022-08-15 2024-02-15 Bard Access Systems, Inc. Spatially Aware Medical Device Configured for Performance of Insertion Pathway Approximation
US20240062678A1 (en) 2022-08-17 2024-02-22 Bard Access Systems, Inc. Ultrasound Training System

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US668A (en) 1838-04-02 Improvement in wardrobe-bedsteads
US6129A (en) 1849-02-20 Washburn race

Patent Citations (359)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5148809A (en) 1990-02-28 1992-09-22 Asgard Medical Systems, Inc. Method and apparatus for detecting blood vessels and displaying an enhanced video image from an ultrasound scan
US5181513A (en) 1990-05-29 1993-01-26 Pierre-Jean Touboul Method of acquiring ultrasound images
US5325293A (en) 1992-02-18 1994-06-28 Dorne Howard L System and method for correlating medical procedures and medical billing codes
US5441052A (en) 1992-12-28 1995-08-15 Kabushiki Kaisha Toshiba Color doppler-type ultrasonic diagnostic apparatus
US5549554A (en) 1994-04-01 1996-08-27 Advanced Cardiovascular Systems, Inc. Catheters having separable reusable components
US5573529A (en) 1994-10-31 1996-11-12 Haak; Benjamin A. Color coded medical instruments
US6019724A (en) 1995-02-22 2000-02-01 Gronningsaeter; Aage Method for ultrasound guidance during clinical procedures
US5967984A (en) 1995-06-30 1999-10-19 Boston Scientific Corporation Ultrasound imaging catheter with a cutting element
US6375615B1 (en) 1995-10-13 2002-04-23 Transvascular, Inc. Tissue penetrating catheters having integral imaging transducers and their methods of use
US20140180098A1 (en) 1995-10-13 2014-06-26 Medtronic Vascular, Inc. Tissue Penetrating Catheters Having Integral Imaging Transducers and Their Methods of Use
US8727988B2 (en) 1995-10-13 2014-05-20 Medtronic Vascular, Inc. Tissue penetrating catheters having integral imaging transducers and their methods of use
US7637870B2 (en) 1995-10-13 2009-12-29 Medtronic Vascular, Inc. Tissue penetrating catheters having integral imaging transducers and their methods of use
US5908387A (en) 1996-06-21 1999-06-01 Quinton Instrument Company Device and method for improved quantitative coronary artery analysis
US5775322A (en) 1996-06-27 1998-07-07 Lucent Medical Systems, Inc. Tracheal tube and methods related thereto
US6129668A (en) 1997-05-08 2000-10-10 Lucent Medical Systems, Inc. System and method to determine the location and orientation of an indwelling medical device
US6216028B1 (en) 1997-05-08 2001-04-10 Lucent Medical Systems, Inc. Method to determine the location and orientation of an indwelling medical device
US6263230B1 (en) 1997-05-08 2001-07-17 Lucent Medical Systems, Inc. System and method to determine the location and orientation of an indwelling medical device
US5879297A (en) 1997-05-08 1999-03-09 Lucent Medical Systems, Inc. System and method to determine the location and orientation of an indwelling medical device
US6068599A (en) 1997-07-14 2000-05-30 Matsushita Electric Industrial Co., Ltd. Blood vessel puncturing device using ultrasound
US6074367A (en) 1997-10-01 2000-06-13 Scimed Life Systems, Inc. Preinsertion measurement of catheters
US5970119A (en) 1997-11-18 1999-10-19 Douglas Holtz (Part Interest) Radiological scaling and alignment device
US6245018B1 (en) 1997-12-15 2001-06-12 Medison Co., Ltd. Ultrasonic color doppler imaging system capable of discriminating artery and vein
EP0933063A1 (en) 1997-12-15 1999-08-04 Medison Co., Ltd. Ultrasonic color doppler imaging system
US6511458B2 (en) 1998-01-13 2003-01-28 Lumend, Inc. Vascular re-entry catheter
US20070239120A1 (en) 1998-02-24 2007-10-11 Brock David L Flexible instrument
US6004270A (en) 1998-06-24 1999-12-21 Ecton, Inc. Ultrasound system for contrast agent imaging and quantification in echocardiography using template image for image alignment
US6132379A (en) 1998-11-04 2000-10-17 Patacsil; Estelito G. Method and apparatus for ultrasound guided intravenous cannulation
US6524249B2 (en) 1998-11-11 2003-02-25 Spentech, Inc. Doppler ultrasound method and apparatus for monitoring blood flow and detecting emboli
US6503205B2 (en) 1998-11-18 2003-01-07 Cardiosonix Ltd. Dual ultrasonic transducer probe for blood flow measurement, and blood vessel diameter determination method
JP2000271136A (en) 1999-03-25 2000-10-03 Toshiba Corp Ultrasonic therapeutic apparatus and method for controlling the same
US6233476B1 (en) 1999-05-18 2001-05-15 Mediguide Ltd. Medical positioning system
US6613002B1 (en) 1999-06-05 2003-09-02 Wilson-Cook Medical Incorporated System of indicia for a medical device
US6687386B1 (en) 1999-06-15 2004-02-03 Hitachi Denshi Kabushiki Kaisha Object tracking method and object tracking apparatus
US6498942B1 (en) 1999-08-06 2002-12-24 The University Of Texas System Optoacoustic monitoring of blood oxygenation
US20020038088A1 (en) 1999-08-20 2002-03-28 Novasonics Inc. Miniaturized ultrasound apparatus and method
US6647135B2 (en) 1999-12-07 2003-11-11 Koninklijke Philips Electronics N.V. Ultrasonic image processing method and system for displaying a composite image sequence of an artery segment
US6436043B2 (en) 1999-12-21 2002-08-20 Koninklijke Phillips Electronics N.V. Ultrasonic image processing method and examination system for displaying an ultrasonic composite image sequence of an artery
US6508769B2 (en) 1999-12-28 2003-01-21 Koninklijke Philips Electronics N.V. Ultrasonic image processing method and examination system for displaying an ultrasonic color-coded image sequence of an object having moving parts
US6612992B1 (en) 2000-03-02 2003-09-02 Acuson Corp Medical diagnostic ultrasound catheter and method for position determination
US8328727B2 (en) 2000-03-23 2012-12-11 Tensys Medical, Inc. Method and apparatus for assessing hemodynamic parameters within the circulatory system of a living subject
US7534209B2 (en) 2000-05-26 2009-05-19 Physiosonics, Inc. Device and method for mapping and tracking blood flow and determining parameters of blood flow
US6840379B2 (en) 2000-06-13 2005-01-11 Judith Franks-Farah Male clean intermittent catheter system
US20040055925A1 (en) 2000-06-13 2004-03-25 Judith Franks-Farah Male clean intermittent catheter system
US7831449B2 (en) 2001-02-02 2010-11-09 Thompson Reuters (Healthcare) Inc. Method and system for extracting medical information for presentation to medical providers on mobile terminals
US20020148277A1 (en) 2001-04-11 2002-10-17 Manabu Umeda Method of making ultrasonic probe and ultrasonic probe
US6592565B2 (en) 2001-04-26 2003-07-15 Zbylut J. Twardowski Patient-tailored, central-vein catheters
US6754608B2 (en) 2001-05-23 2004-06-22 Radi Medical Systems Ab Interactive measurement system
US20070043341A1 (en) 2001-05-30 2007-02-22 Anderson R R Apparatus and method for laser treatment with spectroscopic feedback
US6592520B1 (en) 2001-07-31 2003-07-15 Koninklijke Philips Electronics N.V. Intravascular ultrasound imaging apparatus and method
US20030047126A1 (en) 2001-09-12 2003-03-13 Tomaschko Daniel K. System for identifying medical devices
US6543642B1 (en) 2001-09-21 2003-04-08 Daydots International, Inc. Disposable glove dispenser system
US20030060714A1 (en) 2001-09-24 2003-03-27 Henderson Richard W. Medical ultrasound transducer with interchangeable handle
US20030073900A1 (en) 2001-10-12 2003-04-17 Pranitha Senarith System and method for monitoring the movement of an interventional device within an anatomical site
US20030093001A1 (en) 2001-11-09 2003-05-15 Antti Martikainen Method and assembly for identifying a measuring cuff
US6641538B2 (en) 2001-11-22 2003-11-04 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and method of controlling a ultrasonic diagnostic apparatus
US20030120154A1 (en) 2001-11-28 2003-06-26 Frank Sauer Method and apparatus for ultrasound guidance of needle biopsies
US20030106825A1 (en) 2001-12-07 2003-06-12 The Procter & Gamble Company Package containing a window and performance characteristic indicator
US6601705B2 (en) 2001-12-07 2003-08-05 The Procter & Gamble Company Package containing a window and performance characteristic indicator
US6554771B1 (en) 2001-12-18 2003-04-29 Koninklijke Philips Electronics N.V. Position sensor in ultrasound transducer probe
US6755789B2 (en) 2002-02-05 2004-06-29 Inceptio Medical Technologies, Llc Ultrasonic vascular imaging system and method of blood vessel cannulation
US6623431B1 (en) 2002-02-25 2003-09-23 Ichiro Sakuma Examination method of vascular endothelium function
US7734326B2 (en) 2002-06-20 2010-06-08 Brainlab Ag Method and device for preparing a drainage
US7359554B2 (en) 2002-08-26 2008-04-15 Cleveland Clinic Foundation System and method for identifying a vascular border
US7599730B2 (en) 2002-11-19 2009-10-06 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US7925327B2 (en) 2002-12-04 2011-04-12 Koninklijke Philips Electronics N.V. Apparatus and method for assisting the navigation of a catheter in a vessel
US7074187B2 (en) 2002-12-13 2006-07-11 Selzer Robert H System and method for improving ultrasound image acquisition and replication for repeatable measurements of vascular structures
US7927278B2 (en) 2002-12-13 2011-04-19 California Institute Of Technology Split-screen display system and standardized methods for ultrasound image acquisition and multi-frame data processing
US6979294B1 (en) 2002-12-13 2005-12-27 California Institute Of Technology Split-screen display system and standardized methods for ultrasound image acquisition and processing for improved measurements of vascular structures
US20060079781A1 (en) 2002-12-18 2006-04-13 Koninklijke Philips Electronics N.V. Ultrasonic apparatus for estimating artery parameters
US20060210130A1 (en) 2002-12-18 2006-09-21 Laurence Germond-Rouet Ultrasonic doppler system for determining movement of artery walls
US6749569B1 (en) 2003-01-07 2004-06-15 Esaote S.P.A. Method and apparatus for ultrasound imaging
US7727153B2 (en) 2003-04-07 2010-06-01 Sonosite, Inc. Ultrasonic blood vessel measurement apparatus and method
US6857196B2 (en) 2003-04-30 2005-02-22 Robert Dalrymple Method and apparatus for measuring a intracorporal passage image
US7699779B2 (en) 2003-05-19 2010-04-20 Hitachi, Ltd. Ultrasonic treatment equipment
US20050000975A1 (en) 2003-05-28 2005-01-06 Carco Darlene Marie Sterile surgical glove dispenser
EP1504713A1 (en) 2003-07-14 2005-02-09 Surgical Navigation Technologies, Inc. Navigation system for cardiac therapies
US20050049504A1 (en) 2003-08-27 2005-03-03 Meng-Tsung Lo Ultrasonic vein detector and relating method
US8090427B2 (en) 2003-09-04 2012-01-03 Koninklijke Philips Electronics N.V. Methods for ultrasound visualization of a vessel with location and cycle information
US7244234B2 (en) 2003-11-11 2007-07-17 Soma Development Llc Ultrasound guided probe device and method of using same
US20050165299A1 (en) 2004-01-23 2005-07-28 Traxyz Medical, Inc. Methods and apparatus for performing procedures on target locations in the body
US20050251030A1 (en) 2004-04-21 2005-11-10 Azar Fred S Method for augmented reality instrument placement using an image based navigation system
US8014848B2 (en) 2004-04-26 2011-09-06 Brainlab Ag Visualization of procedural guidelines for a medical procedure
EP1591074B1 (en) 2004-04-26 2008-05-21 BrainLAB AG Visualization of procedural guidelines for medical procedures
US20050267365A1 (en) 2004-06-01 2005-12-01 Alexander Sokulin Method and apparatus for measuring anatomic structures
US7691061B2 (en) 2004-06-24 2010-04-06 Terumo Kabushiki Kaisha Ultrasonic diagnostic apparatus and method of processing an ultrasound signal
US20060020204A1 (en) 2004-07-01 2006-01-26 Bracco Imaging, S.P.A. System and method for three-dimensional space management and visualization of ultrasound data ("SonoDEX")
US20060013523A1 (en) 2004-07-16 2006-01-19 Luna Innovations Incorporated Fiber optic position and shape sensing device and method relating thereto
US20060015039A1 (en) 2004-07-19 2006-01-19 Cassidy Kenneth T Guidewire bearing markings simplifying catheter selection
US11062624B2 (en) 2004-11-30 2021-07-13 The Regents Of University Of California Embedded motion sensing technology for integration within commercial ultrasound probes
US20120238875A1 (en) 2004-11-30 2012-09-20 Eric Savitsky Embedded Motion Sensing Technology for Integration within Commercial Ultrasound Probes
US7720520B2 (en) 2004-12-01 2010-05-18 Boston Scientific Scimed, Inc. Method and system for registering an image with a navigation reference catheter
US20060184029A1 (en) 2005-01-13 2006-08-17 Ronen Haim Ultrasound guiding system and method for vascular access and operation mode
US20090012399A1 (en) 2005-02-07 2009-01-08 Kazuhiro Sunagawa Ultrasonic diagnostic apparatus
US20080051657A1 (en) 2005-02-28 2008-02-28 Rold Michael D Systems And Methods For Estimating The Size And Position Of A Medical Device To Be Applied Within A Patient
US20100211026A2 (en) 2005-03-04 2010-08-19 C. R. Bard, Inc. Access port identification systems and methods
US20090306509A1 (en) 2005-03-30 2009-12-10 Worcester Polytechnic Institute Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors
US8175368B2 (en) 2005-04-05 2012-05-08 Scimed Life Systems, Inc. Systems and methods for image segmentation with a multi-state classifier
AU2006201646A1 (en) 2005-04-26 2006-11-09 Biosense Webster, Inc. Display of catheter tip with beam direction for ultrasound system
US8409103B2 (en) 2005-05-06 2013-04-02 Vasonova, Inc. Ultrasound methods of positioning guided vascular access devices in the venous system
US8075488B2 (en) 2005-05-12 2011-12-13 Compumedics Medical Innovation Pty. Ltd. Ultrasound diagnosis and treatment apparatus
US8298147B2 (en) 2005-06-24 2012-10-30 Volcano Corporation Three dimensional co-registration for intravascular diagnosis and therapy
US7681579B2 (en) 2005-08-02 2010-03-23 Biosense Webster, Inc. Guided procedures for treating atrial fibrillation
US20070049822A1 (en) 2005-08-31 2007-03-01 Sonosite, Inc. Medical device guide locator
US20070073155A1 (en) 2005-09-02 2007-03-29 Ultrasound Ventures, Llc Ultrasound guidance system
US8449465B2 (en) 2005-10-14 2013-05-28 Cleveland Clinic Foundation System and method for characterizing vascular tissue
US8303505B2 (en) 2005-12-02 2012-11-06 Abbott Cardiovascular Systems Inc. Methods and apparatuses for image guided medical procedures
US9582876B2 (en) 2006-02-06 2017-02-28 Maui Imaging, Inc. Method and apparatus to visualize the coronary arteries using ultrasound
US8105239B2 (en) 2006-02-06 2012-01-31 Maui Imaging, Inc. Method and apparatus to visualize the coronary arteries using ultrasound
US20070199848A1 (en) 2006-02-28 2007-08-30 Ellswood Mark R Packaging with color-coded identification
US8060181B2 (en) 2006-04-07 2011-11-15 Brainlab Ag Risk assessment for planned trajectories
US8172754B2 (en) 2006-04-18 2012-05-08 Panasonic Corporation Ultrasonograph
US20070249911A1 (en) 2006-04-21 2007-10-25 Simon David A Method and apparatus for optimizing a therapy
US8228347B2 (en) 2006-05-08 2012-07-24 C. R. Bard, Inc. User interface and methods for sonographic display device
US20080033293A1 (en) 2006-05-08 2008-02-07 C. R. Bard, Inc. User interface and methods for sonographic display device
US20080021322A1 (en) 2006-05-24 2008-01-24 Michael Benjamin Stone Ultrasonic imaging apparatus and method
US20080033759A1 (en) 2006-08-02 2008-02-07 Vastrac, Inc. Information manager for a procedure-based medical practice
US8211023B2 (en) 2006-08-11 2012-07-03 Koninklijke Philips Electronics N.V. Ultrasound system for cerebral blood flow monitoring
US7905837B2 (en) 2006-09-04 2011-03-15 Ge Medical Systems Global Technology Company, Llc Ultrasound diagnostic apparatus
US20080146915A1 (en) 2006-10-19 2008-06-19 Mcmorrow Gerald Systems and methods for visualizing a cannula trajectory
US20080177186A1 (en) 2007-01-18 2008-07-24 Slater Charles R Methods and Apparatus for Determining a Treatment Volume of a Fluid Treatment Agent for Treating The Interior of a Blood Vessel
US8790263B2 (en) 2007-02-05 2014-07-29 Siemens Medical Solutions Usa, Inc. Automated movement detection with audio and visual information
US20160296208A1 (en) 2007-02-09 2016-10-13 Board Of Regents, The University Of Texas System Intravascular Photoacoustic and Ultrasound Echo Imaging
US9717415B2 (en) 2007-03-08 2017-08-01 Sync-Rx, Ltd. Automatic quantitative vessel analysis at the location of an automatically-detected tool
US20080221425A1 (en) 2007-03-09 2008-09-11 Olson Eric S System and method for local deformable registration of a catheter navigation system to image data or a model
US8050523B2 (en) 2007-04-20 2011-11-01 Koninklijke Philips Electronics N.V. Optical fiber shape sensing systems
US20080294037A1 (en) 2007-05-23 2008-11-27 Jacob Richter Apparatus and Method for Guided Chronic Total Occlusion Penetration
US20100277305A1 (en) 2007-06-01 2010-11-04 Koninklijke Philips Electronics N.V. Wireless Ultrasound Probe Asset Tracking
US20080300491A1 (en) 2007-06-04 2008-12-04 Medtronic, Inc. Percutaneous needle guide and methods of use
US9155517B2 (en) 2007-07-13 2015-10-13 Ezono Ag Opto-electrical ultrasound sensor and system
US9138290B2 (en) 2007-07-27 2015-09-22 Meridian Cardiovascular Systems, Inc. Method of ablating arterial plaque
US20100286515A1 (en) 2007-09-28 2010-11-11 Dietrich Gravenstein Novel Methods and Devices for Noninvasive Measurement of Energy Absorbers in Blood
US8323202B2 (en) 2007-11-16 2012-12-04 Pneumrx, Inc. Method and system for measuring pulmonary artery circulation information
US9554716B2 (en) 2007-11-26 2017-01-31 C. R. Bard, Inc. Insertion guidance system for needles and medical components
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US9492097B2 (en) 2007-11-26 2016-11-15 C. R. Bard, Inc. Needle length determination and calibration for insertion guidance system
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US8849382B2 (en) 2007-11-26 2014-09-30 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
US20090156926A1 (en) 2007-11-26 2009-06-18 C.R. Bard, Inc. Integrated System for Intravascular Placement of a Catheter
US8781555B2 (en) 2007-11-26 2014-07-15 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US20170079548A1 (en) 2007-11-26 2017-03-23 C. R. Bard, Inc. Systems and Methods for Guiding a Medical Instrument
US8388541B2 (en) 2007-11-26 2013-03-05 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US20110295108A1 (en) 2007-11-26 2011-12-01 C.R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9456766B2 (en) 2007-11-26 2016-10-04 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9636031B2 (en) 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
US20140188133A1 (en) 2007-11-26 2014-07-03 C. R. Bard, Inc. Iconic Representations for Guidance of an Indwelling Medical Device
US10524691B2 (en) 2007-11-26 2020-01-07 C. R. Bard, Inc. Needle assembly including an aligned magnetic element
US10449330B2 (en) 2007-11-26 2019-10-22 C. R. Bard, Inc. Magnetic element-equipped needle assemblies
US10751509B2 (en) 2007-11-26 2020-08-25 C. R. Bard, Inc. Iconic representations for guidance of an indwelling medical device
US20090143672A1 (en) 2007-12-04 2009-06-04 Harms Steven E Method for mapping image reference points to facilitate biopsy using magnetic resonance imaging
US20090143684A1 (en) 2007-12-04 2009-06-04 Civco Medical Instruments Co., Inc. Needle guide system for use with ultrasound transducers to effect shallow path needle entry and method of use
US20100179428A1 (en) 2008-03-17 2010-07-15 Worcester Polytechnic Institute Virtual interactive system for ultrasound training
US11311269B2 (en) 2008-04-22 2022-04-26 Ezono Ag Ultrasound imaging system and method for providing assistance in an ultrasound imaging system
US9022940B2 (en) 2008-07-18 2015-05-05 Joseph H. Meier Handheld imaging devices and related methods
US8068581B2 (en) 2008-07-25 2011-11-29 Siemens Aktiengesellschaft Method for representing interventional instruments in a 3D data set of an anatomy to be examined as well as a reproduction system for performing the method
US20100020926A1 (en) 2008-07-25 2010-01-28 Jan Boese Method for representing interventional instruments in a 3d data set of an anatomy to be examined as well as a reproduction system for performing the method
US9468413B2 (en) 2008-09-05 2016-10-18 General Electric Company Method and apparatus for catheter guidance using a combination of ultrasound and X-ray imaging
US8200313B1 (en) 2008-10-01 2012-06-12 Bioquantetics, Inc. Application of image-based dynamic ultrasound spectrography in assisting three dimensional intra-body navigation of diagnostic and therapeutic devices
US20100106015A1 (en) 2008-10-23 2010-04-29 Norris Perry R Medical device alignment
US20150112200A1 (en) 2008-12-18 2015-04-23 C. R. Bard, Inc. Needle Guide Including Enhanced Visibility Entrance
US20130188832A1 (en) 2009-04-14 2013-07-25 Qinglin Ma Systems and methods for adaptive volume imaging
US8781194B2 (en) 2009-04-17 2014-07-15 Tufts Medical Center, Inc. Aneurysm detection
US9702969B2 (en) 2009-05-13 2017-07-11 Koninklijke Philips Electronics N.V. Ultrasonic blood flow doppler audio with pitch shifting
US8556815B2 (en) 2009-05-20 2013-10-15 Laurent Pelissier Freehand ultrasound imaging systems and methods for guiding fine elongate instruments
US20100312121A1 (en) 2009-06-09 2010-12-09 Zhonghui Guan Apparatus for a needle director for an ultrasound transducer probe
US20110002518A1 (en) 2009-07-01 2011-01-06 General Electric Company Method and system for processing ultrasound data
US8939908B2 (en) 2009-07-16 2015-01-27 Unex Corporation Ultrasonic blood vessel inspecting apparatus
US20110071404A1 (en) 2009-09-23 2011-03-24 Lightlab Imaging, Inc. Lumen Morphology and Vascular Resistance Measurements Data Collection Systems, Apparatus and Methods
US20110313293A1 (en) 2009-10-08 2011-12-22 C. R. Bard, Inc. Support and cover structures for an ultrasound probe head
US9649037B2 (en) 2009-12-03 2017-05-16 Deltex Medical Limited Method and apparatus for hemodynamic monitoring using combined blood flow and blood pressure measurement
US9445780B2 (en) 2009-12-04 2016-09-20 University Of Virginia Patent Foundation Tracked ultrasound vessel imaging
US9220477B2 (en) 2009-12-18 2015-12-29 Konica Minolta, Inc. Ultrasonic diagnostic device, and region-to-be-detected image display method and measurement method using same
US20130150724A1 (en) 2010-01-07 2013-06-13 Verathon Inc. Blood vessel access devices, systems, and methods
US9204858B2 (en) 2010-02-05 2015-12-08 Ultrasonix Medical Corporation Ultrasound pulse-wave doppler measurement of blood flow velocity and/or turbulence
US20130131499A1 (en) * 2010-02-09 2013-05-23 Koninklijke Philips Electronics N.V. Apparatus, system and method for imaging and treatment using optical position sensing
US8961420B2 (en) 2010-04-01 2015-02-24 Siemens Medical Solutions Usa, Inc. System for cardiac condition detection and characterization
US8734357B2 (en) 2010-08-12 2014-05-27 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8553954B2 (en) 2010-08-24 2013-10-08 Siemens Medical Solutions Usa, Inc. Automated system for anatomical vessel characteristic determination
US8622913B2 (en) 2010-09-28 2014-01-07 General Electric Company Method and system for non-invasive monitoring of patient parameters
US20130324840A1 (en) 2010-10-08 2013-12-05 Jian Zhongping Detection of blood-vessel wall artifacts
US8585600B2 (en) 2010-12-09 2013-11-19 Ge Medical Systems Global Technology Company, Llc Ultrasound volume probe navigation and control method and device
US9913605B2 (en) 2010-12-22 2018-03-13 Veebot Systems, Inc. Systems and methods for autonomous intravenous needle insertion
US9364171B2 (en) 2010-12-22 2016-06-14 Veebot Systems, Inc. Systems and methods for autonomous intravenous needle insertion
US20120220865A1 (en) 2010-12-31 2012-08-30 Volcano Corporation Pulmonary Embolism Diagnostic Devices and Associated Methods and Systems
US20120179038A1 (en) 2011-01-07 2012-07-12 General Electric Company Ultrasound based freehand invasive device positioning system and method
US20120197132A1 (en) 2011-01-31 2012-08-02 Analogic Corporation Ultrasound imaging apparatus
US20120209121A1 (en) 2011-02-15 2012-08-16 General Electric Company Ultrasound probe including a securing member
US9427207B2 (en) 2011-04-05 2016-08-30 Houston Medical Robotics, Inc. Motorized systems and methods for accessing the lumen of a vessel
US9610061B2 (en) 2011-04-14 2017-04-04 Regents Of The University Of Minnesota Vascular characterization using ultrasound imaging
US20120277576A1 (en) 2011-04-26 2012-11-01 Chun Kee Lui Echogenic infusion port catheter
US9895138B2 (en) 2011-06-06 2018-02-20 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus
US20130041250A1 (en) 2011-08-09 2013-02-14 Ultrasonix Medical Corporation Methods and apparatus for locating arteries and veins using ultrasound
US20140155737A1 (en) 2011-08-16 2014-06-05 Koninklijke Philips N.V. Curved multi-planar reconstruction using fiber optic shape data
US9295447B2 (en) 2011-08-17 2016-03-29 Volcano Corporation Systems and methods for identifying vascular borders
US9814531B2 (en) 2011-08-26 2017-11-14 EBM Corporation System for diagnosing bloodflow characteristics, method thereof, and computer software program
US8744211B2 (en) 2011-08-31 2014-06-03 Analogic Corporation Multi-modality image acquisition
US10674935B2 (en) 2011-09-06 2020-06-09 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
US10765343B2 (en) 2011-09-06 2020-09-08 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
US10758155B2 (en) 2011-09-06 2020-09-01 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
US9597008B2 (en) 2011-09-06 2017-03-21 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
US9731066B2 (en) 2011-09-30 2017-08-15 General Electric Company Device, system and method of automatic vessel access based on real time volumetric ultrasound
US20130218024A1 (en) 2011-10-09 2013-08-22 Clear Guide Medical, Llc Interventional In-Situ Image-Guidance by Fusing Ultrasound and Video
US20130102889A1 (en) 2011-10-21 2013-04-25 C. R. Bard, Inc. Systems and Methods for Ultrasound-Based Medical Device Assessment
US20180228465A1 (en) 2011-10-21 2018-08-16 C. R. Bard, Inc. Systems and Methods for Ultrasound-Based Medical Device Assessment
US9949720B2 (en) 2011-10-21 2018-04-24 C. R. Bard, Inc. Systems and methods for ultrasound-based medical device assessment
WO2013059714A1 (en) 2011-10-21 2013-04-25 C.R.Bard, Inc. Systems and methods for ultrasound-based medical device assessment
US20140036091A1 (en) 2011-11-02 2014-02-06 Seno Medical Instruments, Inc. Interframe energy normalization in an optoacoustic imaging system
US8754865B2 (en) 2011-11-16 2014-06-17 Volcano Corporation Medical measuring system and method
US20130131502A1 (en) 2011-11-18 2013-05-23 Michael Blaivas Blood vessel access system and device
US20140343431A1 (en) 2011-12-16 2014-11-20 Koninklijke Philips N.V. Automatic blood vessel identification by name
US9357980B2 (en) 2012-01-10 2016-06-07 Konica Minolta, Inc. Ultrasound diagnostic apparatus and method for identifying blood vessel
US20140031690A1 (en) 2012-01-10 2014-01-30 Panasonic Corporation Ultrasound diagnostic apparatus and method for identifying blood vessel
US8764663B2 (en) 2012-03-14 2014-07-01 Jeffrey Smok Method and apparatus for locating and distinguishing blood vessel
US8706457B2 (en) 2012-05-14 2014-04-22 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US9715757B2 (en) 2012-05-31 2017-07-25 Koninklijke Philips N.V. Ultrasound imaging system and method for image guidance procedure
US20150294497A1 (en) 2012-05-31 2015-10-15 Koninklijke Philips N.V. Ultrasound imaging system and method for image guidance procedure
US20140005530A1 (en) 2012-06-29 2014-01-02 General Electric Company Ultrasound imaging method and ultrasound imaging apparatus
US20140073976A1 (en) 2012-09-12 2014-03-13 Heartflow, Inc. Systems and methods for estimating ischemia and blood flow characteristics from vessel geometry and physiology
US20140100440A1 (en) 2012-10-05 2014-04-10 Volcano Corporation System and method for instant and automatic border detection
US9814433B2 (en) 2012-10-24 2017-11-14 Cathworks Ltd. Creating a vascular tree model
US20170259013A1 (en) 2012-10-30 2017-09-14 Elwha Llc Systems and Methods for Generating an Injection Guide
US11120709B2 (en) 2012-12-18 2021-09-14 SonoSim, Inc. System and method for teaching basic ultrasound skills
US20140188440A1 (en) 2012-12-31 2014-07-03 Intuitive Surgical Operations, Inc. Systems And Methods For Interventional Procedure Planning
WO2014115150A1 (en) 2013-01-24 2014-07-31 Tylerton International Holdings Inc. Body structure imaging
US9861337B2 (en) 2013-02-04 2018-01-09 General Electric Company Apparatus and method for detecting catheter in three-dimensional ultrasound images
JP2014150928A (en) 2013-02-07 2014-08-25 Hitachi Aloka Medical Ltd Ultrasonic diagnostic device
US9459087B2 (en) 2013-03-05 2016-10-04 Ezono Ag Magnetic position detection system
US9257220B2 (en) 2013-03-05 2016-02-09 Ezono Ag Magnetization device and method
US10434278B2 (en) 2013-03-05 2019-10-08 Ezono Ag System for image guided procedure
US20150359991A1 (en) 2013-03-05 2015-12-17 Ezono Ag System for image guided procedure
US20140276059A1 (en) 2013-03-12 2014-09-18 Volcano Corporation Externally imaging a body structure within a patient
US20140276081A1 (en) 2013-03-12 2014-09-18 St. Jude Medical Puerto Rico Llc Ultrasound assisted needle puncture mechanism
US20140276085A1 (en) 2013-03-13 2014-09-18 Volcano Corporation Coregistered intravascular and angiographic images
US20160202053A1 (en) 2013-03-13 2016-07-14 Hansen Medical, Inc. Reducing incremental measurement sensor error
US20140276690A1 (en) 2013-03-14 2014-09-18 The Spectranetics Corporation Controller to select optical channel parameters in a catheter
US20160029995A1 (en) 2013-03-15 2016-02-04 Nilus Medical Llc Hemodynamic monitoring device and methods of using same
WO2014174305A2 (en) 2013-04-26 2014-10-30 Ucl Business Plc A method and apparatus for determining the location of a medical instrument with respect to ultrasound imaging, and a medical instrument to facilitate such determination
US20160113699A1 (en) 2013-05-23 2016-04-28 CardioSonic Ltd. Devices and methods for renal denervation and assessment thereof
US9456804B2 (en) 2013-06-04 2016-10-04 Seiko Epson Corporation Ultrasound measurement apparatus and ultrasound measurement method
US20160143622A1 (en) 2013-06-26 2016-05-26 Koninklijke Philips N.V. System and method for mapping ultrasound shear wave elastography measurements
US20150005738A1 (en) 2013-06-26 2015-01-01 Corindus, Inc. System and method for monitoring of guide catheter seating
US20150011887A1 (en) 2013-07-04 2015-01-08 Samsung Medison Co., Ltd. Ultrasound system and method for providing object information
WO2015017270A1 (en) 2013-07-29 2015-02-05 Intuitive Surgical Operations, Inc. Shape sensor systems with redundant sensing
US20150065916A1 (en) 2013-08-29 2015-03-05 Vasculogic, Llc Fully automated vascular imaging and access system
US20150073279A1 (en) 2013-09-11 2015-03-12 Boston Scientific Scimed, Inc. Systems and methods for selection and displaying of images using an intravascular ultrasound imaging system
US10380920B2 (en) 2013-09-23 2019-08-13 SonoSim, Inc. System and method for augmented ultrasound simulation using flexible touch sensitive surfaces
US10424225B2 (en) 2013-09-23 2019-09-24 SonoSim, Inc. Method for ultrasound training with a pressure sensing array
US11315439B2 (en) 2013-11-21 2022-04-26 SonoSim, Inc. System and method for extended spectrum ultrasound training using animate and inanimate training objects
US10380919B2 (en) 2013-11-21 2019-08-13 SonoSim, Inc. System and method for extended spectrum ultrasound training using animate and inanimate training objects
US20160029998A1 (en) 2013-12-04 2016-02-04 Obalon Therapeutics, Inc. Systems and methods for locating and/or characterizing intragastric devices
US20150297097A1 (en) 2014-01-14 2015-10-22 Volcano Corporation Vascular access evaluation and treatment
US20150209526A1 (en) 2014-01-14 2015-07-30 Volcano Corporation Devices and methods for forming vascular access
US20150209113A1 (en) 2014-01-29 2015-07-30 Becton, Dickinson And Company Wearable Electronic Device for Enhancing Visualization During Insertion of an Invasive Device
US20210353255A1 (en) * 2014-03-31 2021-11-18 Koninklijke Philips N.V. Haptic feedback for ultrasound image acquisition
US20150327841A1 (en) 2014-05-13 2015-11-19 Kabushiki Kaisha Toshiba Tracking in ultrasound for imaging and user interface
US20170196535A1 (en) 2014-06-04 2017-07-13 Hitachi, Ltd. Medical treatment system
US20160278743A1 (en) 2014-06-11 2016-09-29 Olympus Corporation Medical diagnostic apparatus, method for operating medical diagnostic apparatus, and computer-readable recording medium
US9320493B2 (en) 2014-07-08 2016-04-26 Nadarasa Visveshwara System and method for measuring fluidics in arteries
US20170215842A1 (en) 2014-08-28 2017-08-03 Samsung Electronics Co., Ltd. Ultrasound diagnosis apparatus for self-diagnosis and remote-diagnosis, and method of operating the ultrasound diagnosis apparatus
US10043272B2 (en) 2014-09-16 2018-08-07 Esaote S.P.A. Method and apparatus for acquiring and fusing ultrasound images with pre-acquired images
US20170188839A1 (en) 2014-09-25 2017-07-06 Fujifilm Corporation Photoacoustic image generation apparatus
US20160100970A1 (en) 2014-10-09 2016-04-14 Obalon Therapeutics, Inc. Ultrasonic systems and methods for locating and/or characterizing intragastric devices
US20160101263A1 (en) 2014-10-10 2016-04-14 Intuitive Surgical Operations, Inc. Systems and methods for reducing measurement error using optical fiber shape sensors
US20160120607A1 (en) 2014-11-03 2016-05-05 Michael Sorotzkin Ultrasonic imaging device for examining superficial skin structures during surgical and dermatological procedures
US20180279996A1 (en) 2014-11-18 2018-10-04 C.R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US20170265840A1 (en) * 2014-12-01 2017-09-21 Koninklijke Philips N.V. Registration of optical shape sensing tool
US20160166232A1 (en) 2014-12-10 2016-06-16 Volcano Corporation Devices, systems, and methods for in-stent restenosis prediction
US20170172424A1 (en) * 2014-12-22 2017-06-22 Eggers & Associates, Inc. Wearable Apparatus, System and Method for Detection of Cardiac Arrest and Alerting Emergency Response
US20170303894A1 (en) 2015-01-08 2017-10-26 The Charlotte Mecklenburg Hospital Authority D/B/A Carolinas Healthcare System Ultrasound probe couplers and related methods
US20160278869A1 (en) 2015-01-19 2016-09-29 Bard Access Systems, Inc. Device and Method for Vascular Access
US20160213398A1 (en) 2015-01-26 2016-07-28 Ming-Wei Liu Ultrasound needle guide apparatus
US20180125450A1 (en) 2015-04-24 2018-05-10 U.S. Government, As Represented By The Secretary Of The Army Vascular Targeting System
US20180161502A1 (en) 2015-06-15 2018-06-14 The University Of Sydney Insertion system and method
US20190365348A1 (en) * 2015-06-23 2019-12-05 Hemonitor Medical Ltd. Systems and methods for hand-free continuous ultrasonic monitoring
US20160374644A1 (en) 2015-06-25 2016-12-29 Rivanna Medical Llc Ultrasonic Guidance of a Probe with Respect to Anatomical Features
US11600201B1 (en) 2015-06-30 2023-03-07 The Regents Of The University Of California System and method for converting handheld diagnostic ultrasound systems into ultrasound training systems
US20180199914A1 (en) 2015-07-22 2018-07-19 Koninklijke Philips N.V. Fiber-optic realshape sensor for enhanced dopper measurement display
US20180250078A1 (en) 2015-09-10 2018-09-06 Xact Robotics Ltd. Systems and methods for guiding the insertion of a medical tool
US20170086785A1 (en) 2015-09-30 2017-03-30 General Electric Company System and method for providing tactile feedback via a probe of a medical imaging system
US20190069923A1 (en) 2015-11-08 2019-03-07 Qin Wang Paracentesis needle frame
WO2017096487A1 (en) 2015-12-10 2017-06-15 1929803 Ontario Corp. D/B/A Ke2 Technologies Systems and methods for automated fluid response measurement
US20170164923A1 (en) 2015-12-14 2017-06-15 Konica Minolta, Inc. Image Processor, Ultrasound Diagnostic Device Including Same, And Image Processing Method
EP3181083A1 (en) 2015-12-18 2017-06-21 Biosense Webster (Israel), Ltd. Using force sensor to give angle of ultrasound beam
US20190088019A1 (en) 2016-03-16 2019-03-21 Koninklijke Philips N.V. Calculation device for superimposing a laparoscopic image and an ultrasound image
US20190117190A1 (en) * 2016-04-19 2019-04-25 Koninklijke Philips N.V. Ultrasound imaging probe positioning
WO2017214428A1 (en) 2016-06-08 2017-12-14 The United States Of America, As Represented By The Secretary, Department Of Health And Human Services Tissue characterization with acoustic wave tomosynthesis
US20170367678A1 (en) 2016-06-22 2017-12-28 Cesare Sirtori Ultrasound automated method for measuring the thickness of the walls of the left anterior descending, right and circumflex coronary arteries
US20180015256A1 (en) 2016-07-14 2018-01-18 C. R. Bard, Inc. Automated Catheter-To-Vessel Size Comparison Tool And Related Methods
WO2018026878A1 (en) 2016-08-02 2018-02-08 Avent, Inc. Motor-assisted needle guide assembly for ultrasound needle placement
US20180116723A1 (en) 2016-10-28 2018-05-03 Medtronic Ardian Luxembourg S.A.R.L. Methods and Systems for Optimizing Perivascular Neuromodulation Therapy Using Computational Fluid Dynamics
US20190298457A1 (en) 2016-11-08 2019-10-03 Koninklijke Philips N.V. System and method for tracking an interventional instrument with feedback concerning tracking reliability
KR20180070878A (en) 2016-12-19 2018-06-27 지멘스 메디컬 솔루션즈 유에스에이, 인크. Method of providing annotation information of ultrasound probe and ultrasound system
WO2018134726A1 (en) 2017-01-20 2018-07-26 Politecnico Di Torino Method and apparatus to characterise non-invasively images containing venous blood vessels
US20210166583A1 (en) * 2017-01-24 2021-06-03 Tienovix, Llc System and method for three-dimensional augmented reality guidance for use of equipment
US20180225993A1 (en) 2017-01-24 2018-08-09 Tietronix Software, Inc. System and method for three-dimensional augmented reality guidance for use of medical equipment
US10896628B2 (en) 2017-01-26 2021-01-19 SonoSim, Inc. System and method for multisensory psychomotor skill training
US20180214119A1 (en) 2017-01-27 2018-08-02 Wayne State University Ultrasound and photoacoustic systems and methods for fetal brain assessment during delivery
US20180235576A1 (en) 2017-02-22 2018-08-23 Covidien Lp Ultrasound doppler and elastography for ablation prediction and monitoring
US20180272108A1 (en) 2017-03-27 2018-09-27 Biosense Webster (Israel) Ltd Catheter with improved loop contraction and greater contraction displacement
US20180286287A1 (en) 2017-03-28 2018-10-04 Covidien Lp System and methods for training physicians to perform ablation procedures
JP2018175547A (en) 2017-04-17 2018-11-15 Nipro Corporation Puncture guide and ultrasound diagnostic apparatus with puncture guide
US20180310955A1 (en) 2017-04-27 2018-11-01 Bard Access Systems, Inc. Magnetizing System For Needle Assemblies
US20180317881A1 (en) 2017-05-05 2018-11-08 International Business Machines Corporation Automating ultrasound examination of a vascular system
WO2018206473A1 (en) 2017-05-11 2018-11-15 Koninklijke Philips N.V. Workflow, system and method for motion compensation in ultrasound procedures
US20200113540A1 (en) 2017-06-07 2020-04-16 Koninklijke Philips N.V. Ultrasound system and method
US20180366035A1 (en) 2017-06-20 2018-12-20 Ezono Ag System and method for image-guided procedure analysis and training
KR20190013133A (en) 2017-07-31 2019-02-11 Jesus Hospital Foundation Apparatus for guiding syringe arrangement based on ultrasonic probe
US20200188028A1 (en) 2017-08-21 2020-06-18 The Trustees Of Columbia University In The City Of New York Systems and methods for augmented reality guidance
US20190060014A1 (en) 2017-08-30 2019-02-28 Intuitive Surgical Operations, Inc. System and method for providing on-demand functionality during a medical procedure
US20190076121A1 (en) 2017-09-13 2019-03-14 Bard Access Systems, Inc. Ultrasound Finger Probe
US20200041261A1 (en) 2017-10-06 2020-02-06 Advanced Scanners, Inc. Generation of one or more edges of luminosity to form three-dimensional models of objects
US20190105017A1 (en) 2017-10-11 2019-04-11 Geoffrey Steven Hastings Laser assisted ultrasound guidance
US20190223757A1 (en) 2017-12-04 2019-07-25 Bard Access Systems, Inc. Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US20210307838A1 (en) * 2017-12-29 2021-10-07 Weipeng (Suzhou) Co., Ltd. Surgical navigation method and system
US20190239850A1 (en) 2018-02-06 2019-08-08 Steven Philip Dalvin Augmented/mixed reality system and method for the guidance of a medical exam
EP3530221A1 (en) 2018-02-26 2019-08-28 Covidien LP System and method for performing a percutaneous navigation procedure
US20190282324A1 (en) 2018-03-15 2019-09-19 Zoll Medical Corporation Augmented Reality Device for Providing Feedback to an Acute Care Provider
US20190307516A1 (en) 2018-04-06 2019-10-10 Medtronic, Inc. Image-based navigation system and method of using same
US20190339525A1 (en) 2018-05-07 2019-11-07 The Cleveland Clinic Foundation Live 3d holographic guidance and navigation for performing interventional procedures
US20190355278A1 (en) 2018-05-18 2019-11-21 Marion Surgical Inc. Virtual reality surgical system including a surgical tool assembly with haptic feedback
WO2019232451A1 (en) 2018-05-31 2019-12-05 Matt Mcgrath Design & Co, Llc Method of medical imaging using multiple arrays
WO2019232454A9 (en) 2018-05-31 2020-02-27 Matt Mcgrath Design & Co, Llc Anatomical attachment device and associated method of use
WO2020002620A1 (en) 2018-06-29 2020-01-02 Koninklijke Philips N.V. Biopsy prediction and guidance with ultrasound imaging and associated devices, systems, and methods
WO2020016018A1 (en) 2018-07-18 2020-01-23 Koninklijke Philips N.V. Automatic image vetting on a handheld medical scanning device
WO2020044769A1 (en) 2018-08-27 2020-03-05 Fujifilm Corporation Ultrasound diagnosis device and ultrasound diagnosis device control method
US20200069285A1 (en) 2018-08-31 2020-03-05 General Electric Company System and method for ultrasound navigation
US20200129136A1 (en) 2018-10-31 2020-04-30 Medtronic, Inc. Real-time rendering and referencing for medical procedures
WO2020102665A1 (en) 2018-11-16 2020-05-22 Lang Philipp K Augmented reality guidance for surgical procedures with adjustment of scale, convergence and focal plane or focal point of virtual data
US20200230391A1 (en) 2019-01-18 2020-07-23 Becton, Dickinson And Company Intravenous therapy system for blood vessel detection and vascular access device placement
WO2020186198A1 (en) 2019-03-13 2020-09-17 University Of Florida Research Foundation Guidance and tracking system for templated and targeted biopsy and treatment
US20210007710A1 (en) * 2019-07-12 2021-01-14 Verathon Inc. Representation of a target during aiming of an ultrasound probe
US20210045716A1 (en) 2019-08-13 2021-02-18 GE Precision Healthcare LLC Method and system for providing interaction with a visual artificial intelligence ultrasound image segmentation module
US20230113291A1 (en) 2020-04-02 2023-04-13 Koninklijke Philips N.V. Ultrasound probe, user console, system and method
US20230240643A1 (en) 2020-06-16 2023-08-03 Innovacell Ag Parallel path puncture device guide and method
US20210402144A1 (en) * 2020-06-29 2021-12-30 Bard Access Systems, Inc. Automatic Dimensional Frame Reference for Fiber Optic
US20240058074A1 (en) 2020-07-21 2024-02-22 Bard Access Systems, Inc. System, Method and Apparatus for Magnetic Tracking of Ultrasound Probe and Generation of 3D Visualization Thereof
US20220022969A1 (en) 2020-07-21 2022-01-27 Bard Access Systems, Inc. System, Method and Apparatus for Magnetic Tracking of Ultrasound Probe and Generation of 3D Visualization Thereof
WO2022031762A1 (en) 2020-08-04 2022-02-10 Bard Access Systems, Inc. System and method for optimized medical component insertion monitoring and imaging enhancement
US20220039685A1 (en) 2020-08-04 2022-02-10 Bard Access Systems, Inc. System and Method for Optimized Medical Component Insertion Monitoring and Imaging Enhancement
US20220039777A1 (en) 2020-08-10 2022-02-10 Bard Access Systems, Inc. System and Method for Generating Vessel Representations in Mixed Reality/Virtual Reality
US20220381630A1 (en) 2020-09-18 2022-12-01 Bard Access Systems, Inc. Ultrasound Probe with Pointer Remote Control Capability
WO2022203713A2 (en) 2020-09-18 2022-09-29 Bard Access Systems, Inc. Ultrasound probe with pointer remote control capability
US20220096797A1 (en) 2020-09-25 2022-03-31 Bard Access Systems, Inc. Minimum Catheter Length Tool
WO2022072727A2 (en) 2020-10-02 2022-04-07 Bard Access Systems, Inc. Ultrasound systems and methods for sustained spatial attention
US20220104886A1 (en) 2020-10-02 2022-04-07 Bard Access Systems, Inc. Ultrasound Systems and Methods for Sustained Spatial Attention
US20220117582A1 (en) 2020-10-15 2022-04-21 Bard Access Systems, Inc. Ultrasound Imaging System for Generation of a Three-Dimensional Ultrasound Image
WO2022081904A1 (en) 2020-10-15 2022-04-21 Bard Access Systems, Inc. Ultrasound imaging system for generation of a three-dimensional ultrasound image
US20220160434A1 (en) 2020-11-24 2022-05-26 Bard Access Systems, Inc. Ultrasound System with Target and Medical Instrument Awareness
US20220172354A1 (en) 2020-12-01 2022-06-02 Bard Access Systems, Inc. Ultrasound System with Pressure and Flow Determination Capability
US20220168050A1 (en) 2020-12-01 2022-06-02 Bard Access Systems, Inc. Ultrasound Probe with Target Tracking Capability
US20220211442A1 (en) 2021-01-06 2022-07-07 Bard Access Systems, Inc. Needle Guidance Using Fiber Optic Shape Sensing
WO2022263763A1 (en) * 2021-06-16 2022-12-22 Quantum Surgical Medical robot for placement of medical instruments under ultrasound guidance
CN114129137B (en) * 2021-12-02 2022-09-09 深圳先进技术研究院 Intravascular imaging system, device and imaging method
WO2023235435A1 (en) 2022-06-03 2023-12-07 Bard Access Systems, Inc. Ultrasound probe with smart accessory
US20230389893A1 (en) 2022-06-03 2023-12-07 Bard Access Systems, Inc. Ultrasound Probe with Smart Accessory
WO2024010940A1 (en) 2022-07-08 2024-01-11 Bard Access Systems, Inc. Systems and methods for intelligent ultrasound probe guidance
US20240050061A1 (en) 2022-08-15 2024-02-15 Bard Access Systems, Inc. Spatially Aware Medical Device Configured for Performance of Insertion Pathway Approximation
WO2024039608A1 (en) 2022-08-15 2024-02-22 Bard Access Systems, Inc. Spatially aware medical device configured for performance of insertion pathway approximation
US20240062678A1 (en) 2022-08-17 2024-02-22 Bard Access Systems, Inc. Ultrasound Training System
WO2024039719A1 (en) 2022-08-17 2024-02-22 Bard Access Systems, Inc. Ultrasound training system

Non-Patent Citations (52)

* Cited by examiner, † Cited by third party
Title
EZono, eZSimulator, https://www.ezono.com/en/ezsimulator/, last accessed Sep. 13, 2022.
Ikhsan, Mohammad, et al., "Assistive technology for ultrasound-guided central venous catheter placement", Journal of Medical Ultrasonics, Japan Society of Ultrasonics in Medicine, Tokyo, JP, vol. 45, No. 1, Apr. 19, 2017, pp. 41-57, XP036387340, ISSN: 1346-4523, DOI: 10.1007/S10396-017-0789-2 [retrieved on Apr. 19, 2017].
Lu, Zhenyu, et al., "Recent advances in robot-assisted echography: combining perception, control and cognition", Cognitive Computation and Systems, The Institution of Engineering and Technology, Michael Faraday House, Six Hills Way, Stevenage, Herts., SG1 2AY, UK, vol. 2, No. 3, Sep. 2, 2020.
Pagoulatos, N., et al., "New spatial localizer based on fiber optics with applications in 3D ultrasound imaging", Proceedings of SPIE, vol. 3976 (Apr. 18, 2000).
PCT/US2021/042369 filed Jul. 20, 2021 International Search Report and Written Opinion dated Oct. 25, 2021.
PCT/US2021/044419 filed Aug. 3, 2021 International Search Report and Written Opinion dated Nov. 19, 2021.
PCT/US2021/045218 filed Aug. 9, 2021 International Search Report and Written Opinion dated Nov. 23, 2021.
PCT/US2021/050973 filed Sep. 17, 2021 International Search Report and Written Opinion dated Nov. 7, 2022.
PCT/US2021/053018 filed Sep. 30, 2021 International Search Report and Written Opinion dated May 3, 2022.
PCT/US2021/055076 filed Oct. 14, 2021 International Search Report and Written Opinion dated Mar. 25, 2022.
PCT/US2023/024067 filed May 31, 2023 International Search Report and Written Opinion dated Sep. 15, 2023.
PCT/US2023/030160 filed Aug. 14, 2023 International Search Report and Written Opinion dated Oct. 23, 2023.
PCT/US2023/030347 filed Aug. 16, 2023 International Search Report and Written Opinion dated Nov. 6, 2023.
"Practical guide for safe central venous catheterization and management 2017", Journal of Anesthesia, vol. 34, published online Nov. 30, 2019, pp. 167-186.
Sebastian Vogt: "Real-Time Augmented Reality for Image-Guided Interventions", Oct. 5, 2009, XP055354720, Retrieved from the Internet: URL: https://opus4.kobv.de/opus4-fau/frontdoor/deliver/index/docld/1235/file/SebastianVogtDissertation.pdf.
Sonosim, https://sonosim.com/ultrasound-simulation/?, last accessed Sep. 13, 2022.
State, A., et al. (Aug. 1996). Technologies for augmented reality systems: Realizing ultrasound-guided needle biopsies. In Proceedings of the 23rd annual conference on computer graphics and interactive techniques (pp. 439-446) (Year: 1996).
Stolka, P.J., et al., (2014). Needle Guidance Using Handheld Stereo Vision and Projection for Ultrasound-Based Interventions. In: Golland, P., Hata, N., Barillot, C., Hornegger, J., Howe, R. (eds) Medical Image Computing and Computer-Assisted Intervention—MICCAI 2014. MICCAI 2014. (Year: 2014).
U.S. Appl. No. 17/380,767, filed Jul. 20, 2021 Non-Final Office Action dated Mar. 6, 2023.
U.S. Appl. No. 17/380,767, filed Jul. 20, 2021 Notice of Allowance dated Aug. 31, 2023.
U.S. Appl. No. 17/380,767, filed Jul. 20, 2021 Restriction Requirement dated Dec. 15, 2022.
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Advisory Action dated Jan. 19, 2024.
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Final Office Action dated Oct. 16, 2023.
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Non-Final Office Action dated Feb. 29, 2024.
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Non-Final Office Action dated Mar. 31, 2023.
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Restriction Requirement dated Jan. 12, 2023.
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Advisory Action dated Oct. 5, 2023.
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Final Office Action dated Aug. 4, 2023.
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Non-Final Office Action dated Jan. 23, 2023.
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Non-Final Office Action dated Mar. 1, 2024.
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Notice of Allowance dated Jul. 10, 2024.
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Restriction Requirement dated Aug. 12, 2022.
U.S. Appl. No. 17/478,754, filed Sep. 17, 2021 Non-Final Office Action dated Jul. 1, 2024.
U.S. Appl. No. 17/478,754, filed Sep. 17, 2021 Restriction Requirement dated Jan. 22, 2024.
U.S. Appl. No. 17/491,308, filed Sep. 30, 2021 Board Decision dated Oct. 25, 2023.
U.S. Appl. No. 17/491,308, filed Sep. 30, 2021 Final Office Action dated Aug. 29, 2023.
U.S. Appl. No. 17/491,308, filed Sep. 30, 2021 Non-Final Office Action dated Jun. 5, 2023.
U.S. Appl. No. 17/491,308, filed Sep. 30, 2021 Non-Final Office Action dated Mar. 22, 2024.
U.S. Appl. No. 17/491,308, filed Sep. 30, 2021 Notice of Allowance dated Jun. 27, 2024.
U.S. Appl. No. 17/491,308, filed Sep. 30, 2021 Restriction Requirement dated Feb. 27, 2023.
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Advisory Action dated Jan. 24, 2024.
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Final Office Action dated Aug. 5, 2024.
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Final Office Action dated Nov. 21, 2023.
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Non-Final Office Action dated Jun. 6, 2023.
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Non-Final Office Action dated Mar. 21, 2024.
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Restriction Requirement dated Feb. 1, 2023.
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Advisory Action dated Apr. 4, 2024.
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Final Office Action dated Jan. 25, 2024.
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Non-Final Office Action dated Oct. 6, 2023.
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Notice of Allowance dated May 15, 2024.
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Restriction Requirement dated Jul. 13, 2023.
William F Garrett et al: "Real-time incremental visualization of dynamic ultrasound volumes using parallel BSP trees", Visualization '96. Proceedings, IEEE, NE, Oct. 27, 1996, pp. 235-ff, XP058399771, ISBN: 978-0-89791-864-0 abstract, figures 1-7, pp. 236-240.

Also Published As

Publication number Publication date
CN220655593U (en) 2024-03-26
EP4543303A1 (en) 2025-04-30
WO2024010940A1 (en) 2024-01-11
CN117357158A (en) 2024-01-09
US20240008929A1 (en) 2024-01-11

Similar Documents

Publication Publication Date Title
US12137989B2 (en) Systems and methods for intelligent ultrasound probe guidance
EP4251063B1 (en) Ultrasound probe with target tracking capability
US11992363B2 (en) Dynamically adjusting ultrasound-imaging systems and methods thereof
EP3530221B1 (en) System for performing a percutaneous navigation procedure
US20230132148A1 (en) High Fidelity Doppler Ultrasound Using Vessel Detection For Relative Orientation
US20230138970A1 (en) Optimized Functionality Through Interoperation of Doppler and Image Based Vessel Differentiation
US20230135562A1 (en) Doppler-Based Vein-Artery Detection for Vascular Assessment
CN217310576U (en) Guidance system for assisting the advancement of a medical component within a patient
US20240050061A1 (en) Spatially Aware Medical Device Configured for Performance of Insertion Pathway Approximation
CN213372456U (en) Guidance system for guiding a needle into a patient's body
CN105828721B (en) Robotic ultrasound for shape sensing for minimally invasive interventions
US20230148993A1 (en) Ultrasound Probe with Integrated Data Collection Methodologies
US20230147164A1 (en) Systems and Methods for Artificial Intelligence Enabled Ultrasound Correlation
US20240016548A1 (en) Method and system for monitoring an orientation of a medical object

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

AS Assignment

Owner name: BARD ACCESS SYSTEMS, INC., UTAH

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MISENER, ANTHONY K.;SOWARDS, STEFFAN;MCLAUGHLIN, WILLIAM ROBERT;SIGNING DATES FROM 20220703 TO 20220705;REEL/FRAME:068313/0258

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP, ISSUE FEE PAYMENT VERIFIED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE
