US20180263710A1 - Medical imaging apparatus and surgical navigation system - Google Patents
Medical imaging apparatus and surgical navigation system
- Publication number
- US20180263710A1 (application US 15/761,507 / US201615761507A)
- Authority: US (United States)
- Prior art keywords: surgical, information, imaging device, unit, image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION (all classifications below fall under this hierarchy)
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B90/14—Fixators for body parts, e.g. skull clamps; Constructional details of fixators, e.g. pins
- A61B90/20—Surgical microscopes characterised by non-optical aspects
- A61B90/25—Supports therefor (surgical microscopes)
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2017/00207—Electrical control of surgical instruments with hand gesture control or hand gesture recognition
- A61B2017/00216—Electrical control of surgical instruments with eye tracking or head position tracking control
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
- A61B2090/3612—Image-producing devices, e.g. surgical cameras, with images taken automatically
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Augmented reality, i.e. correlating a live optical image with another image
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
Definitions
- FIG. 8 is a flow chart of the processing executed by the navigation control device 60 of the navigation apparatus 50 in grasping a surgical field.
- the navigation control device 60 acquires the relative position of the head of the patient 1 from the control device 100 of the imaging apparatus 10 .
- the navigation control device 60 calls up, from the memory device 56 , at least one of a 3D model and a preoperative image of the head of the patient 1 of which the relative positional relationship with the origin P 0 is found in advance, and superimposes the relative position of the head of the patient 1 transmitted from the position computation unit 110 to produce 3D image information for display.
- the navigation control device 60 outputs the produced 3D image information to the display device 54 , and causes the display device 54 to display the image.
- in step S 130 , the position calculation unit 116 transmits the 3D image information captured by the stereo camera 14 A and the information of the relative three-dimensional coordinates of the feature point to the navigation control device 60 .
- the comparison and matching between the position of the feature point and the position of the corresponding reference point in the preoperative image or the 3D model can be performed, and the comparison result may be displayed on the display device 54 .
- the user adjusts the posture of the arm unit 30 so that the head of the patient 1 in the captured image and the preoperative image or the 3D model are registered.
- FIG. 10 is a flow chart of the automatic registration processing performed by the arm posture control unit 120 .
- the position computation unit 110 of the control device 100 performs step S 122 to step S 130 in accordance with the flow chart shown in FIG. 9 .
- in step S 132 , the arm posture control unit 120 of the control device 100 acquires, from the navigation control device 60 , the result of comparison between the position of the feature point and the position of the corresponding reference point in the preoperative image or the 3D model.
- the arm posture control unit 120 goes to step S 136 and determines the pivot point at the time of moving the position of the stereo camera 14 A.
- the arm posture control unit 120 may calculate the position of a virtual center of the head of the patient 1 that is stereoscopically reconstructed, and may take the position of the virtual center as the pivot point.
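- Once such a pivot point is chosen, the stereo camera 14 A can be driven on a sphere around it while staying aimed at it. The following is a minimal sketch of that geometry, assuming numpy; the function names, the camera-axis convention, and the numbers are illustrative assumptions rather than anything taken from this disclosure.

```python
import numpy as np

def look_at(cam_pos, target, up=np.array([0.0, 0.0, 1.0])):
    # Orientation whose z-axis (taken here as the optical axis) points from
    # the camera toward the target; up must not be parallel to that axis.
    z = target - cam_pos
    z = z / np.linalg.norm(z)
    x = np.cross(up, z)
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])

def orbit_about_pivot(cam_pos, pivot, axis, angle_rad):
    # Rodrigues rotation of the camera position about an axis through the
    # pivot; the returned orientation keeps the camera aimed at the pivot.
    axis = axis / np.linalg.norm(axis)
    v = cam_pos - pivot
    v_rot = (v * np.cos(angle_rad)
             + np.cross(axis, v) * np.sin(angle_rad)
             + axis * np.dot(axis, v) * (1.0 - np.cos(angle_rad)))
    new_pos = pivot + v_rot
    return new_pos, look_at(new_pos, pivot)

# Example: pivot at the virtual center of the head, camera 250 mm behind and
# 200 mm above it, orbiting 10 degrees about the vertical axis.
pos, R = orbit_about_pivot(np.array([0.0, -250.0, 200.0]),
                           np.array([0.0, 0.0, 0.0]),
                           np.array([0.0, 0.0, 1.0]),
                           np.deg2rad(10.0))
```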
- the reference marker 134 and the surgical instrument marker 130 may be an optical marker including four marker units serving as marks for detecting the position or posture.
- a configuration is possible in which marker units with a distinctive color such as red are used and the position and posture of the marker are detected on the basis of 3D image information acquired by the stereo camera 14 A. Since the positional relationships among the four marker units in the captured image vary with the position and posture of the marker, the position calculation unit 116 can identify the position and posture of the marker by detecting the positional relationships among the four marker units.
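- A rough sketch of that detection, assuming OpenCV and numpy: the marker units are segmented by color in the rectified left and right images, triangulated into the camera frame, and a rigid transform is fitted against a known marker geometry with the Kabsch method. The marker dimensions, color thresholds, and the crude top-to-bottom matching of blobs are hypothetical simplifications, not the patent's method.

```python
import cv2
import numpy as np

# Hypothetical geometry of the four marker units in the marker's own frame
# (mm); the real dimensions are not given in this disclosure.
MARKER_MODEL = np.array([[0., 0., 0.], [40., 0., 0.],
                         [0., 40., 0.], [40., 40., 20.]])

def detect_marker_units(bgr):
    # Segment the distinctively colored (here: red) marker units and return
    # their image centroids, sorted top-to-bottom as a crude correspondence.
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = (cv2.inRange(hsv, (0, 120, 80), (10, 255, 255))
            | cv2.inRange(hsv, (170, 120, 80), (180, 255, 255)))
    _, _, _, centroids = cv2.connectedComponentsWithStats(mask)
    pts = centroids[1:]                      # drop the background component
    assert len(pts) == 4, "expected exactly four marker units"
    return pts[np.argsort(pts[:, 1])]

def marker_pose(left_bgr, right_bgr, P_left, P_right):
    # Triangulate the four units from the rectified stereo pair, then fit a
    # rigid transform (Kabsch) taking MARKER_MODEL onto the observed points.
    ptsL = detect_marker_units(left_bgr).T.astype(np.float64)   # 2x4
    ptsR = detect_marker_units(right_bgr).T.astype(np.float64)  # 2x4
    Xh = cv2.triangulatePoints(P_left, P_right, ptsL, ptsR)     # 4x4 homog.
    X = (Xh[:3] / Xh[3]).T                   # unit positions, camera frame
    mm, xm = MARKER_MODEL.mean(0), X.mean(0)
    U, _, Vt = np.linalg.svd((MARKER_MODEL - mm).T @ (X - xm))
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:                 # guard against a reflection
        Vt[-1] *= -1
        R = (U @ Vt).T
    t = xm - R @ mm
    return R, t    # posture and position of the marker in the camera frame
```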
- the processing of grasping a surgical field is basically executed in accordance with the flow chart shown in FIG. 7 .
- a predetermined position specified on the basis of the reference marker 134 is taken as the origin P 0 of the three-dimensional coordinate system. Therefore, in step S 106 of FIG. 7 , on the basis of the posture information of the arm unit 30 and the information of the focal distance of the stereo camera 14 A, the position calculation unit 116 calculates the relative three-dimensional coordinates of the head of the patient 1 of which the origin P 0 is the predetermined position specified on the basis of the reference marker 134 .
- the origin P 0 may be set in advance as, for example, the position of the three-dimensional coordinates of the reference marker 134 calculated on the basis of the posture information of the arm unit 30 and the camera parameters outputted from the stereo camera 14 A.
- the position of the reference marker 134 serving as the origin P 0 may be the position of any one of the four marker units of the reference marker 134 , or may be an arbitrary position that is other than the marker unit and has a fixed relative position to the reference marker 134 .
- the three-dimensional coordinates with respect to the arbitrary origin P 0 may be defined by the posture of the reference marker 134 . That is, the position calculation unit 116 may specify the three axes of x, y, and z on the basis of the posture of the identified reference marker 134 . Thereby, the position calculation unit 116 can find the relative three-dimensional coordinates of the head of the patient 1 to the origin P 0 .
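- In code, finding those relative coordinates amounts to assembling a homogeneous transform from the identified posture and position of the reference marker 134 and inverting it. A sketch assuming numpy; the names are illustrative:

```python
import numpy as np

def frame_from_marker(R_marker, t_marker):
    # Assemble the transform from the marker (origin P 0) frame to the
    # camera frame out of the identified posture and position.
    T = np.eye(4)
    T[:3, :3] = R_marker
    T[:3, 3] = t_marker
    return T

def to_p0_frame(p_cam, T_cam_from_p0):
    # Express a camera-frame point (e.g., a feature point on the patient's
    # head) as relative three-dimensional coordinates with respect to P 0.
    return (np.linalg.inv(T_cam_from_p0) @ np.append(p_cam, 1.0))[:3]
```

Because P 0 travels with the reference marker 134 , coordinates expressed this way remain valid even as the arm unit 30 moves the camera.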
- in step S 192 , the position calculation unit 116 transmits the calculated relative position of the tip of the probe 148 and the calculated posture information of the probe 148 to the navigation control device 60 . After that, the procedure returns to step S 182 , and step S 182 to step S 192 are repeated.
- the navigation control device 60 acquires, from the control device 100 of the imaging apparatus 10 , the relative position of the tip of the probe 148 and the posture information of the probe 148 , depicts the probe 148 on the image information of the head of the patient 1 , and causes the display device 54 to display the image of the probe 148 in real time. Thereby, even when the tip of the probe 148 has entered the interior of the body, the operator can move the tip of the surgical instrument to a desired position while viewing navigation display displayed on the display device 54 .
- the control device 100 may operate so as to capture the reference marker 134 in the captured image at an appropriate timing, and may execute the examination of positional shift and the automatic correction of the posture information of the arm unit 30 .
- FIG. 18 shows a flow chart of recalibration processing.
- the position calculation unit 116 sends a command to the arm posture control unit 120 to cause the arm posture control unit 120 to change the posture of the arm unit 30 so that the reference marker 134 comes within the captured image of the stereo camera 14 A.
- a surgical information processing apparatus including:
- the surgical information processing apparatus according to (1) to (4), wherein the first image information is obtained in the registration mode at a different perspective than the second image information that is obtained in the imaging mode.
- the medical image processing method wherein the position determination is further performed by determining the first position information indicating a position of the medical imaging device with respect to the predetermined position based on the arm position information and by determining the second position information from a stereoscopic distance between the patient and the medical imaging device.
- the medical imaging apparatus according to any one of (1A) to (9A), wherein the arm posture information detection unit detects the posture information on the basis of an output of an encoder provided in the joint unit.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Veterinary Medicine (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Robotics (AREA)
- Neurosurgery (AREA)
- Gynecology & Obstetrics (AREA)
- Radiology & Medical Imaging (AREA)
- Microscopes, Condenser (AREA)
Abstract
Description
- This application claims the benefit of Japanese Priority Patent Application JP 2015-252869 filed Dec. 25, 2015, the entire contents of which are incorporated herein by reference.
- The present disclosure relates to a surgical information processing apparatus and method.
- Surgical navigation systems for assisting accurate operations have been known for some time. A surgical navigation system is used in fields such as neurosurgery, otolaryngology, and orthopedics; it displays an image in which an MRI image, a 3D model, or the like prepared in advance is superimposed on a captured image of a surgical field, and thus assists an operation so that the operation is advanced in accordance with a prior plan. Such a surgical navigation system includes, for example, a position detection device for detecting the position of a microscope, a patient, or a surgical instrument. That is, since neither the microscope nor the surgical instrument has a section for acquiring the relative three-dimensional positional relationship between itself and the patient, a section for finding the mutual positional relationship is necessary.
- As such a position detection device, for example, a device using an optical marker and an optical sensor is known. PTL 1 discloses a section for detecting the position and posture of a rigid scope, composed of a position sensor formed of a photodetector such as a CCD camera, a light emitting unit formed of a light source such as an LED and provided at the rigid scope serving as a surgical instrument, and a position calculation unit.
- PTL 1: JP 2002-102249A
- However, in the optical position detection device disclosed in PTL 1, when a physical shield is present between the light emitting unit provided at the rigid scope and the optical sensor, position detection may no longer be possible. For example, there are many surgical instruments and surgical staff members in the surgical place; hence, to prevent a physical shield between the light emitting unit and the optical sensor, an inconvenience such as the necessity to install the optical sensor in a high position may occur.
- Other than the optical position detection device, there is a magnetic field-type position detection device using a magnetic field generating device and a magnetic sensor; but in the magnetic field-type position detection device, when an electrically conductive device or the like is used in a device other than the magnetic field generating device for position detection or a surgical instrument, the detection result may have an error, or it may be difficult to perform position detection. Furthermore, in the magnetic field-type position detection device, similarly to the optical position detection device, position detection may no longer be possible when a physical shield is present between the magnetic field generating device and the magnetic sensor.
- According to the present disclosure, there is provided a surgical information processing apparatus, including circuitry that obtains position information of a surgical imaging device, the position information indicating displacement of the surgical imaging device from a predetermined position; in a registration mode, obtains first image information from the surgical imaging device regarding a position of a surgical component; determines the position of the surgical component based on the first image information and the position information; and, in an imaging mode, obtains second image information of the surgical component from the surgical imaging device based on the determined position.
- Further, according to the present disclosure, there is provided a surgical information processing method implemented using circuitry, including the steps of: obtaining first position information of a surgical imaging device, the first position information indicating displacement of the surgical imaging device from a predetermined position; generating second position information of a surgical component with respect to the surgical imaging device based on first image information obtained in a registration mode from the surgical imaging device; determining the position of the surgical component with respect to the predetermined position based on the first position information and the second position information; and, in an imaging mode, obtaining second image information of the surgical component from the surgical imaging device based on the determined position.
- Further, according to the present disclosure, there is provided a non-transitory computer readable medium having stored therein a program that, when executed by a computer including circuitry, causes the computer to implement a surgical information processing method including the steps of: obtaining first position information of a surgical imaging device, the first position information indicating displacement of the surgical imaging device from a predetermined position; generating second position information of a surgical component with respect to the surgical imaging device based on first image information obtained in a registration mode from the surgical imaging device; determining the position of the surgical component with respect to the predetermined position based on the first position information and the second position information; and, in an imaging mode, obtaining second image information of the surgical component from the surgical imaging device based on the determined position.
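- Across these three aspects, the determination step can be read as composing two pieces of information: the displacement of the imaging device from the predetermined position (first position information, available from the arm posture) and the component's position as seen by the imaging device (second position information, from the registration-mode image). A minimal sketch of that composition, assuming numpy and a 4x4 homogeneous-transform convention; this is an illustration, not the claimed implementation.

```python
import numpy as np

def determine_component_position(T_pred_from_cam, p_component_cam):
    # T_pred_from_cam: 4x4 transform from the imaging device's frame to the
    # predetermined position's frame (first position information).
    # p_component_cam: the surgical component's 3D position in the imaging
    # device's frame (second position information).
    return (T_pred_from_cam @ np.append(p_component_cam, 1.0))[:3]
```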
- As described above, according to an embodiment of the present disclosure, a medical imaging apparatus and a surgical navigation system capable of calculating a predetermined position on the basis of information acquired by an imaging apparatus that images a patient, without using an additional sensor such as an optical sensor or a magnetic sensor, can be obtained. Note that the effects described above are not necessarily limitative. With or in the place of the above effects, there may be achieved any one of the effects described in this specification or other effects that may be grasped from this specification.
- FIG. 1 is an illustration diagram for describing a rough configuration of a surgical navigation system including an imaging apparatus.
- FIG. 2 is an illustration diagram showing an example of the configuration of the imaging apparatus.
- FIG. 3 is a block diagram showing an example of the system configuration of the surgical navigation system including the imaging apparatus.
- FIG. 4 is a block diagram showing the functional configuration of a position computation unit of the imaging apparatus.
- FIG. 5 is an illustration diagram showing an example of the use of the surgical navigation system including the imaging apparatus.
- FIG. 6 is an illustration diagram showing a situation of an operation for which a surgical navigation system according to a first embodiment of the present disclosure can be used.
- FIG. 7 is a flow chart showing the processing of grasping a surgical field of the surgical navigation system according to the embodiment.
- FIG. 8 is a flow chart showing the processing of grasping a surgical field of the surgical navigation system according to the embodiment.
- FIG. 9 is a flow chart showing the registration processing of the surgical navigation system according to the embodiment.
- FIG. 10 is a flow chart showing the automatic registration processing of the surgical navigation system according to the embodiment.
- FIG. 11 is a flow chart showing the processing of detecting the position of the tip of a surgical instrument of the surgical navigation system according to the embodiment.
- FIG. 12 is a flow chart showing the processing of detecting the position of the tip of a surgical instrument of the surgical navigation system according to the embodiment.
- FIG. 13 is an illustration diagram showing an example of the configuration of an imaging apparatus according to a second embodiment of the present disclosure.
- FIG. 14 is an illustration diagram showing a situation of an operation for which a surgical navigation system according to the embodiment can be used.
- FIG. 15 is a flow chart showing the registration processing of the surgical navigation system according to the embodiment.
- FIG. 16 is a flow chart showing the processing of detecting the position of the tip of a surgical instrument of the surgical navigation system according to the embodiment.
- FIG. 17 is a flow chart showing the processing of examining the positional shift of a stereo camera by the imaging apparatus according to the embodiment.
- FIG. 18 is a flow chart showing the recalibration processing by the imaging apparatus according to the embodiment.
- Hereinafter, (a) preferred embodiment(s) of the present disclosure will be described in detail with reference to the appended drawings. In this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
- The description is given in the following order.
- 1. Basic configuration of the surgical navigation system
- 1-1. Examples of the configuration of the surgical navigation system
- 1-2. Examples of the system configuration of the surgical navigation system
- 1-3. Examples of the use of the surgical navigation system
- 2. First embodiment (an example using a bed-mounted arm)
- 2-1. Overview of the surgical navigation system
- 2-2. Control processing
- 2-3. Conclusions
- 3. Second embodiment (an example using an arm movable cart)
- 3-1. Overview of the surgical navigation system
- 3-2. Control processing
- 3-3. Conclusions
- In the following description, “the user” refers to any medical staff member who uses the imaging apparatus or the surgical navigation system, such as an operator or an assistant.
- First, the basic configuration, common to the embodiments described later, of an imaging apparatus to which the technology according to the present disclosure can be applied and of a surgical navigation system including the imaging apparatus is described.
- <1-1. Examples of the Configuration of the Surgical Navigation System>
- FIG. 1 is an illustration diagram for describing a rough configuration of a surgical navigation system. FIG. 2 is an illustration diagram showing an example of the configuration of an imaging apparatus 10 . The surgical navigation system includes an imaging apparatus 10 that images an object to be observed (a surgical site of a patient 1 ) and a navigation apparatus 50 that performs the navigation of an operation using a surgical field image captured by the imaging apparatus 10 . The surgical navigation system is a system for assisting an operator so that an operation is advanced in accordance with a prior plan. An image in which a preoperative image or a 3D model of the surgical site that is prepared in advance and includes the information of the position of incision, the position of an affected part, a treatment procedure, etc. is superimposed on a surgical field image captured by the imaging apparatus 10 may be displayed on a display device 54 of the navigation apparatus 50 .
- (1-1-1. Imaging Apparatus)
- The imaging apparatus 10 includes a microscope unit 14 for imaging the surgical site of the patient 1 and an arm unit 30 that supports the microscope unit 14 . The microscope unit 14 corresponds to a camera in the technology of an embodiment of the present disclosure, and is composed of an imaging unit (not illustrated) provided in a cylindrical unit 3111 in a substantially circular cylindrical shape and a manipulation unit (hereinafter, occasionally referred to as a "camera manipulation interface") 12 provided in a partial area of the outer periphery of the cylindrical unit 3111 . The microscope unit 14 is an electronic imaging microscope unit (what is called a video microscope unit) that electronically acquires a captured image with the imaging unit.
cylindrical unit 3111. The light from the object to be observed (hereinafter, occasionally referred to as observation light) passes through the cover glass, and is incident on the imaging unit in thecylindrical unit 3111. A light source formed of, for example, a light emitting diode (LED) or the like may be provided in thecylindrical unit 3111, and at the time of imaging, light may be applied from the light source to the object to be observed via the cover glass. - The imaging unit is composed of an optical system that collects observation light and an imaging element that receives the observation light collected by the optical system. The optical system is configured such that a plurality of lenses including a zoom lens and a focus lens are combined, and the optical characteristics thereof are adjusted so as to cause observation light to form an image on the light receiving surface of the imaging element. The imaging element receives and photoelectrically converts observation light, and thereby generates a signal corresponding to the observation light, that is, an image signal corresponding to the observed image. As the imaging element, for example, an imaging element having the Bayer arrangement to allow color photographing is used. The imaging element may be any of various known imaging elements such as a complementary metal oxide semiconductor (CMOS) image sensor and a charge-coupled device (CCD) image sensor.
- The image signal generated by the imaging element is transmitted as raw data to a not-illustrated
control device 100. Here, the transmission of the image signal may preferably be performed by optical communication. This is because in the surgical place the operator performs an operation while observing the condition of an affected part using a captured image, and therefore for a safer and more reliable operation it is required that moving images of the surgical site be displayed in real time to the extent possible. By the image signal being transmitted by optical communication, the captured image can be displayed with low latency. - The imaging unit may include a driving mechanism that moves the zoom lens and the focus lens of the optical system along the optical axis. By the zoom lens and the focus lens being moved as appropriate by the driving mechanism, the magnification of the captured image and the focal distance at the time of imaging can be adjusted. In the imaging unit, also various functions that may be generally provided in an electronic imaging microscope unit, such as an auto-exposure (AE) function and an auto-focus (AF) function, may be mounted.
- The imaging unit may be configured as what is called a single-chip imaging unit including one imaging element, or may be configured as what is called a multi-chip imaging unit including a plurality of imaging elements. In the case where the imaging unit is configured as a multi-chip type, for example, an image signal corresponding to each of RGB may be generated by each imaging element, and the image signals thus generated may be synthesized to obtain a color image. Alternatively, the imaging unit may be configured so as to include a pair of imaging elements for acquiring image signals for the right eye and the left eye, respectively, corresponding to stereoscopic vision (3D display). In this case, the
microscope unit 14 is configured as a stereo camera. By 3D display being performed, the operator can grasp the depth of the surgical site more accurately. Theimaging apparatus 10 of each embodiment according to the present disclosure includes a stereo camera as themicroscope unit 14. In the case where the imaging unit is configured as a multi-chip type, a plurality of optical systems may be provided to correspond to the imaging elements. - The
- The camera manipulation interface 12 is formed of, for example, a cross lever, a switch, or the like, and is an input section that receives the manipulation input of the user. For example, the user may input, via the camera manipulation interface 12 , instructions to alter the magnification of the observed image and the focal distance to the object to be observed. The driving mechanism of the imaging unit may move the zoom lens and the focus lens as appropriate in accordance with the instructions, and thereby the magnification and the focal distance can be adjusted. Furthermore, for example, the user may input, via the camera manipulation interface 12 , an instruction to switch the operating mode of the arm unit 30 (an all-free mode and a fixed mode described later).
microscope unit 14, the user may move themicroscope unit 14 in a state of grasping thecylindrical unit 3111 by gripping it. In this case, in order that thecamera manipulation interface 12 can be manipulated even while the user moves thecylindrical unit 3111, thecamera manipulation interface 12 may be provided in a position where the user can easily manipulate it with the finger in the state of gripping thecylindrical unit 3111. Alternatively, the user may manipulate an input device (hereinafter, occasionally referred to as an “arm manipulation interface”) to control the posture of thearm unit 30 to move themicroscope unit 14. - The
arm unit 30 is configured by a plurality of links (afirst link 3123 a to a sixth link 31230 being linked together in a rotationally movable manner relative to each other by a plurality of joint units (a firstjoint unit 3121 a to a sixth joint unit 31210. - The first
joint unit 3121 a has a substantially circular columnar shape, and supports, at its tip (its lower end), the upper end of thecylindrical unit 3111 of themicroscope unit 14 in a rotationally movable manner around a rotation axis (a first axis O1) parallel to the center axis of thecylindrical unit 3111. Here, the firstjoint unit 3121 a may be configured such that the first axis O1 coincides with the optical axis of the imaging unit of themicroscope unit 14. Thereby, themicroscope unit 14 can be rotationally moved around the first axis O1, and thus the visual field can be altered so as to rotate the captured image. - The
first link 3123 a fixedly supports, at its tip, the firstjoint unit 3121 a. Specifically, thefirst link 3123 a is a bar-like member having a substantially L-shaped configuration, and is connected to the firstjoint unit 3121 a in such a manner that one side on the tip side of thefirst link 3123 a extends in a direction orthogonal to the first axis O1 and the end of the one side is in contact with an upper end portion of the outer periphery of the firstjoint unit 3121 a. The secondjoint unit 3121 b is connected to the end of the other side on the root end side of the substantially L-shaped configuration of thefirst link 3123 a. - The second
joint unit 3121 b has a substantially circular columnar shape, and supports, at its tip, the root end of thefirst link 3123 a in a rotationally movable manner around a rotation axis (a second axis O2) orthogonal to the first axis O1. The tip of thesecond link 3123 b is fixedly connected to the root end of the secondjoint unit 3121 b. - The
second link 3123 b is a bar-like member having a substantially L-shaped configuration, and one side on its tip side extends in a direction orthogonal to the second axis O2 and the end of the one side is fixedly connected to the root end of the secondjoint unit 3121 b. The thirdjoint unit 3121 c is connected to the other side on the root end side of the substantially L-shaped configuration of thesecond link 3123 b. - The third
joint unit 3121 c has a substantially circular columnar shape, and supports, at its tip, the root end of thesecond link 3123 b in a rotationally movable manner around a rotation axis (a third axis O3) orthogonal to both of the first axis O1 and the second axis O2. The tip of thethird link 3123 c is fixedly connected to the root end of the thirdjoint unit 3121 c. By rotationally moving the formation on the tip side including themicroscope unit 14 around the second axis O2 and the third axis O3, themicroscope unit 14 can be moved so that the position of themicroscope unit 14 in the horizontal plane is altered. In other words, by controlling the rotation around the second axis O2 and the third axis O3, the visual field of the captured image can be moved in the plane. - The
third link 3123 c is configured such that the tip side has a substantially circular columnar shape, and the root end of the thirdjoint unit 3121 c is fixedly connected to the tip of the circular columnar shape in such a manner that both have substantially the same center axis. The root end side of thethird link 3123 c has a prismatic shape, and the fourthjoint unit 3121 d is connected to the end on the root end side. - The fourth
joint unit 3121 d has a substantially circular columnar shape, and supports, at its tip, the root end of thethird link 3123 c in a rotationally movable manner around a rotation axis (a fourth axis O4) orthogonal to the third axis O3. The tip of thefourth link 3123 d is fixedly connected to the root end of the fourthjoint unit 3121 d. - The
fourth link 3123 d is a bar-like member extending substantially in a straight line, and extends orthogonally to the fourth axis O4 and is fixedly connected to the fourthjoint unit 3121 d in such a manner that the end of the tip of thefourth link 3123 d is in contact with a side surface of the substantially circular columnar shape of the fourthjoint unit 3121 d. The fifthjoint unit 3121 e is connected to the root end of thefourth link 3123 d. - The fifth
joint unit 3121 e has a substantially circular columnar shape, and supports, on its tip side, the root end of thefourth link 3123 d in a rotationally movable manner around a rotation axis (a fifth axis O5) parallel to the fourth axis O4. The tip of thefifth link 3123 e is fixedly connected to the root end of the fifthjoint unit 3121 e. The fourth axis O4 and the fifth axis O5 are rotation axes that allow themicroscope unit 14 to move in the vertical direction. By rotationally moving the formation on the tip side including themicroscope unit 14 around the fourth axis O4 and the fifth axis O5, the height of themicroscope unit 14, that is, the distance between themicroscope unit 14 and the object to be observed can be adjusted. - The
fifth link 3123 e is configured such that a first member having a substantially L-shaped configuration in which one side extends in the vertical direction and the other side extends in the horizontal direction and a second member in a bar-like shape that extends downward in the vertical direction from the portion extending in the horizontal direction of the first member are combined. The root end of the fifthjoint unit 3121 e is fixedly connected to the vicinity of the upper end of the portion extending in the vertical direction of the first member of thefifth link 3123 e. The sixthjoint unit 3121 f is connected to the root end (the lower end) of the second member of thefifth link 3123 e. - The sixth
joint unit 3121 f has a substantially circular columnar shape, and supports, on its tip side, the root end of thefifth link 3123 e in a rotationally movable manner around a rotation axis (a sixth axis O6) parallel to the vertical direction. The tip of thesixth link 3123 f is fixedly connected to the root end of the sixthjoint unit 3121 f. - The
sixth link 3123 f is a bar-like member extending in the vertical direction, and its root end is fixedly connected to the upper surface of abed 40. - The range in which the first
joint unit 3121 a to the sixthjoint unit 3121 f can rotate is appropriately set so that themicroscope unit 14 can make desired movements. Thereby, in thearm unit 30 having the configuration described above, movements with 3 degrees of freedom of translation and 3 degrees of freedom of rotation, i.e. a total of 6 degrees of freedom, can be achieved for the movement of themicroscope unit 14. By thus configuring thearm unit 30 so that 6 degrees of freedom are achieved for the movement of themicroscope unit 14, the position and posture of themicroscope unit 14 can be freely controlled in the range in which thearm unit 30 can move. Therefore, the surgical site can be observed from any angle, and the operation can be executed more smoothly. - The illustrated configuration of the
arm unit 30 is only an example, and the number and shape (length) of links and the number, arrangement position, direction of the rotation axis, etc. of joint units that constitute thearm unit 30 may be appropriately designed so that desired degrees of freedom can be achieved. For example, although as described above it is preferable that thearm unit 30 be configured to have 6 degrees of freedom in order to freely move themicroscope unit 14, thearm unit 30 may be configured to have larger degrees of freedom (that is, redundant degrees of freedom). In the case where there are redundant degrees of freedom, the posture of thearm unit 30 can be altered in a state where the position and posture of themicroscope unit 14 are fixed. Thus, control with higher convenience for the operator can be achieved, such as controlling the posture of thearm unit 30 so that thearm unit 30 does not interfere with the visual field of the operator who views thedisplay device 54 of thenavigation apparatus 50. - Here, the first
joint unit 3121 a to the sixthjoint unit 3121 f may be provided with a driving mechanism such as a motor and an actuator equipped with an encoder or the like that detects the rotation angle in each joint unit. The driving of each actuator provided in the firstjoint unit 3121 a to the sixthjoint unit 3121 f may be controlled as appropriate by thecontrol device 100, and thereby the posture of thearm unit 30, that is, the position and posture of themicroscope unit 14 can be controlled. The value detected by the encoder provided in each joint unit may be used as posture information concerning the posture of thearm unit 30. - Further, the first
joint unit 3121 a to the sixthjoint unit 3121 f may be provided with a brake that restricts the rotation of the joint unit. The operation of the brake may be controlled by thecontrol device 100. For example, when it is intended to fix the position and posture of themicroscope unit 14, thecontrol device 100 puts the brake of each joint unit into operation. Thereby, the posture of thearm unit 30, that is, the position and posture of themicroscope unit 14 can be fixed without driving the actuator, and therefore the power consumption can be reduced. When it is intended to move the position and posture of themicroscope unit 14, thecontrol device 100 may release the brake of each joint unit, and may drive the actuator in accordance with a predetermined control system. - Such an operation of the brake may be performed in accordance with the manipulation input by the user via the
camera manipulation interface 12 described above. When the user intends to move the position and posture of themicroscope unit 14, the user manipulates thecamera manipulation interface 12 to release the brake of each joint unit. Thereby, the operating mode of thearm unit 30 transitions to a mode in which the rotation in each joint unit can be freely made (an all-free mode). Further, when the user intends to fix the position and posture of themicroscope unit 14, the user manipulates thecamera manipulation interface 12 to put the brake in each joint unit into operation. Thereby, the operating mode of thearm unit 30 transitions to a mode in which the rotation in each joint unit is restricted (a fixed mode). - The
control device 100 puts the actuator of the firstjoint unit 3121 a to the sixthjoint unit 3121 f into operation in accordance with a predetermined control system, and thereby controls the driving of thearm unit 30. Further, for example, thecontrol device 100 controls the operation of the brake of the firstjoint unit 3121 a to the sixthjoint unit 3121 f, and thereby alters the operating mode of thearm unit 30. - Further, the
control device 100 outputs an image signal acquired by the imaging unit of themicroscope unit 14 of theimaging apparatus 10 to thenavigation apparatus 50. At this time, thecontrol device 100 outputs also the information of the position of the surgical site of thepatient 1 and the position of a surgical instrument to thenavigation apparatus 50. - (1-1-2. Navigation Apparatus)
- The
navigation apparatus 50 includes anavigation manipulation interface 52 through which the manipulation input of thenavigation apparatus 50 is performed by the user, thedisplay device 54, amemory device 56, and anavigation control device 60. Thenavigation control device 60 performs various signal processings on an image signal acquired from theimaging apparatus 10 to produce 3D image information for display, and causes thedisplay device 54 to display the 3D image information. In the signal processings, various known signal processings such as development processing (demosaic processing), image quality improvement processing (range enhancement processing, super-resolution processing, noise reduction (NR) processing, camera shake compensation processing, and/or the like), and/or magnification processing (i.e. electronic zoom processing) may be performed. - The
- The navigation apparatus 50 is provided in the operating room, and displays an image corresponding to the 3D image information produced by the navigation control device 60 on the display device 54, on the basis of a control command from the navigation control device 60. The navigation control device 60 corresponds to a navigation control unit in the technology of an embodiment of the present disclosure. On the display device 54, an image of the surgical site photographed by the microscope unit 14 may be displayed. The navigation apparatus 50 may cause the display device 54 to display, in place of or together with an image of the surgical site, various pieces of information concerning the operation, such as information on the body of the patient 1 and/or information regarding the surgical technique. In this case, the display of the display device 54 may be switched as appropriate by the user's manipulation. Alternatively, a plurality of display devices 54 may be provided, and an image of the surgical site and the various pieces of information concerning the operation may be displayed individually on the plurality of display devices 54. As the display device 54, various known display devices such as a liquid crystal display device or an electro-luminescence (EL) display device may be used.
- In the memory device 56, for example, a preoperative image or a 3D model of the surgical site of the patient 1, whose relative relationship with a predetermined reference position in three-dimensional space is found in advance, is stored. For example, prior to the operation, a preoperative image or a 3D model of the surgical site is produced on the basis of an MRI image or the like of a part including the surgical site of the patient 1. Then, information for assisting the operation, such as the position of incision, the position of an affected part, and the position of excision, may be superimposed on the preoperative image or the 3D model, or on an image of the contours or the like of the surgical site of the patient 1 obtained from them, and the resulting image may be stored in the memory device 56. The navigation control device 60 superimposes at least one preoperative image or 3D model on 3D image information captured by the microscope unit 14 to produce 3D image information for display, and causes the display device 54 to display it. The memory device 56 may be provided in the navigation apparatus 50, or may be provided in a server connected via a network or the like.
- <1-2. Examples of the System Configuration of the Surgical Navigation System>
- FIG. 3 is a block diagram showing an example of the system configuration of the surgical navigation system. FIG. 4 is a block diagram showing the functional configuration of a position computation unit 110 of the control device 100. The imaging apparatus 10 includes the camera manipulation interface 12, the microscope unit 14, an encoder 16, a motor 18, an arm manipulation interface 20, and the control device 100. Of these, the encoder 16 and the motor 18 are mounted on the actuators provided in the joint units of the arm unit 30. The navigation apparatus 50 includes the navigation manipulation interface 52, the display device 54, the memory device 56, and the navigation control device 60.
- The control device 100 may be a processor such as a central processing unit (CPU) or a graphics processing unit (GPU), or a microcomputer, a control board, or the like in which a processor and a memory element such as a memory are combined. The processor of the control device 100 operates in accordance with a predetermined program, whereby the various functions described above can be achieved. Although in the illustrated example the control device 100 is provided as a device separate from the imaging apparatus 10, the control device 100 may be installed in the imaging apparatus 10 and configured integrally with it. Alternatively, the control device 100 may be composed of a plurality of devices. For example, a microcomputer, a control board, or the like may be provided in each of the microscope unit 14 and the first joint unit 3121a to the sixth joint unit 3121f of the arm unit 30, and these may be communicably connected to each other; thereby, a function similar to that of the control device 100 can be achieved.
- Similarly, the navigation control device 60 may be a processor such as a CPU or a GPU, or a microcomputer, a control board, or the like in which a processor and a memory element such as a memory are combined. The processor of the navigation control device 60 operates in accordance with a predetermined program, whereby the various functions described above can be achieved. Although in the illustrated example the navigation control device 60 is provided as a device separate from the navigation apparatus 50, the navigation control device 60 may be installed in the navigation apparatus 50 and configured integrally with it. Alternatively, the navigation control device 60 may be composed of a plurality of devices.
- The communication between the control device 100 and the microscope unit 14, and the communication between the control device 100 and the first joint unit 3121a to the sixth joint unit 3121f, may be wired or wireless. The same applies to the communication between the navigation control device 60 and the navigation manipulation interface 52, the display device 54, and the memory device 56. In the case of wired communication, communication by electrical signals or optical communication may be performed, and the transmission cable may accordingly be configured as an electrical signal cable, an optical fiber, or a composite cable of these. In the case of wireless communication, on the other hand, it is not necessary to lay transmission cables in the operating room, so a situation in which the movements of medical staff members are hindered by such cables can be avoided.
- The control device 100 of the imaging apparatus 10 includes a position computation unit 110 and an arm posture control unit 120. The position computation unit 110 calculates a predetermined position on the basis of information acquired from the microscope unit 14 and information acquired from the encoder 16, and transmits the calculation result to the navigation control device 60. A design in which the calculation result obtained by the position computation unit 110 is readable by the arm posture control unit 120 is also possible. Further, the position computation unit 110 outputs image information based on an image signal acquired by the microscope unit 14 to the navigation control device 60. In this respect, the position computation unit 110 also corresponds to an output unit that outputs image information produced from an image signal acquired by the microscope unit 14.
- As shown in FIG. 4, the position computation unit 110 includes an arm posture information detection unit 112, a camera information detection unit 114, and a position calculation unit 116. The arm posture information detection unit 112 grasps the current posture of the arm unit 30 and the current position and posture of the microscope unit 14 on the basis of information concerning the rotation angle of each joint unit detected by the encoder 16. The camera information detection unit 114 acquires image information concerning an image captured by the microscope unit 14; the acquired image information may also include the information of the focal distance and magnification of the microscope unit 14. The focal distance of the microscope unit 14 may be outputted in the form of, for example, the distance from the rotation axis of the second joint unit 3121b that supports the microscope unit 14 in the arm unit 30 to the surgical site of the patient 1. The processing executed by the position computation unit 110 will be described in detail in the later embodiments.
- Returning to FIG. 3, the arm posture control unit 120 drives the motor 18 provided in each joint unit of the arm unit 30 on the basis of a control command from the navigation control device 60, and thus controls the arm unit 30 to a predetermined posture. Thereby, for example, the surgical site of the patient 1 can be imaged from a desired angle by the microscope unit 14. The arm posture control unit 120 may control each motor 18 on the basis of the calculation result of the position computation unit 110.
- Specifically, using the posture information of the arm unit 30 detected by the position computation unit 110, the arm posture control unit 120 calculates a control value for each joint unit (for example, the rotation angle, the torque to be generated, etc.) that achieves a movement of the microscope unit 14 in accordance with the manipulation input from the user or the control command from the navigation control device 60, and drives the motor 18 of each joint unit in accordance with the calculated control value. The control system applied to the arm unit 30 by the arm posture control unit 120 is not limited, and various known control systems such as force control or position control may be employed.
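- For instance, if position control is employed, one control cycle might look like the sketch below: a proportional-derivative law that turns the gap between the target joint angles (those that realize the commanded movement of the microscope unit 14) and the encoder readings into torque commands for each motor 18. The gains and the absence of gravity compensation are simplifications for illustration.

    import numpy as np

    def joint_control_values(q_target, q_measured, q_velocity, kp=50.0, kd=5.0):
        # One PD control cycle over the six joint units; returns torque
        # commands [Nm]. kp and kd are illustrative, untuned gains.
        error = np.asarray(q_target) - np.asarray(q_measured)
        return kp * error - kd * np.asarray(q_velocity)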
- For example, the operator may perform manipulation input as appropriate via the arm manipulation interface 20 (not illustrated); the driving of the arm unit 30 can then be controlled by the arm posture control unit 120 in accordance with that input, and the position and posture of the microscope unit 14 can be controlled. By this control, the microscope unit 14 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the position after the movement. As the arm manipulation interface 20, in view of the operator's convenience, an interface that can be manipulated even while the operator holds a surgical instrument, such as a foot switch, is preferably used. The manipulation input may also be performed in a non-contact manner based on gesture tracking or eye-gaze tracking using a wearable device or a camera provided in the operating room; thereby, even a user in a clean area can manipulate a device in an unclean area with a higher degree of freedom. Alternatively, the arm unit 30 may be manipulated by what is called a master-slave system; in this case, the arm unit 30 may be remotely manipulated by the user via the arm manipulation interface 20 installed in a place distant from the operating room.
- Further, in the case where force control is employed, what is called power-assisted control may be performed, in which an external force from the user is received and the motor 18 of the first joint unit 3121a to the sixth joint unit 3121f is driven so that the arm unit 30 moves smoothly in accordance with that external force. Thus, when the user grasps the microscope unit 14 to move it directly, the user can move it with a relatively small force. The microscope unit 14 can therefore be moved more intuitively by a simpler manipulation, and the convenience of the user can be improved.
- Further, the driving of the arm unit 30 may be controlled so that the arm unit 30 performs a pivot operation. Here, the pivot operation is an operation of moving the microscope unit 14 so that the optical axis of the microscope unit 14 is oriented toward a predetermined point in space (hereinafter referred to as a pivot point) at all times. The pivot operation allows the same observation position to be observed from various directions, and therefore more detailed observation of an affected part becomes possible. In the case where the microscope unit 14 is configured such that its focal distance is unadjustable, it is preferable that the pivot operation be performed with the distance between the microscope unit 14 and the pivot point kept fixed. In this case, that distance may be adjusted to the fixed focal distance of the microscope unit 14. The microscope unit 14 then moves on a hemispherical surface having a radius corresponding to the focal distance, with the pivot point as the center (schematically illustrated in FIG. 1 and FIG. 2), and a clear captured image is obtained even when the observation direction is altered.
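- Geometrically, a pivot operation with a fixed focal distance keeps the camera on a hemisphere around the pivot point while the optical axis always passes through that point. A minimal parameterization follows; the angle conventions are an assumption made for the sketch.

    import numpy as np

    def pivot_pose(pivot, radius, azimuth, elevation):
        # pivot: 3D pivot point; radius: e.g. the fixed focal distance.
        # Returns the camera position on the hemisphere and the unit
        # optical-axis direction, which points at the pivot at all times.
        u = np.array([np.cos(elevation) * np.cos(azimuth),
                      np.cos(elevation) * np.sin(azimuth),
                      np.sin(elevation)])  # unit vector from pivot to camera
        position = np.asarray(pivot, dtype=float) + radius * u
        return position, -u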
- On the other hand, in the case where the microscope unit 14 is configured such that its focal distance is adjustable, the pivot operation may be performed with the distance between the microscope unit 14 and the pivot point variable. In this case, for example, the control device 100 may calculate the distance between the microscope unit 14 and the pivot point on the basis of information concerning the rotation angle of each joint unit detected by the encoder, and may adjust the focal distance of the microscope unit 14 automatically on the basis of the calculation result. Alternatively, in the case where the microscope unit 14 is provided with an AF function, the focal distance may be adjusted automatically by the AF function every time the distance between the microscope unit 14 and the pivot point changes due to the pivot operation.
- <1-3. Examples of the Use of the Surgical Navigation System>
- FIG. 5 is a diagram showing an example of the use of the surgical navigation system shown in FIG. 1. In FIG. 5, a situation is schematically shown in which, using the surgical navigation system, an operator 3401 performs an operation on the patient 1 lying on the bed 40, which serves as a support base that supports the patient 1. In FIG. 5, the surgical navigation system is simplified for ease of understanding.
- As shown in FIG. 5, during the operation, a surgical field image photographed by the imaging apparatus 10 is displayed magnified on the display device 54. The display device 54 is installed in a position easily viewable by the operator 3401, and the operator 3401 performs various treatments, such as the excision of an affected part, on the surgical site while observing its condition in the video image shown on the display device 54. The surgical instrument used may be, for example, an instrument equipped with a pair of forceps, a grasper, or the like at its tip, or any of various other surgical instruments such as an electric scalpel or an ultrasonic scalpel.
- During the operation, an image in which the surgical field image captured by the imaging apparatus 10 is superimposed on a preoperative image or a 3D model is displayed on the display device 54. The operator 3401 performs various treatments, such as the excision of an affected part, in accordance with the navigation display on the display device 54 while observing the condition of the surgical site in the video image. At this time, information such as the position of incision, the position of excision, and the position or posture of the tip of a surgical instrument may be displayed on the display device 54.
- Hereinabove, an overview of the surgical navigation system to which the technology according to the present disclosure can be applied has been described. Some specific embodiments of the technology according to the present disclosure will now be described. In each embodiment described below, an example is described in which a stereo camera 14A that enables 3D display is used as the microscope unit 14.
- <2-1. Overview of the Surgical Navigation System>
- In a surgical navigation system according to a first embodiment of the present disclosure, the arm unit 30 of the imaging apparatus 10 is fixed to the bed 40 (see FIG. 1). That is, the positional relationship between the fixed portion 32 of the arm unit 30, which is fixed to the bed 40, and the patient 1 can be kept fixed. Hence, the imaging apparatus 10 according to the embodiment is configured so as to calculate a predetermined position in a three-dimensional coordinate system in which the fixed portion 32 of the arm unit 30, or an arbitrary spatial position having a fixed relative positional relationship with the fixed portion 32, is taken as the origin (reference position) P0. The surgical navigation system according to the embodiment is an example of a system in which neither a reference marker for setting the position of the origin P0 of the three-dimensional coordinates nor a surgical instrument marker for identifying the position or posture of a surgical instrument is used.
- FIG. 6 is an illustration diagram showing a situation of an operation for which the surgical navigation system according to the embodiment can be used. The illustrated example shows a brain surgery: the patient 1 is supported on the bed 40 facing down, and the head is fixed by a fixing tool 42. As described above, neither a reference marker for setting the position of the origin P0 of the three-dimensional coordinates nor a surgical instrument marker for indicating the position or posture of a surgical instrument is used.
- <2-2. Control Processing>
- The control processing executed in the surgical navigation system according to the embodiment will now be described with reference to FIG. 3 and FIG. 4. As the control processing, the processing of grasping a surgical field, registration processing, and the processing of detecting the position of the tip of a surgical instrument are described.
- (2-2-1. Processing of Grasping a Surgical Field)
- First, an example of the processing of grasping the surgical field imaged by the stereo camera 14A is described. The processing of grasping a surgical field may be processing for sharing the in-focus position in the captured image obtained by the stereo camera 14A with the navigation apparatus 50. During the operation, since the focus is placed on the surgical site of the patient 1 automatically or by the user's manipulation, the in-focus position can be regarded as the position of the surgical site. The in-focus position can be grasped on the basis of the focal distance, the magnification, the angle of view, etc. of the stereo camera 14A.
- FIG. 7 is a flow chart of the processing of grasping a surgical field as executed by the control device 100 of the imaging apparatus 10. In step S102, in a state where the focus is placed on the head of the patient 1, the arm posture information detection unit 112 detects the posture information of the arm unit 30 on the basis of information concerning the rotation angle of each joint unit detected by the encoder 16 provided in each joint unit of the arm unit 30.
- Subsequently, in step S104, the camera information detection unit 114 acquires information outputted from the stereo camera 14A. The information outputted from the stereo camera 14A may include the information of the focal distance, the magnification, the angle of view, etc. of the stereo camera 14A (hereinafter occasionally referred to as "camera parameters"). The focal distance of the stereo camera 14A may be outputted in the form of, for example, the distance in the optical axis direction from the end rotation axis on the stereo camera 14A side of the arm unit 30 to the head of the patient 1. The focal distance, the magnification, the angle of view, etc. of the stereo camera 14A may be altered by manipulation input via the camera manipulation interface 12, and their set values may be detected by a potentiometer or the like provided in the lens portion of the stereo camera 14A.
- Subsequently, in step S106, on the basis of the posture information of the arm unit 30 and the information of the focal distance of the stereo camera 14A, the position calculation unit 116 calculates the relative position of the head of the patient 1 with respect to a predetermined reference position whose own position does not change even when the posture of the arm unit 30 changes. For example, the position calculation unit 116 may calculate the relative three-dimensional coordinates of the head of the patient 1 in a coordinate system (an xyz three-dimensional coordinate system) in which an arbitrary position in the fixed portion 32 of the arm unit 30 fixed to the bed 40 is taken as the origin P0. The origin P0 may also be an arbitrary position having a fixed relative positional relationship with the fixed portion 32 of the arm unit 30.
- Subsequently, in step S108, the position calculation unit 116 transmits the calculated relative three-dimensional coordinates of the head of the patient 1 to the navigation control device 60. The position calculation unit 116 performs steps S102 to S108 whenever at least the posture of the arm unit 30 or any one of the focal distance, the magnification, the angle of view, etc. of the stereo camera 14A is altered. Alternatively, steps S102 to S108 may be performed repeatedly at a predetermined time interval that is set in advance.
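- Steps S102 to S106 can be read as a single coordinate transformation: the in-focus point lies on the optical axis at the focal distance, and the arm posture determines where that axis sits relative to the origin P0. A minimal sketch, reusing the forward-kinematics idea from earlier (the optical axis is assumed to be the camera's local z axis):

    import numpy as np

    def surgical_site_coordinates(camera_pose, focal_distance):
        # camera_pose: 4x4 pose of the stereo camera 14A in the P0 frame,
        # e.g. computed from the arm posture information (step S102).
        # focal_distance: taken from the camera parameters (step S104).
        p_camera = np.array([0.0, 0.0, focal_distance, 1.0])
        p_origin = camera_pose @ p_camera  # step S106: express in the P0 frame
        return p_origin[:3]                # relative coordinates of the head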
- FIG. 8 is a flow chart of the processing of grasping a surgical field as executed by the navigation control device 60 of the navigation apparatus 50. In step S112, the navigation control device 60 acquires the relative position of the head of the patient 1 from the control device 100 of the imaging apparatus 10. Subsequently, in step S114, the navigation control device 60 calls up, from the memory device 56, at least one of a 3D model and a preoperative image of the head of the patient 1 whose relative positional relationship with the origin P0 is found in advance, and superimposes on it the relative position of the head of the patient 1 transmitted from the position computation unit 110, to produce 3D image information for display. Subsequently, in step S116, the navigation control device 60 outputs the produced 3D image information to the display device 54 and causes the display device 54 to display the image.
- The navigation control device 60 may perform steps S112 to S116 repeatedly whenever the relative position of the head of the patient 1 transmitted from the control device 100 is altered, or at a predetermined time interval that is set in advance. The manner of superimposition in the displayed captured image may be designed to be alterable by manipulating the navigation manipulation interface 52.
- In order to adjust the surgical field, the user may manipulate the navigation manipulation interface 52 to transmit a control command for the arm unit 30 to the arm posture control unit 120 via the navigation control device 60. Alternatively, a design is possible in which the navigation control device 60 itself can transmit a control command for the arm unit 30 to the arm posture control unit 120 on the basis of predetermined arithmetic processing. The arm posture control unit 120 resolves the control command for the arm unit 30 into the operation of each joint unit, and outputs the resolved command to the motor 18 of each joint unit as an instruction value of the rotation angle and/or the amount of movement. The arm unit 30 may also be manipulated directly via the arm manipulation interface 20 by the user, without going through the navigation control device 60.
- (2-2-2. Registration Processing)
- Next, an example of the processing of registration between the head of the patient 1 in the captured image and reference points present in a preoperative image, a 3D model, or the like is described. In the registration processing, the head of the patient 1 in the captured image acquired by the stereo camera 14A, a preoperative image or a 3D model produced from an MRI image or the like photographed prior to the operation, and the reference points are registered with one another.
- FIG. 9 shows a flow chart of the registration processing. First, in step S122, the camera information detection unit 114 of the position computation unit 110 of the control device 100 acquires 3D image information outputted from the stereo camera 14A. Here, the head of the patient 1 is photographed by the stereo camera 14A. Subsequently, in step S124, the position calculation unit 116 estimates the depth value of each pixel by the stereo matching method, on the basis of the captured images produced from the 3D image information acquired by the stereo camera 14A and the camera parameters. The depth values may be estimated by utilizing known technology.
- Subsequently, in step S126, the position calculation unit 116 computes the shape change (undulation) around the obtained depth values, and extracts an arbitrary number of feature points with a large undulation. The number of feature points may be three or more, for example. Subsequently, in step S128, the position calculation unit 116 calculates the relative three-dimensional coordinates of the extracted feature points. At this time, the values of the encoder 16 of each joint unit detected by the arm posture information detection unit 112 and the camera parameters of the stereo camera 14A are utilized to obtain the relative three-dimensional coordinates, with the fixed portion 32 of the arm unit 30 or the like as the reference position.
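- Steps S124 and S126 correspond, respectively, to the classic stereo relation Z = f * B / d (focal length in pixels, baseline, disparity) and to scoring the local shape change. The sketch below is one plausible reading; the window-variance score used for "undulation" is an assumption, and any similar local measure would serve.

    import numpy as np

    def depth_from_disparity(disparity, focal_px, baseline_m):
        # Step S124: per-pixel depth from stereo matching, Z = f * B / d.
        depth = np.zeros_like(disparity, dtype=float)
        valid = disparity > 0
        depth[valid] = focal_px * baseline_m / disparity[valid]
        return depth

    def undulation_features(depth, n_points=3, window=5):
        # Step S126 (illustrative): score local shape change as the depth
        # variance in a small window; keep the n highest-scoring pixels.
        h, w = depth.shape
        scored = []
        for y in range(window, h - window, window):
            for x in range(window, w - window, window):
                patch = depth[y - window:y + window + 1,
                              x - window:x + window + 1]
                scored.append((np.var(patch), (x, y)))
        scored.sort(key=lambda s: s[0], reverse=True)
        return [pt for _, pt in scored[:n_points]]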
- Subsequently, in step S130, the position calculation unit 116 transmits the 3D image information captured by the stereo camera 14A and the information of the relative three-dimensional coordinates of the feature points to the navigation control device 60. Thereby, the navigation control device 60 can compare and match the positions of the feature points against the positions of the corresponding reference points in the preoperative image or the 3D model, and the comparison result may be displayed on the display device 54. Viewing the displayed comparison result, the user adjusts the posture of the arm unit 30 so that the head of the patient 1 in the captured image and the preoperative image or the 3D model are registered.
- In the surgical navigation system according to the embodiment, the arm unit 30 equipped with the stereo camera 14A is fixed to the bed 40, and the positional relationship with the head of the patient 1 can be kept fixed; thus, once registration processing has been performed, it is not necessary to perform registration again during the operation. Furthermore, the surgical navigation system according to the embodiment finds the relative position with respect to the fixed portion 32 of the arm unit 30, which has a fixed positional relationship with the head of the patient 1, as the reference position; therefore, it is not necessary to find the absolute position of the head of the patient 1 in three-dimensional space, and a reference marker is not necessary.
- The posture of the arm unit 30 may also be adjusted by automatic correction control by the arm posture control unit 120, without relying on the user's manipulation. FIG. 10 is a flow chart of the automatic registration processing performed by the arm posture control unit 120. The position computation unit 110 of the control device 100 performs steps S122 to S130 in accordance with the flow chart shown in FIG. 9. In step S132, the arm posture control unit 120 of the control device 100 acquires, from the navigation control device 60, the result of the comparison between the positions of the feature points and the positions of the corresponding reference points in the preoperative image or the 3D model.
- Subsequently, in step S134, the arm posture control unit 120 assesses the error between the positions of the feature points and the positions of the reference points in the preoperative image or the 3D model. For example, the arm posture control unit 120 may determine whether or not the distance between the relative three-dimensional coordinate position of a feature point and that of the corresponding reference point falls below a previously set threshold. In the case where the assessment shows a large discrepancy between the positions of the feature points and the positions of the corresponding reference points (S134: No), the arm posture control unit 120 proceeds to step S136 and determines the pivot point to be used when moving the position of the stereo camera 14A. For example, the arm posture control unit 120 may calculate the position of a virtual center of the stereoscopically reconstructed head of the patient 1, and may take that position as the pivot point.
- Subsequently, in step S138, on the basis of the amount and direction of the discrepancy between the positions of the feature points and the positions of the reference points, the arm posture control unit 120 controls the motor 18 of each joint unit of the arm unit 30 to put the stereo camera 14A into a pivot operation about the pivot point, and then performs photographing with the stereo camera 14A. After that, the procedure returns to step S124, and the processing of steps S124 to S134 described above is performed repeatedly. Then, in the case where the assessment of the error in step S134 shows that there is no large discrepancy between the positions of the feature points and the positions of the corresponding reference points in the preoperative image or the 3D model (S134: Yes), the arm posture control unit 120 finishes the registration processing.
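- In outline, steps S124 to S138 form a feedback loop: assess the feature-to-reference error and, while it remains too large, pivot the camera and photograph again. The skeleton below fixes only that structure; every object and method name is a placeholder rather than an actual interface of the imaging apparatus 10.

    def automatic_registration(arm, camera, navigation, threshold=0.002):
        # threshold: allowable feature-to-reference distance (here meters).
        while True:
            features = camera.extract_feature_points()             # S124 to S128
            errors = navigation.compare_with_references(features)  # S132
            if max(errors) < threshold:                            # S134: Yes
                break                                              # registration done
            pivot = camera.estimate_head_center()                  # S136
            arm.pivot_camera(pivot, navigation.discrepancy())      # S138, re-shoot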
- When automatic registration processing by the arm posture control unit 120 is available, the position of the stereo camera 14A can be moved to an appropriate position, and the head of the patient 1 in the captured image and the preoperative image or the 3D model can thus be registered easily, without adjustment by the user. Also in the case where automatic registration processing is performed, in the surgical navigation system according to the embodiment, once registration processing has been performed, registration is not performed again during the operation.
- (2-2-3. Processing of Detecting the Position of a Surgical Instrument)
- Next, an example of the processing of detecting the position of the tip of a surgical instrument is described. During the operation, as shown for example in FIG. 6, there are cases where a probe 48, a surgical instrument dedicated to position detection, is put on the surface of the brain in an attempt to find the positional relationship between the position of the probe 48 and a reference point on a preoperative image or on a 3D model of the surgical site. Specifically, a situation may occur in which it is desired to find the position of the tip of a surgical instrument accurately when neither a microscope nor a video microscope is used as a camera, when a microscope or the like is used and yet a more accurate position is desired pin-pointedly, or when the tip of the surgical instrument is buried in the brain parenchyma.
- FIG. 11 is a flow chart of the processing of detecting the position of the tip of the probe 48 as executed by the control device 100 of the imaging apparatus 10. The flow chart is basically executed after the registration processing shown in FIG. 9 and FIG. 10. That is, the processing of detecting the position of the tip of the probe 48 may be executed in a state where the relative positions of the head of the patient 1 and the stereo camera 14A have been determined.
- First, in step S142, the camera information detection unit 114 of the position computation unit 110 of the control device 100 acquires 3D image information outputted from the stereo camera 14A. Here, the head of the patient 1 is photographed by the stereo camera 14A. Subsequently, in step S144, the position calculation unit 116 performs image processing on a captured image produced from the 3D image information acquired by the stereo camera 14A, and thereby attempts to detect the probe 48. For example, the position calculation unit 116 attempts to detect the probe 48 in the captured image by matching against the shape of the grasping portion of the probe 48, the shape of the connection portion between the grasping portion and the tip portion of the probe 48, or the like, stored in advance.
- Subsequently, in step S146, the position calculation unit 116 determines whether or not the probe 48 is detected in the captured image. In the case where the probe 48 is not detected in the captured image (S146: No), the procedure returns to step S142, and steps S142 to S146 are repeated until the probe 48 is detected. On the other hand, in the case where the probe 48 is detected in the captured image in step S146 (S146: Yes), the position calculation unit 116 calculates the position of the tip of the probe 48 in step S148. For example, the position calculation unit 116 may detect the position of the tip of the probe 48 on the basis of the information of the shape and length of the probe 48 stored in advance.
- Further, in step S150, the position calculation unit 116 calculates the relative three-dimensional coordinates of the tip of the probe 48 and the posture of the probe 48 in the three-dimensional coordinate space. The posture of the probe 48 may be calculated by, for example, image processing. Subsequently, in step S152, the position calculation unit 116 transmits the calculated relative position of the tip of the probe 48 and the calculated posture information of the probe 48 to the navigation control device 60. After that, the procedure returns to step S142, and steps S142 to S152 are repeated.
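- Step S148 is simple once the grasping portion of the probe 48 has been matched: with the probe's pose known and its length stored in advance, the tip is a fixed offset along the probe's own axis. A sketch, assuming the probe axis is its local z axis:

    import numpy as np

    def probe_tip(probe_pose, probe_length):
        # probe_pose: 4x4 pose of the grasping portion of the probe 48 in
        # the reference frame; probe_length: stored distance from grasping
        # portion to tip [m]. Returns the tip's relative 3D coordinates.
        tip_in_probe_frame = np.array([0.0, 0.0, probe_length, 1.0])
        return (probe_pose @ tip_in_probe_frame)[:3]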
- FIG. 12 is a flow chart of the processing of detecting the position of the probe 48 as executed by the navigation control device 60 of the navigation apparatus 50. In step S162, the navigation control device 60 acquires, from the control device 100 of the imaging apparatus 10, the relative position information of the tip of the probe 48 and the posture information of the probe 48. Subsequently, in step S164, the navigation control device 60 depicts the probe 48 on the image information of the head of the patient 1 for which registration has been completed, and causes the display device 54 to display the image of the probe 48 in real time. Thereby, the operator can move the tip of the probe 48 to a desired position while viewing the navigation display on the display device 54.
- <2-3. Conclusions>
- Thus, with the imaging apparatus 10 and the surgical navigation system according to the embodiment, a predetermined position can be calculated on the basis of the posture information of the arm unit 30 equipped with the stereo camera 14A and the information outputted from the stereo camera 14A. Therefore, it is not necessary to add a sensor such as an optical sensor or a magnetic sensor separately from the imaging apparatus 10. Thus, the setting up of such a sensor is unnecessary, and false detection and undetectable states due to disturbances such as an optical shield, a magnetic shield, or noise can be eliminated. Furthermore, the number of equipment parts in the surgical navigation system can be reduced, and the cost can be reduced.
- Furthermore, with the imaging apparatus 10 according to the embodiment, the relative three-dimensional coordinates of the surgical site imaged by the stereo camera 14A can be calculated on the basis of the posture information of the arm unit 30 and camera parameters such as the focal distance of the stereo camera 14A. Therefore, the relative position of the surgical site can be detected and utilized for navigation control without using an additional sensor.
- Furthermore, with the imaging apparatus 10 according to the embodiment, the relative three-dimensional coordinates of the feature points of the surgical site can be calculated on the basis of the posture information of the arm unit 30 and the 3D image information and camera parameters outputted from the stereo camera 14A. Therefore, the registration of the surgical site can be easily performed in the navigation apparatus 50 without using an additional sensor. In addition, when the result of matching between the captured image and a preoperative image is fed back to the posture control of the arm unit 30, automatic registration of the surgical site becomes possible, and the registration work is simplified.
- Moreover, with the imaging apparatus 10 according to the embodiment, the position and posture of a surgical instrument or of its tip can be calculated on the basis of the posture information of the arm unit 30 and the 3D image information and camera parameters outputted from the stereo camera 14A. Therefore, without using an additional sensor, the position and posture of the surgical instrument or its tip can be accurately detected in the navigation apparatus 50, and the surgical instrument can be superimposed and displayed on the display device 54 accurately in real time. Thereby, even when the tip of the surgical instrument has entered the interior of the body, the operator can move it to a desired position.
- <3-1. Overview of the Surgical Navigation System>
- In a surgical navigation system according to a second embodiment of the present disclosure, the arm unit 30 of an imaging apparatus 10A is mounted on a movable cart. That is, the arm unit 30 is not fixed to the bed 40, and any position of the arm unit 30 can change with respect to the patient 1; hence, processing for setting the origin of the three-dimensional coordinates is necessary. Thus, in the surgical navigation system according to the embodiment, a reference marker 134 is used to set the origin (reference position) P0 of the three-dimensional coordinates.
- FIG. 13 is an illustration diagram showing an example of the configuration of the imaging apparatus 10A used in the surgical navigation system according to the embodiment. The imaging apparatus 10A may be configured in a similar manner to the imaging apparatus 10 shown in FIG. 2 except that the arm unit 30 is mounted on a movable cart 3130. The imaging apparatus 10A may be placed by the user at an arbitrary position at the side of the bed 40.
- FIG. 14 is an illustration diagram showing a situation of an operation for which the surgical navigation system according to the embodiment can be used. The illustrated example shows a brain surgery: the patient 1 is supported on the bed 40 facing down, and the head is fixed by the fixing tool 42. A reference marker 134 is connected to the fixing tool 42 via a connecting jig; that is, the positional relationship between the reference marker 134 and the patient 1 can be kept fixed. Thus, the imaging apparatus 10A according to the embodiment is configured so as to detect a predetermined position in a three-dimensional coordinate system in which a predetermined position specified on the basis of the three-dimensional position of the reference marker 134 is taken as the origin P0. In the surgical navigation system according to the embodiment, a surgical instrument 148 includes a surgical instrument marker 130, and the surgical instrument marker 130 is utilized to detect the position and posture of the surgical instrument 148.
- The reference marker 134 and the surgical instrument marker 130 may each be an optical marker including four marker units serving as marks for detecting the position or posture. For example, a configuration is possible in which marker units that diffusely reflect light of a wavelength in the infrared region emitted from a light source are used, and the position and posture of the marker are detected on the basis of 3D image information acquired by a stereo camera 14A having sensitivity at that wavelength. Alternatively, a configuration is possible in which marker units with a distinctive color such as red are used and the position and posture of the marker are detected on the basis of 3D image information acquired by the stereo camera 14A. Since the positional relationships among the four marker units in the captured image vary with the position and posture of the marker, the position calculation unit 116 can identify the position and posture of the marker by detecting those positional relationships.
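- Recovering a marker's position and posture from its four marker units is a rigid point-set alignment problem. One common solution, not necessarily the one used here, is the SVD-based least-squares fit sketched below, which maps the marker units' stored marker-frame coordinates onto their positions triangulated from the stereo image.

    import numpy as np

    def marker_pose(model_points, observed_points):
        # model_points: 4x3 marker-unit coordinates in the marker's own frame.
        # observed_points: 4x3 coordinates triangulated from the stereo image.
        # Returns (R, t) such that R @ model_point + t matches the observation.
        P = np.asarray(model_points, dtype=float)
        Q = np.asarray(observed_points, dtype=float)
        Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = Q.mean(axis=0) - R @ P.mean(axis=0)
        return R, t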
- <3-2. Position Detection Processing>
- The control processing executed in the surgical navigation system according to the embodiment will now be described with reference to FIG. 3 and FIG. 4.
- (3-2-1. Processing of Grasping a Surgical Field)
- First, the processing of grasping a surgical field as executed by the control device 100 of the imaging apparatus 10A according to the embodiment is described. This processing is basically executed in accordance with the flow chart shown in FIG. 7. However, in the imaging apparatus 10A according to the embodiment, a predetermined position specified on the basis of the reference marker 134 is taken as the origin P0 of the three-dimensional coordinate system. Therefore, in step S106 of FIG. 7, on the basis of the posture information of the arm unit 30 and the information of the focal distance of the stereo camera 14A, the position calculation unit 116 calculates the relative three-dimensional coordinates of the head of the patient 1 with the predetermined position specified on the basis of the reference marker 134 as the origin P0. The origin P0 may be set in advance as, for example, the position of the three-dimensional coordinates of the reference marker 134 calculated on the basis of the posture information of the arm unit 30 and the camera parameters outputted from the stereo camera 14A.
- The position of the reference marker 134 serving as the origin P0 may be the position of any one of the four marker units of the reference marker 134, or may be an arbitrary position other than a marker unit that has a fixed relative position with respect to the reference marker 134. The three-dimensional coordinates with respect to the arbitrary origin P0 may be defined by the posture of the reference marker 134; that is, the position calculation unit 116 may specify the three axes x, y, and z on the basis of the posture of the identified reference marker 134. Thereby, the position calculation unit 116 can find the relative three-dimensional coordinates of the head of the patient 1 with respect to the origin P0.
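- Expressing a position relative to the marker-defined origin P0 is then a single change of frame. A minimal sketch; R_marker and t_marker could come, for instance, from a pose fit such as the marker_pose sketch above.

    import numpy as np

    def to_p0_frame(R_marker, t_marker, point):
        # R_marker, t_marker: posture and position of the reference marker 134
        # (its posture defines the x, y, z axes and its position defines P0).
        # point: 3D coordinates expressed in the arm/camera frame.
        offset = np.asarray(point, dtype=float) - np.asarray(t_marker, dtype=float)
        return R_marker.T @ offset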
- In the surgical navigation system according to the embodiment, the processing of grasping a surgical field can be executed in a similar manner to the processing of grasping a surgical field in the surgical navigation system according to the first embodiment, except that the three-dimensional position is calculated as relative three-dimensional coordinates with respect to the origin P0 specified by the reference marker 134.
- (3-2-2. Registration Processing)
- Next, an example of the processing of registration between the head of the patient 1 in the captured image and reference points present in a preoperative image, a 3D model, or the like is described. FIG. 15 shows a flow chart of the registration processing.
- Also in the control device 100 of the imaging apparatus 10A according to the embodiment, first, steps S122 to S130 are performed in accordance with a procedure similar to the flow chart shown in FIG. 9. Thereby, the comparison and matching between the positions of the feature points and the positions of the corresponding reference points in the preoperative image or the 3D model are performed in the navigation control device 60, and the comparison result is displayed on the display device 54. Viewing the displayed comparison result, the user adjusts the posture of the arm unit 30 so that the head of the patient 1 in the captured image and the preoperative image or the 3D model are registered.
- When the registration between the head of the patient 1 and the preoperative image or the 3D model is completed, in step S172 the camera information detection unit 114 acquires 3D image information outputted from the stereo camera 14A. Here, the reference marker 134 is photographed by the stereo camera 14A. The position of the stereo camera 14A may move, as long as the movable cart 3130 equipped with the arm unit 30 does not move. Subsequently, in step S174, the position calculation unit 116 calculates the three-dimensional coordinates of the reference marker 134 on the basis of the posture information of the arm unit 30 and the camera parameters outputted from the stereo camera 14A, and sets a predetermined position specified by the reference marker 134 as the origin P0.
- Subsequently, in step S176, the position calculation unit 116 calculates and stores the relative three-dimensional coordinates of the head of the patient 1 with respect to the origin P0 specified by the reference marker 134. The information of the relative three-dimensional coordinates of the head of the patient 1 may also be transmitted to and stored in the navigation apparatus 50.
- In the surgical navigation system according to the embodiment, since the stereo camera 14A is mounted on the movable cart 3130 and is therefore movable, registration processing is executed again when the position of the movable cart 3130 has changed. In other words, as long as neither the relative positional relationship between the head of the patient 1 and the reference marker 134 nor the position of the movable cart 3130 changes, once registration processing has been performed it is not necessary to perform registration again during the operation. Also in the surgical navigation system according to the embodiment, automatic registration processing may be performed in accordance with the flow chart shown in FIG. 10.
- (3-2-3. Processing of Detecting the Position of a Surgical Instrument)
- Next, an example of the processing of detecting the position of the tip of a surgical instrument is described. FIG. 16 is a flow chart of the processing of detecting the position of the tip of a surgical instrument dedicated to position detection (a probe 148), as executed by the control device 100 of the imaging apparatus 10A. The flow chart is basically executed after the registration processing shown in FIG. 15. That is, the processing of detecting the position of the tip of the probe 148 may be executed in a state where the origin P0 of the three-dimensional coordinates and the relative positions of the head of the patient 1 and the stereo camera 14A have been determined.
- First, in step S182, the camera information detection unit 114 of the position computation unit 110 of the control device 100 acquires 3D image information outputted from the stereo camera 14A. Here, the head of the patient 1 is photographed by the stereo camera 14A. Subsequently, in step S184, detection of the surgical instrument marker 130 is attempted from a captured image produced from the 3D image information acquired by the stereo camera 14A. Subsequently, in step S186, the position calculation unit 116 determines whether or not the surgical instrument marker 130 is detected in the captured image. In the case where the surgical instrument marker 130 is not detected in the captured image (S186: No), the procedure returns to step S182, and steps S182 to S186 are repeated until the surgical instrument marker 130 is detected.
- On the other hand, in the case where the surgical instrument marker 130 is detected in the captured image in step S186 (S186: Yes), the position calculation unit 116 detects the position of the tip of the probe 148 in step S188. For example, the position calculation unit 116 may detect the position of the tip of the probe 148 on the basis of the information of the shape and length of the probe 148 stored in advance. Further, in step S190, the position calculation unit 116 calculates the relative three-dimensional coordinates of the tip of the probe 148 with respect to the origin P0 specified by the reference marker 134, and the posture of the probe 148 in three-dimensional space. Subsequently, in step S192, the position calculation unit 116 transmits the calculated relative position of the tip of the probe 148 and the calculated posture information of the probe 148 to the navigation control device 60. After that, the procedure returns to step S182, and steps S182 to S192 are repeated.
- In accordance with the flow chart shown in FIG. 12, the navigation control device 60 acquires, from the control device 100 of the imaging apparatus 10A, the relative position of the tip of the probe 148 and the posture information of the probe 148, depicts the probe 148 on the image information of the head of the patient 1, and causes the display device 54 to display the image of the probe 148 in real time. Thereby, even when the tip of the probe 148 has entered the interior of the body, the operator can move the tip of the surgical instrument to a desired position while viewing the navigation display on the display device 54.
- (3-2-4. Positional Shift Examination Processing)
- Next, the processing of examining a positional shift of the arm unit 30 is described. In the surgical navigation system according to the embodiment, since the reference marker 134 is used, a positional shift of the arm unit 30 due to a movement of the movable cart 3130 or the like can be examined. FIG. 17 is a flow chart showing the processing of examining a positional shift of the arm unit 30. The flow chart is a procedure in which, when the reference marker 134 appears on the screen during an operation or other work, the image information of the reference marker 134 is utilized to examine a positional shift of the arm unit 30; it is basically executed after the registration processing shown in FIG. 15. That is, the positional shift examination may be executed in a state where the origin P0 of the three-dimensional coordinates specified on the basis of the reference marker 134 and the relative positions of the head of the patient 1 and the stereo camera 14A have been determined.
- First, in step S202, the camera information detection unit 114 of the position computation unit 110 of the control device 100 acquires 3D image information outputted from the stereo camera 14A. Subsequently, in step S204, the position calculation unit 116 determines whether or not the reference marker 134 is present in a captured image produced from the 3D image information acquired by the stereo camera 14A. In the case where the reference marker 134 is not present in the captured image (S204: No), a positional shift of the arm unit 30 cannot be examined, and the procedure returns to step S202.
- In the case where the reference marker 134 is present in the captured image (S204: Yes), in step S206 the position calculation unit 116 calculates the three-dimensional coordinates of the reference marker 134 with respect to the origin P0; that is, the relative position of the reference marker 134 with respect to the origin P0 is calculated. Subsequently, in step S208, the position calculation unit 116 calculates the difference between the relative position of the reference marker 134 calculated in step S206 and the relative position of the reference marker 134 at the time point when the current origin P0 was set. For example, the difference between the components in each axis direction of the three-dimensional coordinates corresponding to the two relative positions is found. When no positional shift of the arm unit 30 has occurred, this difference is zero.
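- Step S208 is a per-axis subtraction, and a shift can then be flagged by comparing the size of the difference with a tolerance. A minimal sketch; the tolerance value is arbitrary.

    import numpy as np

    def positional_shift(marker_now, marker_at_setup, tolerance=0.001):
        # marker_now: relative position of the reference marker 134 (step S206).
        # marker_at_setup: its relative position when the current P0 was set.
        # Returns the per-axis difference and whether it exceeds the tolerance.
        diff = np.asarray(marker_now, float) - np.asarray(marker_at_setup, float)
        return diff, bool(np.linalg.norm(diff) > tolerance)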
- Subsequently, in step S210, the position calculation unit 116 determines whether the automatic correction mode is ON or not. In the case where the automatic correction mode is OFF (S210: No), in step S212 the position calculation unit 116 transmits the amount of discrepancy of the relative position of the reference marker 134 found in step S208 to the navigation control device 60, and causes the display device 54 to display the amount of discrepancy. Thereby, the user can learn of the presence or absence of a positional shift of the arm unit 30; when the user considers the amount of discrepancy to be large, the user may move the arm unit 30 with the automatic correction mode set to ON, and can thereby correct the positional shift of the arm unit 30 explicitly.
- On the other hand, in the case where the automatic correction mode is ON (S210: Yes), in step S214 the position calculation unit 116 performs replacement of the posture information of the arm unit 30. The replacement may be performed by, for example, correcting the posture information of the arm unit 30 so that it corresponds to the relative position of the reference marker 134 calculated this time. After the replacement has been performed, the position calculation unit 116 calculates the posture information of the arm unit 30 using the difference from the replaced posture information, and utilizes the calculation result for various computations such as position detection.
- By executing the positional shift examination processing in this way, the accuracy of the posture information of the arm unit 30 can be assessed at any time by capturing the reference marker 134 in the captured image. Furthermore, for example, when the movable cart 3130 equipped with the arm unit 30 has moved, the position information of the captured reference marker 134 may be utilized to detect the shift of the posture of the arm unit 30, and the posture information of the arm unit 30 may be replaced; thereby, accurate position information can be calculated at all times.
- Although in the example of the flow chart shown in FIG. 17 the positional shift of the arm unit 30 is measured by comparing the relative positions of the reference marker 134, it may also be measured by using the posture information of the arm unit 30 in a state where the reference marker 134 is captured.
- Further, the control device 100 may operate so as to capture the reference marker 134 in the captured image at an appropriate timing, and may execute the examination of positional shift and the automatic correction of the posture information of the arm unit 30. FIG. 18 shows a flow chart of the recalibration processing. First, in step S222, in order to execute recalibration, the position calculation unit 116 sends a command to the arm posture control unit 120 to change the posture of the arm unit 30 so that the reference marker 134 comes within the captured image of the stereo camera 14A. At this time, the posture control of the arm unit 30 may be performed by the user's manipulation, or the control device 100 itself may perform automatic posture control of the arm unit 30 so that the reference marker 134 is detected in the captured image of the stereo camera 14A, on the basis of the currently stored relationship between the position of the head of the patient 1 and the position of the reference marker 134.
- Subsequently, in step S224, the position calculation unit 116 determines whether or not the reference marker 134 is present in the captured image acquired by the stereo camera 14A. In the case where the reference marker 134 is present in the captured image (S224: Yes), the position calculation unit 116 performs replacement of the posture information of the arm unit 30 in accordance with the procedure of steps S206, S208, and S214 in the flow chart of FIG. 17, and subsequently calculates the posture of the arm unit 30 using the difference from the posture information of the arm unit 30 at this time.
- On the other hand, in the case where the reference marker 134 is not present in the captured image in step S224 (S224: No), the procedure goes to step S226, and the position calculation unit 116 determines whether or not the angle of view of the stereo camera 14A is at its maximum. In the case where the angle of view is already at its maximum (S226: Yes), the reference marker 134 cannot be captured by the stereo camera 14A, calibration cannot be executed automatically, and the processing therefore ends. On the other hand, in the case where the angle of view is not at its maximum (S226: No), in step S228 the position calculation unit 116 widens the angle of view of the stereo camera 14A to expand the imaging range; the procedure then returns to step S224, and step S224 and the subsequent steps are repeated.
- Thereby, in the case where the arm unit 30 is not fixed to the bed 40, when the movable cart 3130 equipped with the arm unit 30 has moved, recalibration can be completed automatically once the reference marker 134 is successfully captured in the captured image. When performing calibration, it is also possible to change the posture of the arm unit 30 so as to move the stereo camera 14A back, instead of or in combination with widening the angle of view of the stereo camera 14A.
- <3-3. Conclusions>
- <3-3. Conclusions>
- Thus, by the
imaging apparatus 10A and the surgical navigation system according to the embodiment, a predetermined position can be calculated on the basis of the posture information of the arm unit 30 equipped with the stereo camera 14A and the information outputted from the stereo camera 14A. Therefore, effects similar to those of the imaging apparatus 10 according to the first embodiment can be obtained. Also in the imaging apparatus 10A according to the embodiment, the relative three-dimensional coordinates of the surgical site, of the feature point of the surgical site, and of the position of a surgical instrument or the tip of a surgical instrument can be detected on the basis of the posture information of the arm unit 30 and the information acquired from the stereo camera 14A. Therefore, the processing of grasping a surgical field, registration processing, the processing of detecting the position of the tip of a surgical instrument, and the like can be controlled simply and accurately. A sketch of how these two information sources may be combined follows.
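- As a rough illustration, the sketch below triangulates a point from a rectified stereo pair and expresses it in the arm base frame; the 4x4 pose `T_base_camera` would come from forward kinematics over the joint states of the arm unit 30. The camera parameters and function names are assumptions for illustration, not the disclosed processing.

```python
import numpy as np

def triangulate_rectified(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """3D point in the camera frame from a rectified stereo correspondence:
    depth follows z = f * B / d with disparity d = u_left - u_right."""
    disparity = u_left - u_right
    z = focal_px * baseline_m / disparity
    x = (u_left - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return np.array([x, y, z])

def to_base_frame(T_base_camera, point_camera):
    """Express a camera-frame point in the arm base frame, where
    T_base_camera is the camera pose computed from the arm posture."""
    p_h = np.append(point_camera, 1.0)   # homogeneous coordinates
    return (T_base_camera @ p_h)[:3]
```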
- Furthermore, the imaging apparatus 10A and the surgical navigation system according to the embodiment are configured to perform position detection processing using the reference marker 134 and the surgical instrument marker 130, and can therefore, after the completion of registration processing, examine the positional shift of the arm unit 30 due to a movement of the movable cart 3130 or the like and execute automatic calibration processing. Therefore, even when a positional shift of the arm unit 30 has occurred, the reliability of the various position detection processes can be maintained. - It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
- For example, although in each embodiment described above the
arm unit 30 includes the microscope unit 14 as a camera, the technology of the present disclosure is not limited to such an example. For example, the arm unit 30 may include an eyepiece-equipped microscope together with a camera that records the magnified image obtained via the eyepiece-equipped microscope, or even a surgical exoscope. - Furthermore, although in each embodiment described above the information of the depth value to a predetermined object part is acquired using a stereo camera as the
microscope unit 14, the technology of the present disclosure is not limited to such an example. For example, the information of the depth value may be acquired using a distance sensor together with a monocular camera, as sketched below.
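- A minimal sketch of the two depth options named above; all parameter names are assumptions for illustration, and the mounting offset stands in for a hypothetical sensor-to-camera calibration term.

```python
def depth_from_stereo(disparity_px, focal_px, baseline_m):
    """Depth of an object part from a rectified stereo pair: z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_from_distance_sensor(sensor_range_m, sensor_to_camera_offset_m=0.0):
    """Depth when a monocular camera is paired with a distance sensor;
    the offset models the sensor's displacement from the camera center."""
    return sensor_range_m + sensor_to_camera_offset_m
```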
- Furthermore, although in the first embodiment the detection of a surgical instrument in the captured image is performed by image processing and in the second embodiment it is performed by detection of the surgical instrument marker, the detection method in each embodiment may be the opposite. That is, although the first embodiment and the second embodiment differ in how the origin P0 of the three-dimensional coordinates is set, the method for detecting a surgical instrument is not limited to the examples mentioned above.
- Furthermore, although in the embodiments described above the
control device 100 of the imaging apparatus includes the position computation unit 110 and the arm posture control unit 120, the technology of the present disclosure is not limited to such an example. In the control device 100 according to an embodiment of the present disclosure, it is sufficient that the information of a predetermined position can be calculated on the basis of the posture information of the arm unit 30 and the information outputted from the stereo camera 14A, and the arm posture control unit 120 may be omitted. In this case, the posture control of the arm unit 30 may be performed by some other control device having the function of the arm posture control unit 120, as in the sketch below.
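- One way such a division of responsibilities might be expressed, as a sketch only; the interface and class names are assumptions, not the disclosed architecture.

```python
from typing import Optional, Protocol, Sequence

class ArmPostureController(Protocol):
    """The capability the control device relies on; when the arm posture
    control unit 120 is omitted, another control device can supply it."""
    def set_posture(self, joint_angles: Sequence[float]) -> None: ...

class PositionOnlyControlDevice:
    """Always performs position calculation; posture control is optional
    and may be delegated to an external controller."""
    def __init__(self, posture_controller: Optional[ArmPostureController] = None):
        self.posture_controller = posture_controller

    def request_posture(self, joint_angles: Sequence[float]) -> bool:
        # Delegate when a controller is attached; otherwise the request
        # must be handled by some other control device.
        if self.posture_controller is None:
            return False
        self.posture_controller.set_posture(joint_angles)
        return True
```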
- Moreover, the system configurations and the flow charts described in the embodiments above are only examples, and the technology of the present disclosure is not limited to such examples. Part of the steps in a flow chart executed by the control device 100 of the imaging apparatus may be executed on the navigation control device side. For example, in the automatic registration processing shown in FIG. 10, steps S132 to S136 relating to the arm unit 30 may be performed by the navigation control device 60, and the computation result may be transmitted to the control device 100. - The computer program for achieving each function of the imaging apparatus and the surgical navigation system may be installed in any of the control devices and the like. A computer-readable recording medium in which such a computer program is stored can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The computer program mentioned above may also be distributed via a network, for example, without using a recording medium.
- Further, the effects described in this specification are merely illustrative or exemplary, and are not limitative. That is, together with or in place of the above effects, the technology according to the present disclosure may achieve other effects that are clear to those skilled in the art from the description of this specification.
- (1)
- A surgical information processing apparatus, including:
- circuitry configured to
- obtain position information of a surgical imaging device, the position information indicating displacement of the surgical imaging device from a predetermined position,
- in a registration mode, obtain first image information from the surgical imaging device regarding a position of a surgical component,
- determine the position of the surgical component based on the first image information and the position information,
- in an imaging mode, obtain second image information from the surgical imaging device of the surgical component based on the determined position.
- (2)
- The surgical information processing apparatus according to (1), wherein the position determination is further performed by determining a position of the surgical imaging device with respect to the predetermined position based on the position information and by determining a distance between the surgical component and the surgical imaging device.
- (3)
- The surgical information processing apparatus according to (1) to (2), wherein the surgical component is one of a surgical site and a surgical instrument.
- (4)
- The surgical information processing apparatus according to (1) to (3), wherein the circuitry activates the registration mode or the imaging mode based on the position information.
- (5)
- The surgical information processing apparatus according to (1) to (4), wherein the first image information is obtained in the registration mode at a different perspective than the second image information that is obtained in the imaging mode.
- (6)
- The surgical information processing apparatus according to (1) to (5), wherein the position determination is further performed by setting the position of the surgical imaging device as a reference point.
- (7)
- The surgical information processing apparatus according to (1) to (6), wherein the position information of the surgical imaging device is based on arm position information from a supporting arm having attached thereto the surgical imaging device, and
- wherein the arm position information includes information of movement of at least one joint in the supporting arm.
- (8)
- The surgical information processing apparatus according to (7), wherein the information of movement of at least one joint in the supporting arm includes an amount of rotation of each joint.
- (9)
- The surgical information processing apparatus according to (1) to (8), wherein the position determination is further performed by processing images of the surgical component obtained by the surgical imaging device as the first image information.
- (10)
- The surgical information processing apparatus according to (9), wherein the processing of the images of the surgical component obtained by the surgical imaging device is based on the focus points of the images.
- (11)
- The surgical information processing apparatus according to (1) to (10), wherein the position of the surgical component is a reference point for image registration between a previously obtained medical image and images obtained by the surgical imaging device as the second image information.
- (12)
- The surgical information processing apparatus according to (1) to (11), wherein the position of the surgical component is a reference point for superimposing at least one pre-operative image on images obtained by the surgical imaging device as the second image information.
- (13)
- A surgical information processing method implemented using circuitry, including:
- obtaining first position information of a surgical imaging device, the first position information indicating displacement of the surgical imaging device from a predetermined position;
- generating second position information of a surgical component with respect to the surgical imaging device based on first image information obtained in a registration mode from the surgical imaging device;
- determining the position of the surgical component with respect to the predetermined position based on the first position information and the second position information; and
- in an imaging mode, obtaining second image information from the surgical imaging device of the surgical component based on the determined position.
- (14)
- The surgical information processing method according to (13), wherein the position determination is further performed by determining the first position information indicating a position of the surgical imaging device with respect to the predetermined position based on the arm position information and by determining the second position information from a stereoscopic distance between the patient and the surgical imaging device.
- (15)
- The surgical information processing method according to (13) to (14), wherein the registration mode or the imaging mode is activated based on the position information.
- (16)
- The surgical information processing method according to (13) to (15), wherein the first image information is obtained in the registration mode at a different perspective than the second image information that is obtained in the imaging mode.
- (17)
- The surgical information processing method according to (13) to (16), wherein the generating of the second position information of the surgical component is further performed by setting the position of the surgical imaging device as a reference point.
- (18)
- The surgical information processing method according to (14), wherein the first position information of the surgical imaging device is based on arm position information from a supporting arm having attached thereto the surgical imaging device, and wherein the arm position information includes information of movement of at least one joint in the supporting arm.
- (19)
- The surgical information processing method according to (18), wherein the information of movement of at least one joint in the supporting arm includes an amount of rotation of each joint.
- (20)
- The surgical information processing method according to (13) to (19), wherein the second position information is further generated by processing images of the surgical component obtained by the surgical imaging device as the first image information.
- (21)
- The surgical information processing method according to (20), wherein the processing of the images of the surgical component obtained by the surgical imaging device is based on the focus points of the images.
- (22)
- The surgical information processing method according to (13) to (21), wherein the position of the surgical component is a reference point for image registration between a previously obtained medical image and images obtained by the surgical imaging device as the second image information.
- (23)
- The surgical information processing method according to (13) to (22), wherein the position of the surgical component is a reference point for superimposing at least one pre-operative image on images obtained by the surgical imaging device as the second image information.
- (24)
- A surgical information processing apparatus, including:
- a surgical imaging device configured to obtain images of a patient;
- a supporting arm having attached thereto the surgical imaging device; and
- the surgical information processing apparatus according to (1).
- (25)
- The surgical information processing apparatus according to (24), wherein the surgical imaging device is a surgical microscope or a surgical exoscope.
- (26)
- The surgical information processing apparatus according to (24) to (25), wherein the supporting arm has an actuator at a joint.
- (27)
- A non-transitory computer readable medium having stored therein a program that, when executed by a computer including circuitry, causes the computer to implement a surgical information processing method, the method including:
- obtaining first position information of a surgical imaging device, the first position information indicating displacement of the surgical imaging device from a predetermined position;
- generating second position information of a surgical component with respect to the surgical imaging device based on first image information obtained in a registration mode from the surgical imaging device;
- determining the position of a surgical component with respect to the predetermined position based on first position information and the second position information; and
- in an imaging mode, obtaining second image information from the surgical imaging device of the surgical component based on the determined position.
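- Read as a whole, the method of (13) and (27) may be paraphrased by the following sketch; every interface here (`arm.displacement()`, `camera.capture(mode=...)`, `locate_component`, `compose`) is a hypothetical stand-in for illustration, not an API disclosed above.

```python
def surgical_information_processing(arm, camera, locate_component, compose):
    """Two-mode flow: first position information from the supporting arm,
    second position information from registration-mode imaging, then an
    imaging-mode capture guided by the determined component position."""
    first_position = arm.displacement()                       # vs. the predetermined position
    registration_image = camera.capture(mode="registration")  # first image information
    second_position = locate_component(registration_image)    # vs. the imaging device
    component_position = compose(first_position, second_position)
    return camera.capture(mode="imaging", target=component_position)  # second image information
```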
- Additionally, the present technology may also be configured as below.
- (1A)
- A medical imaging apparatus including:
- an arm posture information detection unit configured to detect posture information concerning a posture of an arm that includes at least one joint unit and supports a camera;
- a camera information detection unit configured to detect information outputted from the camera; and
- a position calculation unit configured to calculate a predetermined position on the basis of the posture information and the information outputted from the camera.
- (2A)
- The medical imaging apparatus according to (1A),
- wherein the arm is fixed to a support base configured to support a patient and
- the position calculation unit calculates a relative position with respect to a predetermined reference position whose position does not change even when the posture of the arm changes.
- (3A)
- The medical imaging apparatus according to (1A),
- wherein the arm is mounted on a movable cart, and
- the position calculation unit sets, as a reference position, a predetermined position specified on the basis of a reference marker fixed to a support base configured to support a patient and calculates a relative position to the reference position, in a state where the movable cart is placed in a predetermined position.
- (4A)
- The medical imaging apparatus according to (3A), further including:
- an arm control unit configured to control the arm,
- wherein, when the relative position of the reference marker at the time when the current reference position was set differs from the newly calculated relative position of the reference marker, the arm control unit corrects the posture information of the arm, using the calculated relative position of the reference marker as a reference.
- (5A)
- The medical imaging apparatus according to any one of (1A) to (4A), wherein the position calculation unit determines whether or not a predetermined object to be detected is present in an image captured by the camera, and calculates a position of the object to be detected in a case where the object to be detected is present.
- (6A)
- The medical imaging apparatus according to (5A), wherein the position calculation unit expands an imaging range of the image in a case where the predetermined object to be detected is not present in the image captured by the camera.
- (7A)
- The medical imaging apparatus according to any one of (1A) to (6A), further including an arm control unit configured to control the arm,
- wherein the arm control unit registers a surgical site of a patient included in an image captured by the camera with a reference image prepared in advance by controlling the posture of the arm.
- (8A)
- The medical imaging apparatus according to (7A), wherein,
- when the surgical site and the reference image are out of registration even after the registration is performed,
- the arm control unit performs registration between the surgical site and the reference image again by adjusting a position of the camera, using a position of a virtual center of the surgical site as a pivot point.
- (9A)
- The medical imaging apparatus according to any one of (1A) to (8A), wherein the predetermined position is information indicating at least one of a focal distance of the camera, a position of a surgical site of a patient, a position of a surgical instrument, a position of a tip of a surgical instrument, and a position of a reference marker.
- (10A)
- The medical imaging apparatus according to any one of (1A) to (9A), wherein the arm posture information detection unit detects the posture information on the basis of an output of an encoder provided in the joint unit.
- (11A)
- The medical imaging apparatus according to any one of (1A) to (10A), wherein the information outputted from the camera includes one of information of a focal distance of the camera and an image signal acquired by the camera.
- (12A)
- The medical imaging apparatus according to any one of (1A) to (11A), further including:
- an output unit configured to output 3D image information produced from an image signal acquired by the camera.
- (13A)
- A surgical navigation system including:
- an arm posture information detection unit configured to detect posture information concerning a posture of an arm that includes at least one joint unit and supports a camera;
- a camera information detection unit configured to detect information outputted from the camera;
- a position calculation unit configured to calculate a predetermined position on the basis of the posture information and the information outputted from the camera;
- an output unit configured to output 3D image information produced from an image signal acquired by the camera; and
- a navigation control unit configured to perform navigation of an operation while causing display of an image in which a surgical site of a patient included in the 3D image information produced from the image signal is superimposed on a reference image prepared in advance.
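- As a non-limiting sketch of the superimposed display in (13A), the blend below assumes that the live view and the reference image have already been registered to a common size and viewpoint; the function and its parameters are illustrative assumptions, and registration itself is handled elsewhere in the system.

```python
import numpy as np

def superimpose(live_frame, reference_image, alpha=0.4):
    """Alpha-blend a reference image prepared in advance over the live
    surgical view (both uint8 arrays of identical shape)."""
    live = live_frame.astype(np.float32)
    ref = reference_image.astype(np.float32)
    blended = (1.0 - alpha) * live + alpha * ref
    return np.clip(blended, 0.0, 255.0).astype(np.uint8)
```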
- Reference Signs List
- 10, 10A imaging apparatus
- 14 microscope unit
- 14A stereo camera
- 30 arm unit
- 48 probe (surgical instrument)
- 50 navigation apparatus
- 54 display device
- 60 navigation control device
- 100 control device
- 110 position computation unit
- 112 arm posture information detection unit
- 114 camera information detection unit
- 116 position calculation unit
- 120 arm posture control unit
- 130 surgical instrument marker
- 134 reference marker
Claims (27)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015252869A JP6657933B2 (en) | 2015-12-25 | 2015-12-25 | Medical imaging device and surgical navigation system |
JP2015-252869 | 2015-12-25 | ||
PCT/JP2016/084354 WO2017110333A1 (en) | 2015-12-25 | 2016-11-18 | Surgical information processing apparatus and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180263710A1 (en) | 2018-09-20 |
Family
ID=57570279
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/761,507 Abandoned US20180263710A1 (en) | 2015-12-25 | 2016-11-18 | Medical imaging apparatus and surgical navigation system |
Country Status (5)
Country | Link |
---|---|
US (1) | US20180263710A1 (en) |
EP (1) | EP3393385A1 (en) |
JP (1) | JP6657933B2 (en) |
CN (1) | CN108366833B (en) |
WO (1) | WO2017110333A1 (en) |
Cited By (142)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170314911A1 (en) * | 2016-04-27 | 2017-11-02 | Keyence Corporation | Three-Dimensional Coordinate Measuring Device |
US20180133085A1 (en) * | 2016-11-14 | 2018-05-17 | Amcad Biomed Corporation | Positioning apparatus for head and neck assessment or intervention |
US10595887B2 (en) | 2017-12-28 | 2020-03-24 | Ethicon Llc | Systems for adjusting end effector parameters based on perioperative information |
US10695081B2 (en) | 2017-12-28 | 2020-06-30 | Ethicon Llc | Controlling a surgical instrument according to sensed closure parameters |
CN111407406A (en) * | 2020-03-31 | 2020-07-14 | 武汉联影智融医疗科技有限公司 | Head position identification device, intraoperative control system and control method |
US10755813B2 (en) | 2017-12-28 | 2020-08-25 | Ethicon Llc | Communication of smoke evacuation system parameters to hub or cloud in smoke evacuation module for interactive surgical platform |
US10758310B2 (en) | 2017-12-28 | 2020-09-01 | Ethicon Llc | Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices |
US10772651B2 (en) | 2017-10-30 | 2020-09-15 | Ethicon Llc | Surgical instruments comprising a system for articulation and rotation compensation |
CN111999879A (en) * | 2019-05-27 | 2020-11-27 | 徕卡仪器(新加坡)有限公司 | Microscope system and method for controlling a surgical microscope |
US10849697B2 (en) | 2017-12-28 | 2020-12-01 | Ethicon Llc | Cloud interface for coupled surgical devices |
EP3753520A1 (en) * | 2019-06-19 | 2020-12-23 | Karl Storz SE & Co. KG | Medical handling device for controlling a handling device |
EP3753519A1 (en) * | 2019-06-19 | 2020-12-23 | Karl Storz SE & Co. KG | Medical handling device |
EP3753521A1 (en) * | 2019-06-19 | 2020-12-23 | Karl Storz SE & Co. KG | Medical handling device for controlling a handling device |
US10892995B2 (en) | 2017-12-28 | 2021-01-12 | Ethicon Llc | Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs |
US10892899B2 (en) | 2017-12-28 | 2021-01-12 | Ethicon Llc | Self describing data packets generated at an issuing instrument |
US10898622B2 (en) | 2017-12-28 | 2021-01-26 | Ethicon Llc | Surgical evacuation system with a communication circuit for communication between a filter and a smoke evacuation device |
US10932872B2 (en) | 2017-12-28 | 2021-03-02 | Ethicon Llc | Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set |
US10944728B2 (en) | 2017-12-28 | 2021-03-09 | Ethicon Llc | Interactive surgical systems with encrypted communication capabilities |
US10943454B2 (en) | 2017-12-28 | 2021-03-09 | Ethicon Llc | Detection and escalation of security responses of surgical instruments to increasing severity threats |
US10966791B2 (en) | 2017-12-28 | 2021-04-06 | Ethicon Llc | Cloud-based medical analytics for medical facility segmented individualization of instrument function |
US10973520B2 (en) | 2018-03-28 | 2021-04-13 | Ethicon Llc | Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature |
US10987178B2 (en) | 2017-12-28 | 2021-04-27 | Ethicon Llc | Surgical hub control arrangements |
US11013563B2 (en) | 2017-12-28 | 2021-05-25 | Ethicon Llc | Drive arrangements for robot-assisted surgical platforms |
US11026751B2 (en) | 2017-12-28 | 2021-06-08 | Cilag Gmbh International | Display of alignment of staple cartridge to prior linear staple line |
US11026687B2 (en) | 2017-10-30 | 2021-06-08 | Cilag Gmbh International | Clip applier comprising clip advancing systems |
US11056244B2 (en) | 2017-12-28 | 2021-07-06 | Cilag Gmbh International | Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks |
US11051876B2 (en) | 2017-12-28 | 2021-07-06 | Cilag Gmbh International | Surgical evacuation flow paths |
US11058498B2 (en) | 2017-12-28 | 2021-07-13 | Cilag Gmbh International | Cooperative surgical actions for robot-assisted surgical platforms |
US11069012B2 (en) | 2017-12-28 | 2021-07-20 | Cilag Gmbh International | Interactive surgical systems with condition handling of devices and data capabilities |
US11076921B2 (en) | 2017-12-28 | 2021-08-03 | Cilag Gmbh International | Adaptive control program updates for surgical hubs |
US11090047B2 (en) | 2018-03-28 | 2021-08-17 | Cilag Gmbh International | Surgical instrument comprising an adaptive control system |
US11096693B2 (en) | 2017-12-28 | 2021-08-24 | Cilag Gmbh International | Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing |
US11096688B2 (en) | 2018-03-28 | 2021-08-24 | Cilag Gmbh International | Rotary driven firing members with different anvil and channel engagement features |
US11100631B2 (en) | 2017-12-28 | 2021-08-24 | Cilag Gmbh International | Use of laser light and red-green-blue coloration to determine properties of back scattered light |
US11114195B2 (en) | 2017-12-28 | 2021-09-07 | Cilag Gmbh International | Surgical instrument with a tissue marking assembly |
US11109866B2 (en) | 2017-12-28 | 2021-09-07 | Cilag Gmbh International | Method for circular stapler control algorithm adjustment based on situational awareness |
US11132462B2 (en) | 2017-12-28 | 2021-09-28 | Cilag Gmbh International | Data stripping method to interrogate patient records and create anonymized record |
US11129611B2 (en) | 2018-03-28 | 2021-09-28 | Cilag Gmbh International | Surgical staplers with arrangements for maintaining a firing member thereof in a locked configuration unless a compatible cartridge has been installed therein |
US11147607B2 (en) | 2017-12-28 | 2021-10-19 | Cilag Gmbh International | Bipolar combination device that automatically adjusts pressure based on energy modality |
US11160605B2 (en) | 2017-12-28 | 2021-11-02 | Cilag Gmbh International | Surgical evacuation sensing and motor control |
US11166772B2 (en) | 2017-12-28 | 2021-11-09 | Cilag Gmbh International | Surgical hub coordination of control and communication of operating room devices |
US11179208B2 (en) | 2017-12-28 | 2021-11-23 | Cilag Gmbh International | Cloud-based medical analytics for security and authentication trends and reactive measures |
US11179175B2 (en) | 2017-12-28 | 2021-11-23 | Cilag Gmbh International | Controlling an ultrasonic surgical instrument according to tissue location |
US11202570B2 (en) | 2017-12-28 | 2021-12-21 | Cilag Gmbh International | Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems |
US11207067B2 (en) | 2018-03-28 | 2021-12-28 | Cilag Gmbh International | Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing |
US11219453B2 (en) | 2018-03-28 | 2022-01-11 | Cilag Gmbh International | Surgical stapling devices with cartridge compatible closure and firing lockout arrangements |
US11226477B2 (en) * | 2016-11-16 | 2022-01-18 | Carl Zeiss Meditec Ag | Method for presenting images of a digital surgical microscope and digital surgical microscope system |
US11229436B2 (en) | 2017-10-30 | 2022-01-25 | Cilag Gmbh International | Surgical system comprising a surgical tool and a surgical hub |
US11234756B2 (en) | 2017-12-28 | 2022-02-01 | Cilag Gmbh International | Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter |
US11253315B2 (en) | 2017-12-28 | 2022-02-22 | Cilag Gmbh International | Increasing radio frequency to create pad-less monopolar loop |
US11257589B2 (en) | 2017-12-28 | 2022-02-22 | Cilag Gmbh International | Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes |
US11259807B2 (en) | 2019-02-19 | 2022-03-01 | Cilag Gmbh International | Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device |
US11259830B2 (en) | 2018-03-08 | 2022-03-01 | Cilag Gmbh International | Methods for controlling temperature in ultrasonic device |
US11259806B2 (en) | 2018-03-28 | 2022-03-01 | Cilag Gmbh International | Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein |
US11266468B2 (en) | 2017-12-28 | 2022-03-08 | Cilag Gmbh International | Cooperative utilization of data derived from secondary sources by intelligent surgical hubs |
US11273001B2 (en) | 2017-12-28 | 2022-03-15 | Cilag Gmbh International | Surgical hub and modular device response adjustment based on situational awareness |
US11278281B2 (en) | 2017-12-28 | 2022-03-22 | Cilag Gmbh International | Interactive surgical system |
US11278280B2 (en) | 2018-03-28 | 2022-03-22 | Cilag Gmbh International | Surgical instrument comprising a jaw closure lockout |
US11284936B2 (en) | 2017-12-28 | 2022-03-29 | Cilag Gmbh International | Surgical instrument having a flexible electrode |
US11291495B2 (en) | 2017-12-28 | 2022-04-05 | Cilag Gmbh International | Interruption of energy due to inadvertent capacitive coupling |
US11291510B2 (en) | 2017-10-30 | 2022-04-05 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US11298148B2 (en) | 2018-03-08 | 2022-04-12 | Cilag Gmbh International | Live time tissue classification using electrical parameters |
US11304720B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Activation of energy devices |
US11304763B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use |
US11304699B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Method for adaptive control schemes for surgical network control and interaction |
US11308075B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity |
US11304745B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Surgical evacuation sensing and display |
US11311306B2 (en) | 2017-12-28 | 2022-04-26 | Cilag Gmbh International | Surgical systems for detecting end effector tissue distribution irregularities |
US11311342B2 (en) | 2017-10-30 | 2022-04-26 | Cilag Gmbh International | Method for communicating with surgical instrument systems |
US11317937B2 (en) | 2018-03-08 | 2022-05-03 | Cilag Gmbh International | Determining the state of an ultrasonic end effector |
US11317919B2 (en) | 2017-10-30 | 2022-05-03 | Cilag Gmbh International | Clip applier comprising a clip crimping system |
US11317915B2 (en) | 2019-02-19 | 2022-05-03 | Cilag Gmbh International | Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers |
USD950728S1 (en) | 2019-06-25 | 2022-05-03 | Cilag Gmbh International | Surgical staple cartridge |
US11324557B2 (en) | 2017-12-28 | 2022-05-10 | Cilag Gmbh International | Surgical instrument with a sensing array |
USD952144S1 (en) | 2019-06-25 | 2022-05-17 | Cilag Gmbh International | Surgical staple cartridge retainer with firing system authentication key |
US11337746B2 (en) | 2018-03-08 | 2022-05-24 | Cilag Gmbh International | Smart blade and power pulsing |
US11357503B2 (en) | 2019-02-19 | 2022-06-14 | Cilag Gmbh International | Staple cartridge retainers with frangible retention features and methods of using same |
US11364075B2 (en) | 2017-12-28 | 2022-06-21 | Cilag Gmbh International | Radio frequency energy device for delivering combined electrical signals |
US11369377B2 (en) | 2019-02-19 | 2022-06-28 | Cilag Gmbh International | Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout |
US11376002B2 (en) | 2017-12-28 | 2022-07-05 | Cilag Gmbh International | Surgical instrument cartridge sensor assemblies |
US11389164B2 (en) | 2017-12-28 | 2022-07-19 | Cilag Gmbh International | Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices |
US11410259B2 (en) | 2017-12-28 | 2022-08-09 | Cilag Gmbh International | Adaptive control program updates for surgical devices |
US11419630B2 (en) | 2017-12-28 | 2022-08-23 | Cilag Gmbh International | Surgical system distributed processing |
US11424027B2 (en) | 2017-12-28 | 2022-08-23 | Cilag Gmbh International | Method for operating surgical instrument systems |
US11423007B2 (en) | 2017-12-28 | 2022-08-23 | Cilag Gmbh International | Adjustment of device control programs based on stratified contextual data in addition to the data |
US11419667B2 (en) | 2017-12-28 | 2022-08-23 | Cilag Gmbh International | Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location |
US11432885B2 (en) | 2017-12-28 | 2022-09-06 | Cilag Gmbh International | Sensing arrangements for robot-assisted surgical platforms |
USD964564S1 (en) | 2019-06-25 | 2022-09-20 | Cilag Gmbh International | Surgical staple cartridge retainer with a closure system authentication key |
US11446052B2 (en) | 2017-12-28 | 2022-09-20 | Cilag Gmbh International | Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue |
US20220301195A1 (en) * | 2020-05-12 | 2022-09-22 | Proprio, Inc. | Methods and systems for imaging a scene, such as a medical scene, and tracking objects within the scene |
US11464511B2 (en) | 2019-02-19 | 2022-10-11 | Cilag Gmbh International | Surgical staple cartridges with movable authentication key arrangements |
US11464559B2 (en) | 2017-12-28 | 2022-10-11 | Cilag Gmbh International | Estimating state of ultrasonic end effector and control system therefor |
US11464535B2 (en) | 2017-12-28 | 2022-10-11 | Cilag Gmbh International | Detection of end effector emersion in liquid |
US11471156B2 (en) | 2018-03-28 | 2022-10-18 | Cilag Gmbh International | Surgical stapling devices with improved rotary driven closure systems |
US11504192B2 (en) | 2014-10-30 | 2022-11-22 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
CN115376675A (en) * | 2021-05-19 | 2022-11-22 | 博瑞生物医疗科技(深圳)有限公司 | Image acquisition control method and device, storage medium, processor and terminal equipment |
US11510741B2 (en) | 2017-10-30 | 2022-11-29 | Cilag Gmbh International | Method for producing a surgical instrument comprising a smart electrical system |
US11529187B2 (en) | 2017-12-28 | 2022-12-20 | Cilag Gmbh International | Surgical evacuation sensor arrangements |
US11540855B2 (en) | 2017-12-28 | 2023-01-03 | Cilag Gmbh International | Controlling activation of an ultrasonic surgical instrument according to the presence of tissue |
US11540700B2 (en) * | 2016-11-10 | 2023-01-03 | Sony Corporation | Medical supporting arm and medical system |
US11559308B2 (en) | 2017-12-28 | 2023-01-24 | Cilag Gmbh International | Method for smart energy device infrastructure |
US11559307B2 (en) | 2017-12-28 | 2023-01-24 | Cilag Gmbh International | Method of robotic hub communication, detection, and control |
US11564756B2 (en) | 2017-10-30 | 2023-01-31 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US11571234B2 (en) | 2017-12-28 | 2023-02-07 | Cilag Gmbh International | Temperature control of ultrasonic end effector and control system therefor |
US11576677B2 (en) | 2017-12-28 | 2023-02-14 | Cilag Gmbh International | Method of hub communication, processing, display, and cloud analytics |
US11589888B2 (en) | 2017-12-28 | 2023-02-28 | Cilag Gmbh International | Method for controlling smart energy devices |
US11589932B2 (en) | 2017-12-28 | 2023-02-28 | Cilag Gmbh International | Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures |
US11596291B2 (en) | 2017-12-28 | 2023-03-07 | Cilag Gmbh International | Method of compressing tissue within a stapling device and simultaneously displaying of the location of the tissue within the jaws |
US11602393B2 (en) | 2017-12-28 | 2023-03-14 | Cilag Gmbh International | Surgical evacuation sensing and generator control |
US11612444B2 (en) | 2017-12-28 | 2023-03-28 | Cilag Gmbh International | Adjustment of a surgical device function based on situational awareness |
US11659023B2 (en) | 2017-12-28 | 2023-05-23 | Cilag Gmbh International | Method of hub communication |
US11666331B2 (en) | 2017-12-28 | 2023-06-06 | Cilag Gmbh International | Systems for detecting proximity of surgical end effector to cancerous tissue |
US11701087B2 (en) | 2016-11-14 | 2023-07-18 | Amcad Biomed Corporation | Method for head and neck assessment or intervention |
US11744604B2 (en) | 2017-12-28 | 2023-09-05 | Cilag Gmbh International | Surgical instrument with a hardware-only control circuit |
US20230298206A1 (en) * | 2022-03-15 | 2023-09-21 | Carl Zeiss Meditec Ag | Method for determining the three-dimensional positions of points in a target region on a patient in a reference coordinate system of a surgical visualization system and surgical visualization system |
US11771487B2 (en) | 2017-12-28 | 2023-10-03 | Cilag Gmbh International | Mechanisms for controlling different electromechanical systems of an electrosurgical instrument |
US11786245B2 (en) | 2017-12-28 | 2023-10-17 | Cilag Gmbh International | Surgical systems with prioritized data transmission capabilities |
US11786251B2 (en) | 2017-12-28 | 2023-10-17 | Cilag Gmbh International | Method for adaptive control schemes for surgical network control and interaction |
US11801098B2 (en) | 2017-10-30 | 2023-10-31 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US20230351636A1 (en) * | 2022-04-29 | 2023-11-02 | 3Dintegrated Aps | Online stereo calibration |
US11818052B2 (en) | 2017-12-28 | 2023-11-14 | Cilag Gmbh International | Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs |
US11832899B2 (en) | 2017-12-28 | 2023-12-05 | Cilag Gmbh International | Surgical systems with autonomously adjustable control programs |
US11832840B2 (en) | 2017-12-28 | 2023-12-05 | Cilag Gmbh International | Surgical instrument having a flexible circuit |
US11857152B2 (en) | 2017-12-28 | 2024-01-02 | Cilag Gmbh International | Surgical hub spatial awareness to determine devices in operating theater |
US11864728B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Characterization of tissue irregularities through the use of mono-chromatic light refractivity |
US11871901B2 (en) | 2012-05-20 | 2024-01-16 | Cilag Gmbh International | Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage |
US11896322B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub |
US11896443B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Control of a surgical system through a surgical barrier |
US11903601B2 (en) | 2017-12-28 | 2024-02-20 | Cilag Gmbh International | Surgical instrument comprising a plurality of drive systems |
US11911045B2 (en) | 2017-10-30 | 2024-02-27 | Cilag GmbH International | Method for operating a powered articulating multi-clip applier |
US11937769B2 (en) | 2017-12-28 | 2024-03-26 | Cilag Gmbh International | Method of hub communication, processing, storage and display |
US11969216B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
US11998193B2 (en) | 2017-12-28 | 2024-06-04 | Cilag Gmbh International | Method for usage of the shroud as an aspect of sensing or controlling a powered surgical device, and a control algorithm to adjust its default operation |
US12029506B2 (en) | 2017-12-28 | 2024-07-09 | Cilag Gmbh International | Method of cloud based data analytics for use with the hub |
US12035890B2 (en) | 2017-12-28 | 2024-07-16 | Cilag Gmbh International | Method of sensing particulate from smoke evacuated from a patient, adjusting the pump speed based on the sensed information, and communicating the functional parameters of the system to the hub |
US12062442B2 (en) | 2017-12-28 | 2024-08-13 | Cilag Gmbh International | Method for operating surgical instrument systems |
US12126916B2 (en) | 2018-09-27 | 2024-10-22 | Proprio, Inc. | Camera array for a mediated-reality system |
US12127729B2 (en) | 2017-12-28 | 2024-10-29 | Cilag Gmbh International | Method for smoke evacuation for surgical hub |
US12133773B2 (en) | 2017-12-28 | 2024-11-05 | Cilag Gmbh International | Surgical hub and modular device response adjustment based on situational awareness |
US12178403B2 (en) | 2016-11-24 | 2024-12-31 | University Of Washington | Light field capture and rendering for head-mounted displays |
US12226151B2 (en) | 2017-12-28 | 2025-02-18 | Cilag Gmbh International | Capacitive coupled return path pad with separable array elements |
US12261988B2 (en) | 2021-11-08 | 2025-03-25 | Proprio, Inc. | Methods for generating stereoscopic views in multicamera systems, and associated devices and systems |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018012080A1 (en) | 2016-07-12 | 2018-01-18 | Sony Corporation | Image processing device, image processing method, program, and surgery navigation system |
JP6216863B1 (en) * | 2016-11-11 | 2017-10-18 | AmCad Biomed Corporation | Positioning device for head or neck evaluation or intervention |
TWI730242B (en) * | 2017-12-27 | 2021-06-11 | 醫百科技股份有限公司 | Surgical instrument positioning system and positioning method thereof |
DE102018206406B3 (en) * | 2018-04-25 | 2019-09-12 | Carl Zeiss Meditec Ag | Microscopy system and method for operating a microscopy system |
US10983604B2 (en) | 2018-05-16 | 2021-04-20 | Alcon Inc. | Foot controlled cursor |
US20190354200A1 (en) * | 2018-05-16 | 2019-11-21 | Alcon Inc. | Virtual foot pedal |
WO2019220555A1 (en) * | 2018-05-16 | 2019-11-21 | 株式会社島津製作所 | Imaging device |
US11298186B2 (en) | 2018-08-02 | 2022-04-12 | Point Robotics Medtech Inc. | Surgery assistive system and method for obtaining surface information thereof |
JP2021040987A (en) * | 2019-09-12 | 2021-03-18 | Sony Corporation | Medical support arm and medical system |
US11461929B2 (en) * | 2019-11-28 | 2022-10-04 | Shanghai United Imaging Intelligence Co., Ltd. | Systems and methods for automated calibration |
JP6901160B2 (en) * | 2019-12-05 | 2021-07-14 | Point Robotics Medtech Inc. | Surgical support system and method for obtaining surface information thereof |
CN110897717B (en) | 2019-12-09 | 2021-06-18 | 苏州微创畅行机器人有限公司 | Navigation operation system, registration method thereof and electronic equipment |
KR102315803B1 (en) * | 2019-12-16 | 2021-10-21 | 3D MediVision Co., Ltd. | Supporter for medical camera |
EP3848899B1 (en) * | 2020-01-09 | 2023-01-25 | Stryker European Operations Limited | Technique of determining a pose of a surgical registration device |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4101951B2 (en) * | 1998-11-10 | 2008-06-18 | オリンパス株式会社 | Surgical microscope |
JP4674948B2 (en) | 2000-09-29 | 2011-04-20 | オリンパス株式会社 | Surgical navigation device and method of operating surgical navigation device |
GB2428110A (en) * | 2005-07-06 | 2007-01-17 | Armstrong Healthcare Ltd | A robot and method of registering a robot. |
WO2008058520A2 (en) * | 2006-11-13 | 2008-05-22 | Eberhard-Karls-Universität Universitätsklinikum Tübingen | Apparatus for supplying images to an operator |
JP2008210140A (en) * | 2007-02-26 | 2008-09-11 | Sony Corp | Information extraction method, registration device, collation device, and program |
DE102007055203A1 (en) * | 2007-11-19 | 2009-05-20 | Kuka Roboter Gmbh | A robotic device, medical workstation and method for registering an object |
BR112014005451B1 (en) * | 2011-09-13 | 2021-11-09 | Koninklijke Philips N.V. | REGISTRATION SYSTEM |
EP3679881A1 (en) * | 2012-08-14 | 2020-07-15 | Intuitive Surgical Operations, Inc. | Systems and methods for registration of multiple vision systems |
CN106061427B (en) * | 2014-02-28 | 2020-10-27 | Sony Corporation | Robot arm apparatus, robot arm control method, and program |
US10070931B2 (en) * | 2014-03-17 | 2018-09-11 | Intuitive Surgical Operations, Inc. | System and method for maintaining a tool pose |
- 2015
- 2015-12-25 JP JP2015252869A patent/JP6657933B2/en not_active Expired - Fee Related
- 2016
- 2016-11-18 WO PCT/JP2016/084354 patent/WO2017110333A1/en active Application Filing
- 2016-11-18 US US15/761,507 patent/US20180263710A1/en not_active Abandoned
- 2016-11-18 EP EP16813041.7A patent/EP3393385A1/en not_active Ceased
- 2016-11-18 CN CN201680073878.5A patent/CN108366833B/en not_active Expired - Fee Related
Cited By (254)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11871901B2 (en) | 2012-05-20 | 2024-01-16 | Cilag Gmbh International | Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage |
US11504192B2 (en) | 2014-10-30 | 2022-11-22 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US10619994B2 (en) * | 2016-04-27 | 2020-04-14 | Keyence Corporation | Three-dimensional coordinate measuring device |
US20170314911A1 (en) * | 2016-04-27 | 2017-11-02 | Keyence Corporation | Three-Dimensional Coordinate Measuring Device |
US11540700B2 (en) * | 2016-11-10 | 2023-01-03 | Sony Corporation | Medical supporting arm and medical system |
US20180133085A1 (en) * | 2016-11-14 | 2018-05-17 | Amcad Biomed Corporation | Positioning apparatus for head and neck assessment or intervention |
US11701087B2 (en) | 2016-11-14 | 2023-07-18 | Amcad Biomed Corporation | Method for head and neck assessment or intervention |
US11226477B2 (en) * | 2016-11-16 | 2022-01-18 | Carl Zeiss Meditec Ag | Method for presenting images of a digital surgical microscope and digital surgical microscope system |
US12178403B2 (en) | 2016-11-24 | 2024-12-31 | University Of Washington | Light field capture and rendering for head-mounted displays |
US11564756B2 (en) | 2017-10-30 | 2023-01-31 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US11026712B2 (en) | 2017-10-30 | 2021-06-08 | Cilag Gmbh International | Surgical instruments comprising a shifting mechanism |
US10772651B2 (en) | 2017-10-30 | 2020-09-15 | Ethicon Llc | Surgical instruments comprising a system for articulation and rotation compensation |
US11819231B2 (en) | 2017-10-30 | 2023-11-21 | Cilag Gmbh International | Adaptive control programs for a surgical system comprising more than one type of cartridge |
US11801098B2 (en) | 2017-10-30 | 2023-10-31 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US11793537B2 (en) | 2017-10-30 | 2023-10-24 | Cilag Gmbh International | Surgical instrument comprising an adaptive electrical system |
US11759224B2 (en) | 2017-10-30 | 2023-09-19 | Cilag Gmbh International | Surgical instrument systems comprising handle arrangements |
US11925373B2 (en) | 2017-10-30 | 2024-03-12 | Cilag Gmbh International | Surgical suturing instrument comprising a non-circular needle |
US11696778B2 (en) | 2017-10-30 | 2023-07-11 | Cilag Gmbh International | Surgical dissectors configured to apply mechanical and electrical energy |
US10932806B2 (en) | 2017-10-30 | 2021-03-02 | Ethicon Llc | Reactive algorithm for surgical system |
US11648022B2 (en) | 2017-10-30 | 2023-05-16 | Cilag Gmbh International | Surgical instrument systems comprising battery arrangements |
US11602366B2 (en) | 2017-10-30 | 2023-03-14 | Cilag Gmbh International | Surgical suturing instrument configured to manipulate tissue using mechanical and electrical power |
US10959744B2 (en) | 2017-10-30 | 2021-03-30 | Ethicon Llc | Surgical dissectors and manufacturing techniques |
US11109878B2 (en) | 2017-10-30 | 2021-09-07 | Cilag Gmbh International | Surgical clip applier comprising an automatic clip feeding system |
US11564703B2 (en) | 2017-10-30 | 2023-01-31 | Cilag Gmbh International | Surgical suturing instrument comprising a capture width which is larger than trocar diameter |
US10980560B2 (en) | 2017-10-30 | 2021-04-20 | Ethicon Llc | Surgical instrument systems comprising feedback mechanisms |
US12035983B2 (en) | 2017-10-30 | 2024-07-16 | Cilag Gmbh International | Method for producing a surgical instrument comprising a smart electrical system |
US11510741B2 (en) | 2017-10-30 | 2022-11-29 | Cilag Gmbh International | Method for producing a surgical instrument comprising a smart electrical system |
US11026713B2 (en) | 2017-10-30 | 2021-06-08 | Cilag Gmbh International | Surgical clip applier configured to store clips in a stored state |
US11911045B2 (en) | 2017-10-30 | 2024-02-27 | Cilag GmbH International | Method for operating a powered articulating multi-clip applier |
US12059218B2 (en) | 2017-10-30 | 2024-08-13 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US11026687B2 (en) | 2017-10-30 | 2021-06-08 | Cilag Gmbh International | Clip applier comprising clip advancing systems |
US11045197B2 (en) | 2017-10-30 | 2021-06-29 | Cilag Gmbh International | Clip applier comprising a movable clip magazine |
US11413042B2 (en) | 2017-10-30 | 2022-08-16 | Cilag Gmbh International | Clip applier comprising a reciprocating clip advancing member |
US11406390B2 (en) | 2017-10-30 | 2022-08-09 | Cilag Gmbh International | Clip applier comprising interchangeable clip reloads |
US11317919B2 (en) | 2017-10-30 | 2022-05-03 | Cilag Gmbh International | Clip applier comprising a clip crimping system |
US11051836B2 (en) | 2017-10-30 | 2021-07-06 | Cilag Gmbh International | Surgical clip applier comprising an empty clip cartridge lockout |
US11311342B2 (en) | 2017-10-30 | 2022-04-26 | Cilag Gmbh International | Method for communicating with surgical instrument systems |
US11291510B2 (en) | 2017-10-30 | 2022-04-05 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US11071560B2 (en) | 2017-10-30 | 2021-07-27 | Cilag Gmbh International | Surgical clip applier comprising adaptive control in response to a strain gauge circuit |
US11291465B2 (en) | 2017-10-30 | 2022-04-05 | Cilag Gmbh International | Surgical instruments comprising a lockable end effector socket |
US11229436B2 (en) | 2017-10-30 | 2022-01-25 | Cilag Gmbh International | Surgical system comprising a surgical tool and a surgical hub |
US12121255B2 (en) | 2017-10-30 | 2024-10-22 | Cilag Gmbh International | Electrical power output control based on mechanical forces |
US11207090B2 (en) | 2017-10-30 | 2021-12-28 | Cilag Gmbh International | Surgical instruments comprising a biased shifting mechanism |
US11141160B2 (en) | 2017-10-30 | 2021-10-12 | Cilag Gmbh International | Clip applier comprising a motor controller |
US11103268B2 (en) | 2017-10-30 | 2021-08-31 | Cilag Gmbh International | Surgical clip applier comprising adaptive firing control |
US11129636B2 (en) | 2017-10-30 | 2021-09-28 | Cilag Gmbh International | Surgical instruments comprising an articulation drive that provides for high articulation angles |
US11123070B2 (en) | 2017-10-30 | 2021-09-21 | Cilag Gmbh International | Clip applier comprising a rotatable clip magazine |
US11832899B2 (en) | 2017-12-28 | 2023-12-05 | Cilag Gmbh International | Surgical systems with autonomously adjustable control programs |
US11903587B2 (en) | 2017-12-28 | 2024-02-20 | Cilag Gmbh International | Adjustment to the surgical stapling control based on situational awareness |
US11132462B2 (en) | 2017-12-28 | 2021-09-28 | Cilag Gmbh International | Data stripping method to interrogate patient records and create anonymized record |
US11114195B2 (en) | 2017-12-28 | 2021-09-07 | Cilag Gmbh International | Surgical instrument with a tissue marking assembly |
US12256995B2 (en) | 2017-12-28 | 2025-03-25 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
US11100631B2 (en) | 2017-12-28 | 2021-08-24 | Cilag Gmbh International | Use of laser light and red-green-blue coloration to determine properties of back scattered light |
US11147607B2 (en) | 2017-12-28 | 2021-10-19 | Cilag Gmbh International | Bipolar combination device that automatically adjusts pressure based on energy modality |
US11160605B2 (en) | 2017-12-28 | 2021-11-02 | Cilag Gmbh International | Surgical evacuation sensing and motor control |
US11166772B2 (en) | 2017-12-28 | 2021-11-09 | Cilag Gmbh International | Surgical hub coordination of control and communication of operating room devices |
US12239320B2 (en) | 2017-12-28 | 2025-03-04 | Cilag Gmbh International | Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices |
US11179204B2 (en) | 2017-12-28 | 2021-11-23 | Cilag Gmbh International | Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices |
US11179208B2 (en) | 2017-12-28 | 2021-11-23 | Cilag Gmbh International | Cloud-based medical analytics for security and authentication trends and reactive measures |
US11179175B2 (en) | 2017-12-28 | 2021-11-23 | Cilag Gmbh International | Controlling an ultrasonic surgical instrument according to tissue location |
US12232729B2 (en) | 2017-12-28 | 2025-02-25 | Cilag Gmbh International | Systems for detecting proximity of surgical end effector to cancerous tissue |
US11202570B2 (en) | 2017-12-28 | 2021-12-21 | Cilag Gmbh International | Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems |
US12226166B2 (en) | 2017-12-28 | 2025-02-18 | Cilag Gmbh International | Surgical instrument with a sensing array |
US12226151B2 (en) | 2017-12-28 | 2025-02-18 | Cilag Gmbh International | Capacitive coupled return path pad with separable array elements |
US11213359B2 (en) | 2017-12-28 | 2022-01-04 | Cilag Gmbh International | Controllers for robot-assisted surgical platforms |
US12207817B2 (en) | 2017-12-28 | 2025-01-28 | Cilag Gmbh International | Safety systems for smart powered surgical stapling |
US12193636B2 (en) | 2017-12-28 | 2025-01-14 | Cilag Gmbh International | Characterization of tissue irregularities through the use of mono-chromatic light refractivity |
US11096693B2 (en) | 2017-12-28 | 2021-08-24 | Cilag Gmbh International | Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing |
US12193766B2 (en) | 2017-12-28 | 2025-01-14 | Cilag Gmbh International | Situationally aware surgical system configured for use during a surgical procedure |
US11234756B2 (en) | 2017-12-28 | 2022-02-01 | Cilag Gmbh International | Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter |
US11253315B2 (en) | 2017-12-28 | 2022-02-22 | Cilag Gmbh International | Increasing radio frequency to create pad-less monopolar loop |
US11257589B2 (en) | 2017-12-28 | 2022-02-22 | Cilag Gmbh International | Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes |
US10595887B2 (en) | 2017-12-28 | 2020-03-24 | Ethicon Llc | Systems for adjusting end effector parameters based on perioperative information |
US12144518B2 (en) | 2017-12-28 | 2024-11-19 | Cilag Gmbh International | Surgical systems for detecting end effector tissue distribution irregularities |
US12137991B2 (en) | 2017-12-28 | 2024-11-12 | Cilag Gmbh International | Display arrangements for robot-assisted surgical platforms |
US11266468B2 (en) | 2017-12-28 | 2022-03-08 | Cilag Gmbh International | Cooperative utilization of data derived from secondary sources by intelligent surgical hubs |
US12133660B2 (en) | 2017-12-28 | 2024-11-05 | Cilag Gmbh International | Controlling a temperature of an ultrasonic electromechanical blade according to frequency |
US11273001B2 (en) | 2017-12-28 | 2022-03-15 | Cilag Gmbh International | Surgical hub and modular device response adjustment based on situational awareness |
US11278281B2 (en) | 2017-12-28 | 2022-03-22 | Cilag Gmbh International | Interactive surgical system |
US12133773B2 (en) | 2017-12-28 | 2024-11-05 | Cilag Gmbh International | Surgical hub and modular device response adjustment based on situational awareness |
US11284936B2 (en) | 2017-12-28 | 2022-03-29 | Cilag Gmbh International | Surgical instrument having a flexible electrode |
US12133709B2 (en) | 2017-12-28 | 2024-11-05 | Cilag Gmbh International | Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems |
US11076921B2 (en) | 2017-12-28 | 2021-08-03 | Cilag Gmbh International | Adaptive control program updates for surgical hubs |
US12127729B2 (en) | 2017-12-28 | 2024-10-29 | Cilag Gmbh International | Method for smoke evacuation for surgical hub |
US11291495B2 (en) | 2017-12-28 | 2022-04-05 | Cilag Gmbh International | Interruption of energy due to inadvertent capacitive coupling |
US11069012B2 (en) | 2017-12-28 | 2021-07-20 | Cilag Gmbh International | Interactive surgical systems with condition handling of devices and data capabilities |
US10695081B2 (en) | 2017-12-28 | 2020-06-30 | Ethicon Llc | Controlling a surgical instrument according to sensed closure parameters |
US12096916B2 (en) | 2017-12-28 | 2024-09-24 | Cilag Gmbh International | Method of sensing particulate from smoke evacuated from a patient, adjusting the pump speed based on the sensed information, and communicating the functional parameters of the system to the hub |
US12096985B2 (en) | 2017-12-28 | 2024-09-24 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
US11304720B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Activation of energy devices |
US11304763B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use |
US11304699B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Method for adaptive control schemes for surgical network control and interaction |
US11308075B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity |
US11304745B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Surgical evacuation sensing and display |
US11311306B2 (en) | 2017-12-28 | 2022-04-26 | Cilag Gmbh International | Surgical systems for detecting end effector tissue distribution irregularities |
US11058498B2 (en) | 2017-12-28 | 2021-07-13 | Cilag Gmbh International | Cooperative surgical actions for robot-assisted surgical platforms |
US12076010B2 (en) | 2017-12-28 | 2024-09-03 | Cilag Gmbh International | Surgical instrument cartridge sensor assemblies |
US11051876B2 (en) | 2017-12-28 | 2021-07-06 | Cilag Gmbh International | Surgical evacuation flow paths |
US12059169B2 (en) | 2017-12-28 | 2024-08-13 | Cilag Gmbh International | Controlling an ultrasonic surgical instrument according to tissue location |
US12059124B2 (en) | 2017-12-28 | 2024-08-13 | Cilag Gmbh International | Surgical hub spatial awareness to determine devices in operating theater |
US11324557B2 (en) | 2017-12-28 | 2022-05-10 | Cilag Gmbh International | Surgical instrument with a sensing array |
US12062442B2 (en) | 2017-12-28 | 2024-08-13 | Cilag Gmbh International | Method for operating surgical instrument systems |
US12053159B2 (en) | 2017-12-28 | 2024-08-06 | Cilag Gmbh International | Method of sensing particulate from smoke evacuated from a patient, adjusting the pump speed based on the sensed information, and communicating the functional parameters of the system to the hub |
US12048496B2 (en) | 2017-12-28 | 2024-07-30 | Cilag Gmbh International | Adaptive control program updates for surgical hubs |
US12042207B2 (en) | 2017-12-28 | 2024-07-23 | Cilag Gmbh International | Estimating state of ultrasonic end effector and control system therefor |
US12035890B2 (en) | 2017-12-28 | 2024-07-16 | Cilag Gmbh International | Method of sensing particulate from smoke evacuated from a patient, adjusting the pump speed based on the sensed information, and communicating the functional parameters of the system to the hub |
US10755813B2 (en) | 2017-12-28 | 2020-08-25 | Ethicon Llc | Communication of smoke evacuation system parameters to hub or cloud in smoke evacuation module for interactive surgical platform |
US11364075B2 (en) | 2017-12-28 | 2022-06-21 | Cilag Gmbh International | Radio frequency energy device for delivering combined electrical signals |
US12029506B2 (en) | 2017-12-28 | 2024-07-09 | Cilag Gmbh International | Method of cloud based data analytics for use with the hub |
US11376002B2 (en) | 2017-12-28 | 2022-07-05 | Cilag Gmbh International | Surgical instrument cartridge sensor assemblies |
US11382697B2 (en) | 2017-12-28 | 2022-07-12 | Cilag Gmbh International | Surgical instruments comprising button circuits |
US12009095B2 (en) | 2017-12-28 | 2024-06-11 | Cilag Gmbh International | Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes |
US11389164B2 (en) | 2017-12-28 | 2022-07-19 | Cilag Gmbh International | Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices |
US11998193B2 (en) | 2017-12-28 | 2024-06-04 | Cilag Gmbh International | Method for usage of the shroud as an aspect of sensing or controlling a powered surgical device, and a control algorithm to adjust its default operation |
US11969216B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
US11056244B2 (en) | 2017-12-28 | 2021-07-06 | Cilag Gmbh International | Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks |
US11410259B2 (en) | 2017-12-28 | 2022-08-09 | Cilag Gmbh International | Adaptive control program updates for surgical devices |
US11045591B2 (en) | 2017-12-28 | 2021-06-29 | Cilag Gmbh International | Dual in-series large and small droplet filters |
US11419630B2 (en) | 2017-12-28 | 2022-08-23 | Cilag Gmbh International | Surgical system distributed processing |
US11424027B2 (en) | 2017-12-28 | 2022-08-23 | Cilag Gmbh International | Method for operating surgical instrument systems |
US11423007B2 (en) | 2017-12-28 | 2022-08-23 | Cilag Gmbh International | Adjustment of device control programs based on stratified contextual data in addition to the data |
US11419667B2 (en) | 2017-12-28 | 2022-08-23 | Cilag Gmbh International | Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location |
US11432885B2 (en) | 2017-12-28 | 2022-09-06 | Cilag Gmbh International | Sensing arrangements for robot-assisted surgical platforms |
US11969142B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws |
US11446052B2 (en) | 2017-12-28 | 2022-09-20 | Cilag Gmbh International | Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue |
US11937769B2 (en) | 2017-12-28 | 2024-03-26 | Cilag Gmbh International | Method of hub communication, processing, storage and display |
US11931110B2 (en) | 2017-12-28 | 2024-03-19 | Cilag Gmbh International | Surgical instrument comprising a control system that uses input from a strain gage circuit |
US10758310B2 (en) | 2017-12-28 | 2020-09-01 | Ethicon Llc | Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices |
US11464559B2 (en) | 2017-12-28 | 2022-10-11 | Cilag Gmbh International | Estimating state of ultrasonic end effector and control system therefor |
US11464535B2 (en) | 2017-12-28 | 2022-10-11 | Cilag Gmbh International | Detection of end effector emersion in liquid |
US11918302B2 (en) | 2017-12-28 | 2024-03-05 | Cilag Gmbh International | Sterile field interactive control displays |
US11903601B2 (en) | 2017-12-28 | 2024-02-20 | Cilag Gmbh International | Surgical instrument comprising a plurality of drive systems |
US11026751B2 (en) | 2017-12-28 | 2021-06-08 | Cilag Gmbh International | Display of alignment of staple cartridge to prior linear staple line |
US11109866B2 (en) | 2017-12-28 | 2021-09-07 | Cilag Gmbh International | Method for circular stapler control algorithm adjustment based on situational awareness |
US11013563B2 (en) | 2017-12-28 | 2021-05-25 | Ethicon Llc | Drive arrangements for robot-assisted surgical platforms |
US11896443B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Control of a surgical system through a surgical barrier |
US11529187B2 (en) | 2017-12-28 | 2022-12-20 | Cilag Gmbh International | Surgical evacuation sensor arrangements |
US11896322B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub |
US11540855B2 (en) | 2017-12-28 | 2023-01-03 | Cilag Gmbh International | Controlling activation of an ultrasonic surgical instrument according to the presence of tissue |
US10987178B2 (en) | 2017-12-28 | 2021-04-27 | Ethicon Llc | Surgical hub control arrangements |
US11559308B2 (en) | 2017-12-28 | 2023-01-24 | Cilag Gmbh International | Method for smart energy device infrastructure |
US11559307B2 (en) | 2017-12-28 | 2023-01-24 | Cilag Gmbh International | Method of robotic hub communication, detection, and control |
US11890065B2 (en) | 2017-12-28 | 2024-02-06 | Cilag Gmbh International | Surgical system to limit displacement |
US10966791B2 (en) | 2017-12-28 | 2021-04-06 | Ethicon Llc | Cloud-based medical analytics for medical facility segmented individualization of instrument function |
US11571234B2 (en) | 2017-12-28 | 2023-02-07 | Cilag Gmbh International | Temperature control of ultrasonic end effector and control system therefor |
US11576677B2 (en) | 2017-12-28 | 2023-02-14 | Cilag Gmbh International | Method of hub communication, processing, display, and cloud analytics |
US10849697B2 (en) | 2017-12-28 | 2020-12-01 | Ethicon Llc | Cloud interface for coupled surgical devices |
US11589888B2 (en) | 2017-12-28 | 2023-02-28 | Cilag Gmbh International | Method for controlling smart energy devices |
US11589932B2 (en) | 2017-12-28 | 2023-02-28 | Cilag Gmbh International | Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures |
US11864728B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Characterization of tissue irregularities through the use of mono-chromatic light refractivity |
US11596291B2 (en) | 2017-12-28 | 2023-03-07 | Cilag Gmbh International | Method of compressing tissue within a stapling device and simultaneously displaying of the location of the tissue within the jaws |
US11601371B2 (en) | 2017-12-28 | 2023-03-07 | Cilag Gmbh International | Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs |
US11602393B2 (en) | 2017-12-28 | 2023-03-14 | Cilag Gmbh International | Surgical evacuation sensing and generator control |
US10943454B2 (en) | 2017-12-28 | 2021-03-09 | Ethicon Llc | Detection and escalation of security responses of surgical instruments to increasing severity threats |
US11612408B2 (en) | 2017-12-28 | 2023-03-28 | Cilag Gmbh International | Determining tissue composition via an ultrasonic system |
US11612444B2 (en) | 2017-12-28 | 2023-03-28 | Cilag Gmbh International | Adjustment of a surgical device function based on situational awareness |
US11864845B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Sterile field interactive control displays |
US11633237B2 (en) | 2017-12-28 | 2023-04-25 | Cilag Gmbh International | Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures |
US10944728B2 (en) | 2017-12-28 | 2021-03-09 | Ethicon Llc | Interactive surgical systems with encrypted communication capabilities |
US11659023B2 (en) | 2017-12-28 | 2023-05-23 | Cilag Gmbh International | Method of hub communication |
US11666331B2 (en) | 2017-12-28 | 2023-06-06 | Cilag Gmbh International | Systems for detecting proximity of surgical end effector to cancerous tissue |
US11672605B2 (en) | 2017-12-28 | 2023-06-13 | Cilag Gmbh International | Sterile field interactive control displays |
US11857152B2 (en) | 2017-12-28 | 2024-01-02 | Cilag Gmbh International | Surgical hub spatial awareness to determine devices in operating theater |
US11844579B2 (en) | 2017-12-28 | 2023-12-19 | Cilag Gmbh International | Adjustments based on airborne particle properties |
US11678881B2 (en) | 2017-12-28 | 2023-06-20 | Cilag Gmbh International | Spatial awareness of surgical hubs in operating rooms |
US11696760B2 (en) | 2017-12-28 | 2023-07-11 | Cilag Gmbh International | Safety systems for smart powered surgical stapling |
US10932872B2 (en) | 2017-12-28 | 2021-03-02 | Ethicon Llc | Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set |
US11701185B2 (en) | 2017-12-28 | 2023-07-18 | Cilag Gmbh International | Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices |
US11832840B2 (en) | 2017-12-28 | 2023-12-05 | Cilag Gmbh International | Surgical instrument having a flexible circuit |
US10898622B2 (en) | 2017-12-28 | 2021-01-26 | Ethicon Llc | Surgical evacuation system with a communication circuit for communication between a filter and a smoke evacuation device |
US11818052B2 (en) | 2017-12-28 | 2023-11-14 | Cilag Gmbh International | Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs |
US10892995B2 (en) | 2017-12-28 | 2021-01-12 | Ethicon Llc | Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs |
US11712303B2 (en) | 2017-12-28 | 2023-08-01 | Cilag Gmbh International | Surgical instrument comprising a control circuit |
US11737668B2 (en) | 2017-12-28 | 2023-08-29 | Cilag Gmbh International | Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems |
US11744604B2 (en) | 2017-12-28 | 2023-09-05 | Cilag Gmbh International | Surgical instrument with a hardware-only control circuit |
US11786251B2 (en) | 2017-12-28 | 2023-10-17 | Cilag Gmbh International | Method for adaptive control schemes for surgical network control and interaction |
US11751958B2 (en) | 2017-12-28 | 2023-09-12 | Cilag Gmbh International | Surgical hub coordination of control and communication of operating room devices |
US10892899B2 (en) | 2017-12-28 | 2021-01-12 | Ethicon Llc | Self describing data packets generated at an issuing instrument |
US11786245B2 (en) | 2017-12-28 | 2023-10-17 | Cilag Gmbh International | Surgical systems with prioritized data transmission capabilities |
US11771487B2 (en) | 2017-12-28 | 2023-10-03 | Cilag Gmbh International | Mechanisms for controlling different electromechanical systems of an electrosurgical instrument |
US11775682B2 (en) | 2017-12-28 | 2023-10-03 | Cilag Gmbh International | Data stripping method to interrogate patient records and create anonymized record |
US11779337B2 (en) | 2017-12-28 | 2023-10-10 | Cilag Gmbh International | Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices |
US11678901B2 (en) | 2018-03-08 | 2023-06-20 | Cilag Gmbh International | Vessel sensing for adaptive advanced hemostasis |
US11844545B2 (en) | 2018-03-08 | 2023-12-19 | Cilag Gmbh International | Calcified vessel identification |
US11707293B2 (en) | 2018-03-08 | 2023-07-25 | Cilag Gmbh International | Ultrasonic sealing algorithm with temperature control |
US11399858B2 (en) | 2018-03-08 | 2022-08-02 | Cilag Gmbh International | Application of smart blade technology |
US11986233B2 (en) | 2018-03-08 | 2024-05-21 | Cilag Gmbh International | Adjustment of complex impedance to compensate for lost power in an articulating ultrasonic device |
US11701139B2 (en) | 2018-03-08 | 2023-07-18 | Cilag Gmbh International | Methods for controlling temperature in ultrasonic device |
US11337746B2 (en) | 2018-03-08 | 2022-05-24 | Cilag Gmbh International | Smart blade and power pulsing |
US11317937B2 (en) | 2018-03-08 | 2022-05-03 | Cilag Gmbh International | Determining the state of an ultrasonic end effector |
US11701162B2 (en) | 2018-03-08 | 2023-07-18 | Cilag Gmbh International | Smart blade application for reusable and disposable devices |
US11839396B2 (en) | 2018-03-08 | 2023-12-12 | Cilag Gmbh International | Fine dissection mode for tissue classification |
US11389188B2 (en) | 2018-03-08 | 2022-07-19 | Cilag Gmbh International | Start temperature of blade |
US11678927B2 (en) | 2018-03-08 | 2023-06-20 | Cilag Gmbh International | Detection of large vessels during parenchymal dissection using a smart blade |
US11464532B2 (en) | 2018-03-08 | 2022-10-11 | Cilag Gmbh International | Methods for estimating and controlling state of ultrasonic end effector |
US11617597B2 (en) | 2018-03-08 | 2023-04-04 | Cilag Gmbh International | Application of smart ultrasonic blade technology |
US11589915B2 (en) | 2018-03-08 | 2023-02-28 | Cilag Gmbh International | In-the-jaw classifier based on a model |
US11298148B2 (en) | 2018-03-08 | 2022-04-12 | Cilag Gmbh International | Live time tissue classification using electrical parameters |
US12121256B2 (en) | 2018-03-08 | 2024-10-22 | Cilag Gmbh International | Methods for controlling temperature in ultrasonic device |
US11534196B2 (en) | 2018-03-08 | 2022-12-27 | Cilag Gmbh International | Using spectroscopy to determine device use state in combo instrument |
US11344326B2 (en) | 2018-03-08 | 2022-05-31 | Cilag Gmbh International | Smart blade technology to control blade instability |
US11457944B2 (en) | 2018-03-08 | 2022-10-04 | Cilag Gmbh International | Adaptive advanced tissue treatment pad saver mode |
US11259830B2 (en) | 2018-03-08 | 2022-03-01 | Cilag Gmbh International | Methods for controlling temperature in ultrasonic device |
US11213294B2 (en) | 2018-03-28 | 2022-01-04 | Cilag Gmbh International | Surgical instrument comprising co-operating lockout features |
US11090047B2 (en) | 2018-03-28 | 2021-08-17 | Cilag Gmbh International | Surgical instrument comprising an adaptive control system |
US11471156B2 (en) | 2018-03-28 | 2022-10-18 | Cilag Gmbh International | Surgical stapling devices with improved rotary driven closure systems |
US11259806B2 (en) | 2018-03-28 | 2022-03-01 | Cilag Gmbh International | Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein |
US11931027B2 (en) | 2018-03-28 | 2024-03-19 | Cilag Gmbh International | Surgical instrument comprising an adaptive control system |
US11219453B2 (en) | 2018-03-28 | 2022-01-11 | Cilag Gmbh International | Surgical stapling devices with cartridge compatible closure and firing lockout arrangements |
US11937817B2 (en) | 2018-03-28 | 2024-03-26 | Cilag Gmbh International | Surgical instruments with asymmetric jaw arrangements and separate closure and firing systems |
US11278280B2 (en) | 2018-03-28 | 2022-03-22 | Cilag Gmbh International | Surgical instrument comprising a jaw closure lockout |
US10973520B2 (en) | 2018-03-28 | 2021-04-13 | Ethicon Llc | Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature |
US11589865B2 (en) | 2018-03-28 | 2023-02-28 | Cilag Gmbh International | Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems |
US11096688B2 (en) | 2018-03-28 | 2021-08-24 | Cilag Gmbh International | Rotary driven firing members with different anvil and channel engagement features |
US11406382B2 (en) | 2018-03-28 | 2022-08-09 | Cilag Gmbh International | Staple cartridge comprising a lockout key configured to lift a firing member |
US11207067B2 (en) | 2018-03-28 | 2021-12-28 | Cilag Gmbh International | Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing |
US11197668B2 (en) | 2018-03-28 | 2021-12-14 | Cilag Gmbh International | Surgical stapling assembly comprising a lockout and an exterior access orifice to permit artificial unlocking of the lockout |
US11986185B2 (en) | 2018-03-28 | 2024-05-21 | Cilag Gmbh International | Methods for controlling a surgical stapler |
US11166716B2 (en) | 2018-03-28 | 2021-11-09 | Cilag Gmbh International | Stapling instrument comprising a deactivatable lockout |
US11129611B2 (en) | 2018-03-28 | 2021-09-28 | Cilag Gmbh International | Surgical staplers with arrangements for maintaining a firing member thereof in a locked configuration unless a compatible cartridge has been installed therein |
US12126916B2 (en) | 2018-09-27 | 2024-10-22 | Proprio, Inc. | Camera array for a mediated-reality system |
US11517309B2 (en) | 2019-02-19 | 2022-12-06 | Cilag Gmbh International | Staple cartridge retainer with retractable authentication key |
US11291445B2 (en) | 2019-02-19 | 2022-04-05 | Cilag Gmbh International | Surgical staple cartridges with integral authentication keys |
US11925350B2 (en) | 2019-02-19 | 2024-03-12 | Cilag Gmbh International | Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge |
US11331100B2 (en) | 2019-02-19 | 2022-05-17 | Cilag Gmbh International | Staple cartridge retainer system with authentication keys |
US11751872B2 (en) | 2019-02-19 | 2023-09-12 | Cilag Gmbh International | Insertable deactivator element for surgical stapler lockouts |
US11331101B2 (en) | 2019-02-19 | 2022-05-17 | Cilag Gmbh International | Deactivator element for defeating surgical stapling device lockouts |
US11464511B2 (en) | 2019-02-19 | 2022-10-11 | Cilag Gmbh International | Surgical staple cartridges with movable authentication key arrangements |
US11272931B2 (en) | 2019-02-19 | 2022-03-15 | Cilag Gmbh International | Dual cam cartridge based feature for unlocking a surgical stapler lockout |
US11357503B2 (en) | 2019-02-19 | 2022-06-14 | Cilag Gmbh International | Staple cartridge retainers with frangible retention features and methods of using same |
US11298129B2 (en) | 2019-02-19 | 2022-04-12 | Cilag Gmbh International | Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge |
US11369377B2 (en) | 2019-02-19 | 2022-06-28 | Cilag Gmbh International | Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout |
US11298130B2 (en) | 2019-02-19 | 2022-04-12 | Cilag Gmbh International | Staple cartridge retainer with frangible authentication key |
US11317915B2 (en) | 2019-02-19 | 2022-05-03 | Cilag Gmbh International | Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers |
US11291444B2 (en) | 2019-02-19 | 2022-04-05 | Cilag Gmbh International | Surgical stapling assembly with cartridge based retainer configured to unlock a closure lockout |
US11259807B2 (en) | 2019-02-19 | 2022-03-01 | Cilag Gmbh International | Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device |
CN111999879A (en) * | 2019-05-27 | 2020-11-27 | 徕卡仪器(新加坡)有限公司 | Microscope system and method for controlling a surgical microscope |
US11963830B2 (en) | 2019-06-19 | 2024-04-23 | Karl Storz Se & Co. Kg | Medical handling device and method for controlling a handling device |
EP3753520A1 (en) * | 2019-06-19 | 2020-12-23 | Karl Storz SE & Co. KG | Medical handling device for controlling a handling device |
US11963728B2 (en) | 2019-06-19 | 2024-04-23 | Karl Storz Se & Co. Kg | Medical handling device and method for controlling a handling device |
EP3753519A1 (en) * | 2019-06-19 | 2020-12-23 | Karl Storz SE & Co. KG | Medical handling device |
EP3753521A1 (en) * | 2019-06-19 | 2020-12-23 | Karl Storz SE & Co. KG | Medical handling device for controlling a handling device |
US11981035B2 (en) | 2019-06-19 | 2024-05-14 | Karl Storz Se & Co. Kg | Medical handling device and method for controlling a handling device |
USD950728S1 (en) | 2019-06-25 | 2022-05-03 | Cilag Gmbh International | Surgical staple cartridge |
USD964564S1 (en) | 2019-06-25 | 2022-09-20 | Cilag Gmbh International | Surgical staple cartridge retainer with a closure system authentication key |
USD952144S1 (en) | 2019-06-25 | 2022-05-17 | Cilag Gmbh International | Surgical staple cartridge retainer with firing system authentication key |
CN111407406A (en) * | 2020-03-31 | 2020-07-14 | 武汉联影智融医疗科技有限公司 | Head position identification device, intraoperative control system and control method |
US20220301195A1 (en) * | 2020-05-12 | 2022-09-22 | Proprio, Inc. | Methods and systems for imaging a scene, such as a medical scene, and tracking objects within the scene |
US12051214B2 (en) * | 2020-05-12 | 2024-07-30 | Proprio, Inc. | Methods and systems for imaging a scene, such as a medical scene, and tracking objects within the scene |
CN115376675A (en) * | 2021-05-19 | 2022-11-22 | 博瑞生物医疗科技(深圳)有限公司 | Image acquisition control method and device, storage medium, processor and terminal equipment |
US12261988B2 (en) | 2021-11-08 | 2025-03-25 | Proprio, Inc. | Methods for generating stereoscopic views in multicamera systems, and associated devices and systems |
US20230298206A1 (en) * | 2022-03-15 | 2023-09-21 | Carl Zeiss Meditec Ag | Method for determining the three-dimensional positions of points in a target region on a patient in a reference coordinate system of a surgical visualization system and surgical visualization system |
US12254652B2 (en) * | 2022-03-15 | 2025-03-18 | Carl Zeiss Meditec Ag | Method for determining the three-dimensional positions of points in a target region on a patient in a reference coordinate system of a surgical visualization system and surgical visualization system |
US20230351636A1 (en) * | 2022-04-29 | 2023-11-02 | 3Dintegrated Aps | Online stereo calibration |
Also Published As
Publication number | Publication date |
---|---|
WO2017110333A1 (en) | 2017-06-29 |
JP2017113343A (en) | 2017-06-29 |
CN108366833B (en) | 2021-10-12 |
EP3393385A1 (en) | 2018-10-31 |
JP6657933B2 (en) | 2020-03-04 |
CN108366833A (en) | 2018-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180263710A1 (en) | | Medical imaging apparatus and surgical navigation system |
US11278369B2 (en) | | Control device, control method, and surgical system |
US11419696B2 (en) | | Control device, control method, and medical system |
US11969144B2 (en) | | Medical observation system, medical observation apparatus and medical observation method |
JP7115493B2 (en) | | Surgical arm system and surgical arm control system |
US20220192777A1 (en) | | Medical observation system, control device, and control method |
US11109927B2 (en) | | Joint driving actuator and medical system |
US11638000B2 (en) | | Medical observation apparatus |
WO2018088105A1 (en) | | Medical support arm and medical system |
US20230126611A1 (en) | | Information processing apparatus, information processing system, and information processing method |
US10992852B2 (en) | | Medical observation device and control method |
US20220354347A1 (en) | | Medical support arm and medical system |
EP3843608A2 (en) | | Medical observation system configured to generate three-dimensional information and to calculate an estimated region and a corresponding method |
US20220400938A1 (en) | | Medical observation system, control device, and control method |
US20200015655A1 (en) | | Medical observation apparatus and observation visual field correction method |
CN113015474A (en) | | System, method and computer program for verifying scene features |
US12136196B2 (en) | | Medical system, information processing device, and information processing method |
JP4716747B2 (en) | | Medical stereoscopic image observation device |
US20230026585A1 (en) | | Method and system for determining a pose of at least one object in an operating theatre |
WO2022172733A1 (en) | | Observation device for medical treatment, observation device, observation method and adapter |
US20240155241A1 (en) | | Medical observation system, information processing device, and information processing method |
WO2020050187A1 (en) | | Medical system, information processing device, and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: SONY CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAKAGUCHI, TATSUMI;KASAI, TAKARA;SIGNING DATES FROM 20180307 TO 20180308;REEL/FRAME:045285/0942 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |