US10885349B2 - Method and apparatus for image processing - Google Patents
- Publication number
- US10885349B2 (application US16/347,925)
- Authority
- US
- United States
- Prior art keywords
- movement
- movement signals
- image
- signals
- spatial dispersion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G06K9/00771—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/213—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
- G06F18/2135—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
-
- G06K9/6247—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/772—Determining representative reference patterns, e.g. averaging or distorting patterns; Generating dictionaries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20164—Salient point detection; Corner detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
Definitions
- the present invention relates to a method and apparatus for image processing, and in particular for processing video images.
- the automatic analysis and processing of video images is of interest in a wide variety of fields.
- proposals have been made for monitoring a subject with a video camera and analysing the video images to detect the vital signs (such as heart rate, breathing rate and blood oxygen saturation) of the subject.
- Such analysis may be based on the PPGi signal in the image as in WO-A2-2013/027027 or on detecting fine movement associated with breathing or heart beat or a combination of the two.
- the present invention provides a method of determining whether a video image of a scene contains movement of a subject within the scene by assuming that movement in the video image is generated from a mixture of an underlying set of “hidden” movement sources which cannot be directly inferred from the pixel values of the video, and by retrieving each of these movement sources together with a measure of its spatial extent across the video image—or “spatial dispersion”. This can be achieved by:
- a method of determining whether a video image of a scene contains movement of a subject within the scene comprising the steps of:
- the spatial dispersion measure therefore looks at whether the movement signals (i.e. the change with time in position in the image frame of detected image features) which are strong contributors to movement are spread out over the image frame (high spatial dispersion) or concentrated in one area (low spatial dispersion). High spatial dispersion is more suggestive of the presence of noise or image artefact, whereas low spatial dispersion is more indicative of subject movement.
- the step of analysing the movement signals to find related movement signals may comprise analysing the movement signals using a blind signal separation method such as principal component analysis, independent component analysis or clustering.
- the step of detecting movement of a plurality of image features through the sequence of image frames may comprise detecting a plurality of image features within each image frame and tracking the position of the plurality of image features through the sequence of image frames to form a corresponding plurality of track signals constituting said movement signals.
- the track signals may be the x and y coordinates of the features or the principal components of each feature's movement.
- the step of detecting movement of a plurality of image features through the sequence of image frames may comprise superpixel detection and tracking or dense optical flow analysis.
- the step of analysing the movement signals to find related movement signals may comprise finding analytically-similar movement signals, such as temporally-similar movement signals, that is to say signals with common movement frequency content; it may comprise finding the strength of the common components of the movement signals and clustering or grouping them in accordance with the movement frequency content.
- Related track signals may also be found by performing a standard clustering process such as K-means clustering, spectral clustering or matrix or hierarchical clustering.
- principal component analysis may be used to find related track signals.
- the method may comprise analysing the movement signals to find principal components of the plurality of movement signals and determining for each obtained principal component the score of each movement signal for that principal component.
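By way of illustration, the per-signal scores for the leading principal components might be computed as in the following Python sketch. The function name `movement_signal_scores`, the use of an SVD, and the default number of components are assumptions for illustration, not part of the patent:

```python
import numpy as np

def movement_signal_scores(tracks, n_components=3):
    """Absolute score (loading) of each of n track signals on each
    of the leading principal components of the whole signal set."""
    X = np.asarray(tracks, dtype=float)        # shape (n_signals, T)
    X = X - X.mean(axis=1, keepdims=True)      # centre each signal in time
    # SVD of the (T, n) data matrix treats each signal as one variable;
    # rows of Vt are the principal components expressed in signal space.
    _, _, Vt = np.linalg.svd(X.T, full_matrices=False)
    return np.abs(Vt[:n_components])           # shape (n_components, n_signals)
```

Signals that move together then score strongly on the same component, which is what the spatial dispersion step below exploits.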
- the step of calculating a spatial dispersion measure for the related movement signals may comprise for each of a plurality of the most significant of the obtained principal components calculating the spatial dispersion measure representing the spatial dispersion in the image of the movement signals with a strong score for that principal component.
- the step of comparing the lowest of the calculated spatial dispersion measures with a predetermined first threshold may comprise comparing the lowest of the spatial dispersion measures of the said plurality of principal components with a predetermined first threshold, and if it is lower than the predetermined first threshold determining that video image as containing subject movement.
- the spatial dispersion measure for each principal component may be calculated from the scores for each movement signal for that principal component and a distance in the video image between the movement signals contributing to that principal component.
- the spatial dispersion measure for a principal component may be calculated from a product of the scores for each movement signal for the principal component and a distance in the image between the movement signals.
- the spatial dispersion measure for each principal component may be calculated by, for a plurality of different pairs of the movement signals, calculating the product of the scores from each of the pair of movement signals for that principal component and a distance in the video image between the pair of movement signals.
- the spatial dispersion measure for each principal component may be calculated by calculating the sum of all the products of the scores from each of the pair of movement signals for that principal component and a distance in the video image between the pair of movement signals, and dividing the sum by the sum of all the scores for each of the pair of movement signals for that principal component.
- the distances in the image between the movement signals are preferably one of: the average over the sequence of image frames of the distance between the image features forming the movement signals, the distance in a predetermined frame of the sequence between the image features forming the movement signals.
- the predetermined frame may be one of: the last frame, the middle frame, the first frame.
- the step of analysing the movement signals to find related movement signals comprises performing clustering on the movement signals and the spatial dispersion measure is calculated for each cluster.
- the method of the invention may further comprise the step of conditioning the movement signals before the step of analysing the movement signals to find related movement signals.
- the step of conditioning the movement signals may comprise at least one of noise reduction and removing non-linear trends.
- the step of conditioning the movement signals may comprise at least one of smoothing the movement signals to reduce pixel noise and differentiating the movement signals to remove non-linear trends.
- the invention also provides an apparatus for analysing a video image in accordance with the method above, the apparatus comprising a video image processor programmed to execute the method on an input video image.
- the apparatus may be part of a video monitoring system including a video camera to capture video images and a display to display the captured video together with the results of the image analysis. This may form part of a health or welfare monitoring system or a security monitoring system.
- the video camera is preferably a standard digital video camera so that the video image sequence is a conventional frame sequence with each frame comprising an array of pixel intensities.
- the camera may be monochrome or may be a colour camera providing pixel intensities in the red, green and blue channels.
- this aspect of the invention may provide apparatus for monitoring a subject in a room to provide status or alerting of a subject's condition, the apparatus comprising: a video camera configured to capture a video image sequence of the room; a data processor configured to automatically process the video image sequence to determine whether a video image of a scene contains movement of a subject within the scene in accordance with the method above; and a display or other output device which under the control of the data processor outputs a visible or audible indication of the determination.
- the invention may be embodied in a signal processing method, or in a signal processing apparatus which may be constructed as dedicated hardware or by means of a programmed general purpose computer or programmable digital signal processor.
- the invention may also be embodied in a computer program for processing a captured video image sequence in accordance with the invention and for outputting the resulting determination.
- this aspect of the invention may provide a computer program comprising program code means for executing on a computer system the processing of a captured video image sequence of a scene to automatically determine whether the video image sequence contains movement of a subject within the scene, the method being as described above.
- FIG. 1 illustrates schematically a welfare monitoring system including an image analysis apparatus in accordance with an embodiment of the invention
- FIG. 2 schematically illustrates example image frames from a video image
- FIG. 3 schematically illustrates track signals extracted from a video image
- FIG. 4 schematically illustrates a vital signs monitoring method
- FIG. 5 schematically illustrates parts of the vital signs monitoring method of FIG. 4;
- FIG. 6 schematically illustrates processing of movement signals from the video image in accordance with one embodiment of the invention
- FIG. 7 schematically illustrates processing of video signals in accordance with an embodiment of the invention
- FIG. 8 schematically illustrates processing of video signals in accordance with another embodiment of the invention.
- FIG. 9 is a screenshot of a scene including a human subject, with tracked feature points indicated together with their scores for the principal component having the lowest spatial dispersion.
- FIG. 10 is a screenshot of a scene not including a human subject, with tracked feature points indicated together with their scores for the principal component having the lowest spatial dispersion.
- FIG. 1 schematically illustrates an apparatus in accordance with an embodiment of the invention being used to monitor a subject 3 in a room 1.
- the room 1 can be a secure room such as a police or prison cell or some other detention facility, or could be a room in a hospital or other care facility such as a care home, sheltered accommodation or the subject's own home.
- the subject 3 is monitored by a video camera 5 whose output is processed by a video signal processor 7 and the results of the analysis are displayed on a display 9 which is visible to staff of the facility.
- the video signal processor 7 may be a dedicated signal processor or a programmed general purpose computer.
- the room may be naturally lit and/or may be artificially illuminated using a visible light source 11 or infrared light source 13 .
- the video camera 5 is a standard digital video camera outputting video data in the form of a sequence of image frames, each frame being an image of the scene in the form of a pixel array of intensities in red, green, blue channels.
- the red, green and blue channels also give a response in the infrared range allowing the production of an infra-red (IR) image useful when the room is dark.
- Video cameras of this type typically output the signal at fifteen or twenty frames per second, though of course different frame rates are possible.
- the display 9 preferably displays the video image of the room and also displays automatically-generated information regarding the health or safety of the subject 3 .
- this information preferably includes the status indications described below.
- Staff monitoring the subject by way of the display 9 can therefore tell at any given time whether the subject is considered safe, for example because they are moving or because the vital signs are being detected and are in a physiologically normal range; whether the system is unable to detect vital signs while safe movement is detected (and for how long that situation has persisted); or whether no vital signs and no movement are detected, in which case an alert is generated prompting staff to monitor the subject. If the lack of vital signs detection persists for more than a configurable amount of time, an alert may be generated to call on staff to check the subject. Alerts can include a range of electronic notification methods, including automated telephone message, pager and SMS, as well as indication on the display 9, with the alert containing the condition and location of the subject and the condition being alerted.
- FIG. 4 schematically illustrates the overall processing by video signal processor 7 .
- the video analysis system 7 analyses the video image captured by camera 5 to detect, in step 104, the vital signs of the subject (such as heart rate and breathing rate), and in parallel, in step 102, analyses the video image to detect whether it contains gross movement of the subject, fine movement of the subject or no movement of the subject.
- gross movement is, for example, the subject walking or running or jumping, and in the case of gross movement it is normal to suspend analysis of the video to detect vital signs such as heart rate or breathing rate because it is difficult to perform such analysis in the presence of gross movement.
- in step 100 the video is acquired and is subjected both to movement detection in step 102 and vital signs detection or estimation in step 104.
- Step 104 may be suspended if gross movement is detected in the video sequence.
- the results of the movement detection and vital signs detection are interpreted and validated in step 106 and displayed in step 108 on display 9 .
- FIG. 5 schematically illustrates the interpretation and validation step 106 .
- in step 200 the first determination is made as to whether gross subject movement is present in the video image. If gross movement is present then the subject status is classified as safe and a corresponding display made in step 202. If no gross movement is present then, firstly, if valid vital signs have already been detected (in the previous pass through the process), then these are displayed in step 206. If not, then a determination is made as to whether fine movement is present in the image. If fine movement is present, but no vital signs have been detected, then a display “no vital signs” is made in step 210. On the other hand, if no fine movement is detected, and if this situation persists for a predetermined time, then a display “no vital signs and no movement” alert is made in step 212, which may indicate that the subject is in a serious condition.
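The interpretation and validation flow just described can be sketched as a simple decision function. This is illustrative only: the function name, the return strings and the default timeout are assumptions; the patent requires only that the timeout be configurable:

```python
def subject_status(gross_movement, fine_movement, vitals_valid,
                   no_movement_seconds=0, alert_after_seconds=60):
    """Illustrative sketch of the interpretation logic of FIG. 5."""
    if gross_movement:
        return "safe: subject moving"          # step 202
    if vitals_valid:
        return "vital signs"                   # step 206: display detected vitals
    if fine_movement:
        return "no vital signs"                # step 210
    if no_movement_seconds >= alert_after_seconds:
        return "alert: no vital signs and no movement"  # step 212
    return "no movement"
```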
- an important part of the process is determining, in step 208, whether or not fine movement is present. If some image artefact is mistaken for fine movement of the subject, the system will not report correctly on the status of the subject.
- the present invention is therefore designed to improve the reliability of the automated detection of fine movement of a subject in a scene being video imaged, and to distinguish such movement from image artefacts such as pixel noise.
- the image processing in accordance with one embodiment of the invention is illustrated in FIGS. 6, 7 and 8.
- This embodiment of the invention is based on detecting fine movement by tracking movement of image features in the video image of the room.
- the video camera 5 provides an image not only of the subject, but of the whole scene, i.e. also of the room and articles in it.
- the movement may be movement of parts of the body of the subject 3 or movement of articles associated with or in contact with the subject such as clothing or bedding.
- in a first step 600 the image is smoothed, e.g. using Gaussian smoothing, and then in step 601 image features (also known as feature points) in the video sequence are detected. This may be conducted on a grayscale image formed from the RGB channels, e.g. by averaging them.
- feature points consisting of recognisable geometrical shapes such as corners or edges can be detected based, for example, on the gradient of intensity variation in one or two dimensions and any such algorithm which identifies image feature points can be used in this invention.
- Feature point detecting algorithms usable with this invention are ones which detect, for example, Harris features, SIFT features or ORB/SURF features.
- feature points are preferred which are strong and relatively evenly spaced. This is achieved in an iterative process in step 602 by selecting feature points found by the feature point detecting algorithm based on two metrics: one based on the strength of the point as measured and output by the feature point detecting algorithm (the strength might be, for example, the intensity gradient, but all algorithms output some measure of strength), and one which is the distance to already selected feature points. Thus a first feature point from those generated by the feature point detecting algorithm is selected, e.g. the strongest, and then the distance from all other feature points to the selected feature point is measured.
- a weighted combination of each feature point's strength and distance to the already selected point is found, and the one with the highest value is selected.
- the minimum distances of the remaining feature points to the closest of the two already selected feature points are recalculated, the weighted combination of strength and minimum distance recalculated, and the feature point with the highest value is selected. This process of calculating distances to the closest of the already selected feature points and selecting the one with the highest combination of strength and distance continues until the desired number of feature points, e.g. 400 or 600, has been selected.
- This process of selecting feature points can be repeated any time the number of feature points falls below a desired value as a result of some subsequent step, e.g. sometimes features can fail to be successfully tracked from one frame to another, resulting in a decrease in the number of feature points being tracked.
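The iterative selection process above might be sketched in Python as follows. The greedy scheme, the function name and the `weight` parameter balancing strength against spacing are illustrative assumptions; the patent specifies only a weighted combination of the two metrics:

```python
import numpy as np

def select_feature_points(points, strengths, k, weight=0.5):
    """Greedily pick k feature points that are strong and well spaced."""
    points = np.asarray(points, dtype=float)       # (n, 2) image coordinates
    strengths = np.asarray(strengths, dtype=float) # detector strength per point
    chosen = [int(np.argmax(strengths))]           # start with the strongest
    while len(chosen) < min(k, len(points)):
        # distance from every candidate to its closest already-chosen point
        d = np.linalg.norm(points[:, None, :] - points[chosen][None, :, :],
                           axis=2).min(axis=1)
        score = weight * strengths + (1 - weight) * d
        score[chosen] = -np.inf                    # never reselect a point
        chosen.append(int(np.argmax(score)))
    return chosen
```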
- a standard feature tracking algorithm such as KLT tracking may be used.
- in step 603 a sequence of video frames is taken for processing as a batch, such as four hundred frames (corresponding to twenty seconds of video at a conventional frame rate of twenty frames per second).
- the x coordinates and y coordinates of all of the selected feature points that are present in all four hundred frames of the sequence are then taken.
- the variation in position (x coordinate and y coordinate) of each feature point through the frame sequence is then taken as a track signal to be processed.
- this step will output n track signals, where n is the number of feature points tracked (e.g. 600).
- FIG. 2 illustrates schematically three frames of a video image frame sequence, the frames being at t−1, t and t+1, with a number of detected feature points 20, 21, 22 in them.
- Each detected feature point 20, 21, 22 will contribute an x, y coordinate pair, and the variation of the x and y coordinates of each of these pairs through the sequence (i.e. as a function of time) is taken as a track signal.
- the variation in position with time of each feature point is a 2D time signal.
- One option for creating such a signal is to perform principal component analysis on each individual signal and to choose the most significant principal component as the new track signal.
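That per-track projection might look like this in Python (an illustrative sketch; the name `track_to_signal` is assumed):

```python
import numpy as np

def track_to_signal(xy):
    """Project one feature's 2-D track (T, 2) of x, y positions onto
    its first principal axis, giving a single 1-D track signal."""
    X = np.asarray(xy, dtype=float)
    X = X - X.mean(axis=0)                     # centre the track
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[0]                           # motion along the dominant axis
```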
- the variation in x coordinate with time and the variation in y coordinate with time of each feature point could be used as separate track signals—each feature point track contributing two track signals to the subsequent processing steps.
- the track signals may be passed as they are to the movement analysis stage, but in this embodiment the track signals are subject to further signal conditioning to improve subsequent processing, dependent on the requirements of that processing—in particular its sensitivity to noise.
- techniques which look at the signals' linear similarities, such as linear principal component analysis or clustering using the Pearson correlation mentioned below, are sensitive to noise in the signals and thus benefit from signal conditioning.
- alternative methods which look at mutual information or non-linear PCA can omit such signal conditioning.
- in step 605 the track signals are smoothed using a standard smoothing filter, such as a Savitzky-Golay filter or a Gaussian smoothing filter, which is effective at removing pixel noise.
- in step 606 the smoothed signals are differentiated to remove non-linear trends, and in step 607 the results are normalised to have a standard deviation equal to one. The result is well-conditioned track signals for the subsequent processing steps.
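Steps 605 to 607 might be sketched as follows, here using a simple Gaussian kernel for the smoothing (the text equally permits a Savitzky-Golay filter; the function name and parameter values are illustrative assumptions):

```python
import numpy as np

def condition_tracks(tracks, sigma=2.0):
    """Smooth each track signal to reduce pixel noise (step 605),
    differentiate to remove non-linear trends (step 606), and
    normalise to unit standard deviation (step 607)."""
    X = np.asarray(tracks, dtype=float)        # (n_signals, T)
    radius = int(3 * sigma)
    u = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (u / sigma) ** 2)
    kernel /= kernel.sum()                     # normalised Gaussian kernel
    smoothed = np.array([np.convolve(row, kernel, mode='same') for row in X])
    diffed = np.diff(smoothed, axis=1)         # first difference detrends
    sd = diffed.std(axis=1, keepdims=True)
    sd[sd == 0] = 1.0                          # guard perfectly flat tracks
    return diffed / sd
```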
- in step 608 the resulting track signals p′1 to p′n are passed to the movement analysis stage of FIG. 7 or 8.
- the aim of the processing in FIG. 7 or 8 is to examine the spatial dispersion of the main feature movements in the image. In particular, it examines whether the main movement signals are concentrated in one part of the image (and thus suggestive of movement of a subject in scene), or whether they are spread out all over the image which is likely to be indicative of noise or some other image artefact.
- a spatial dispersion measure which is based on the combination of the distance between pairs of track signals in the image and the similarity or commonality of the track signals of the pair is calculated. If track signals having a high commonality or similarity are closely grouped—it is suggestive of subject movement. If they are well-dispersed or there is little commonality, it is less suggestive of subject movement.
- this assessment is made by finding the most significant principal components of the whole set of track signals and examining the distance between the track signals with the strongest scores in principal components.
- a spatial dispersion measure which is based on the combination of the distance between pairs of track signals in the image and the scores of the most significant principal components of the track signals of the pair is calculated.
- FIG. 7 illustrates one way of calculating a spatial dispersion measure.
- an n by m matrix of the absolute values of the scores from all n track signals for all m principal components pc1 to pcm is formed.
- in step 703 the distance between all pairs of the n signals in the image is calculated and formed into a distances matrix.
- the distance is measured in pixels and it may be the average distance between the tracked features in the sequence of image frames under consideration, or alternatively the distance between the tracked features in any one of the frames (for example the last frame, the middle frame or the first frame for convenience). As the distance between a tracked feature and itself is zero, the matrix will have a zero diagonal.
- the distances matrix (2) is an n by n matrix giving the distances between every pair of tracked feature points. If this is combined with the knowledge of which track signals have a greater absolute score for the more significant principal components, the result is an indication of the spatial dispersion of the movement signals in the image.
- steps 704 to 709 are conducted for each of the m principal components.
- in step 705 one principal component's column from the scores matrix (1) is taken and its outer product with itself is formed, giving an n by n matrix (4) of score products;
- in step 706 an element-wise multiplication of the n by n distances matrix (2) and the n by n outer product (4) from step 705 is performed.
- by an element-wise multiplication is meant that each element of one matrix is multiplied by its corresponding element in the other matrix.
- the result is n by n values, each of which is equal to the product of the distance between two tracked features and the principal component scores of each of the two tracked features for the principal component under consideration.
- in step 707 all of the resulting values are summed and the sum is divided by the sum of the elements of the outer product of the principal component scores. The result is a spatial dispersion measure which is higher if the tracked features contributing strongly to the principal component are spaced far apart, and lower if they are spaced close together.
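The computation of steps 703 to 707 for a single principal component might be sketched as follows (an illustrative Python sketch; the function and argument names are assumptions):

```python
import numpy as np

def spatial_dispersion(component_scores, positions):
    """Spatial dispersion measure for one principal component:
    sum of (score_i * score_j * distance_ij) over all pairs,
    normalised by the sum of the score products."""
    s = np.abs(np.asarray(component_scores, dtype=float))  # (n,) scores
    p = np.asarray(positions, dtype=float)                 # (n, 2) coordinates
    # n by n distances matrix with a zero diagonal (step 703)
    distances = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=2)
    outer = np.outer(s, s)                     # outer product of scores (step 705)
    # element-wise product, summed and normalised (steps 706-707)
    return (outer * distances).sum() / outer.sum()
```

The minimum of these measures over the leading principal components is then compared with the threshold TH1 as described below.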
- in step 708 it is checked whether a spatial dispersion measure has been obtained for all principal components, and if not steps 705 to 707 are repeated for the next principal component until all m of them have had a spatial dispersion measure calculated.
- the spatial dispersion measure is low if the tracked movements contributing to the most significant principal components are close together (and thus more likely to represent movement of a subject in the image)
- the spatial dispersion measures for each of the principal components are compared and the minimum of the spatial dispersion measures is taken.
- this minimum spatial dispersion measure is compared to a threshold TH 1 and if it is less than the threshold then the section of video under consideration (current time window of step 603 ) is categorised as containing fine movement of the subject. Processing then returns to step 603 so that the process can be repeated for the next time window.
- the window is moved on by a small fraction of the window length, for example a 20 second time window may be moved on by one second each time.
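The window advance described above can be sketched as follows. The 20-second window and 1-second step are the example values from the text; the frame-rate handling and function name are assumptions for illustration.

```python
def sliding_windows(num_frames, fps, window_s=20, step_s=1):
    """Yield (start, end) frame-index pairs for overlapping time windows:
    each window is window_s seconds long and advances by step_s seconds."""
    win, step = int(window_s * fps), int(step_s * fps)
    for start in range(0, num_frames - win + 1, step):
        yield start, start + win
```

For a 60-second clip at 10 frames per second this yields the windows (0, 200), (10, 210), and so on, so each frame is analysed in many overlapping windows.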
- the threshold TH 1 in step 711 is predetermined and is set as an empirical threshold in a training or tuning process by examining video images of a scene containing gross movement, fine movement and no movement (and naturally-present random noise) and setting the threshold to distinguish fine movement from noise.
- FIG. 8 illustrates an alternative way of determining a spatial dispersion measure.
- In step 800 the track signals from the processing of FIG. 6 are grouped.
- The signals are grouped according to their similarity.
- One family of techniques for performing such grouping is clustering, which may be hard or soft clustering based on some similarity measurement. For example, standard techniques such as K-means clustering, spectral clustering, clustering on a matrix or hierarchical clustering may be used.
- With soft clustering, signals may be regarded as belonging, with a certain percentage, to several different clusters.
- The signals may be subjected to clustering according to the absolute values of their correlation (e.g. Pearson, Spearman or Kendall correlation) or on the basis of their mutual information.
- A matrix is formed of the pairwise distances between the signals in a subset of the groups, the subset comprising those groups which satisfy some suitability criterion, such as containing more than a predetermined number of tracks or being one of the largest N groups, where N is predetermined.
- The average of the pairwise distances within each group of the subset is calculated, and this average of the pairwise distances is taken as a spatial dispersion measure.
- the minimum of the average distances is taken and in step 805 it is compared to a threshold TH 1 , which is an empirical predetermined threshold set in the same way as the threshold TH 1 in step 711 .
- the time window of frames is noted as containing fine subject movement if the spatial dispersion measure is less than the threshold TH 1 .
- the processing passes back to step 603 to repeat for the next time window of frames.
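A minimal sketch of this grouping variant follows. A greedy correlation-threshold grouping stands in for the clustering techniques named above (K-means, spectral or hierarchical clustering), so the grouping rule, threshold values and function name are illustrative assumptions, not the patented method itself.

```python
import numpy as np

def min_group_dispersion(signals, positions, corr_thresh=0.8, min_tracks=3):
    """signals: n x T track signals; positions: n x 2 feature locations.
    Groups signals by absolute pairwise correlation, keeps groups meeting
    a suitability criterion (enough tracks), and returns the minimum over
    groups of the average pairwise distance between member features."""
    corr = np.abs(np.corrcoef(signals))
    unassigned = set(range(signals.shape[0]))
    best = np.inf
    while unassigned:
        seed = unassigned.pop()
        group = [seed] + [j for j in unassigned if corr[seed, j] >= corr_thresh]
        unassigned -= set(group)
        if len(group) < min_tracks:      # suitability criterion
            continue
        pts = positions[group]
        dists = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
        # average of the pairwise distances within the group
        best = min(best, dists[np.triu_indices(len(group), 1)].mean())
    return best                          # value to compare with TH 1
```

A low return value indicates a group of similar, spatially concentrated movement signals, i.e. the fine subject movement the threshold test looks for.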
- the result of the processing above is an improved and more robust detection of fine movement of a subject, such as a human or animal, in the video image.
- FIG. 9 is a screenshot illustrating the spatial dispersion measures calculated for feature points for a video sequence where a human subject is in a room.
- The black squares show feature points which are detected and tracked through the sequence but which have a low score for the principal component (loadings vector) with the lowest spatial dispersion measure.
- the pink circular points are tracked features which have a high score for the loadings vector with the lowest spatial dispersion measure. It can be seen that the pink points are highly associated with the human subject. Thus the minimum spatial dispersion measure is small because the pink marked points are close together and it is below the threshold TH 1 .
- This scene therefore includes a subject which is associated with fine movement.
- FIG. 10 illustrates the same scene, but with no human subject in the room.
- the pink circular points are those tracked features which have a high score for the loadings vector with the lowest spatial dispersion. However, these points are very far apart, distributed across the scene, and so the smallest spatial dispersion measure in this scene is above the threshold TH 1 .
Description
- 1) detecting movement in the video;
- 2) describing that movement as a set of signals;
- 3) optionally processing the signals for the next step, e.g. by signal conditioning;
- 4) blindly separating the movement sources in this set of signals; and
- 5) determining the spatial extent of these movement sources.
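The blind-separation step 4) can be sketched with PCA, one of the techniques the claims name alongside independent component analysis. In this sketch each track signal is treated as one variable, so the entries of each principal (loadings) vector give every track's score for that component; the function name and the choice of SVD are illustrative assumptions.

```python
import numpy as np

def pca_track_scores(track_signals, m=5):
    """track_signals: n x T array, one movement signal per tracked feature.
    Returns the n x m scores matrix: each feature's score for each of the
    m most significant principal components."""
    # centre each track signal, then treat tracks as variables (T x n)
    X = (track_signals - track_signals.mean(axis=1, keepdims=True)).T
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:m].T                      # n x m, columns ordered by variance
```

Tracked features moving with the same source then share large absolute scores on the same component, which is what the spatial dispersion analysis exploits.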
- acquiring a sequence of image frames forming the video image;
- detecting movement of a plurality of image features through the sequence of image frames to form a corresponding plurality of movement signals;
- analysing the movement signals to find related movement signals, these being signals analysed as likely to relate to the same movement source;
- calculating a spatial dispersion measure for the related movement signals, the spatial dispersion measure representing the spatial dispersion in the image of the related movement signals;
- comparing the lowest of the calculated spatial dispersion measures with a predetermined first threshold, and if it is lower than the predetermined first threshold, determining that the video image contains subject movement.
- Whether movement is detected.
- Whether vital signs are being acquired.
- Whether the subject is judged to be safe.
- Current values of estimated vital signs such as heart rate and breathing rate.
- Whether no vital signs have been detected and the time for which no vital signs have been detected.
- A no movement and no vital signs alert or alarm.
Claims (24)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1618828.6 | 2016-11-08 | ||
GB201618828 | 2016-11-08 | ||
PCT/GB2017/053343 WO2018087528A1 (en) | 2016-11-08 | 2017-11-07 | Method and apparatus for image processing |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190294888A1 US20190294888A1 (en) | 2019-09-26 |
US10885349B2 true US10885349B2 (en) | 2021-01-05 |
Family
ID=60331643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/347,925 Active 2038-01-01 US10885349B2 (en) | 2016-11-08 | 2017-11-07 | Method and apparatus for image processing |
Country Status (3)
Country | Link |
---|---|
US (1) | US10885349B2 (en) |
EP (1) | EP3539082A1 (en) |
WO (1) | WO2018087528A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6894725B2 (en) * | 2017-03-09 | 2021-06-30 | キヤノン株式会社 | Image processing device and its control method, program, storage medium |
US11829559B2 (en) * | 2021-08-27 | 2023-11-28 | International Business Machines Corporation | Facilitating interactions on a mobile device interface based on a captured image |
CN117493921B (en) * | 2024-01-03 | 2024-03-19 | 智洁云服(大连)信息技术有限公司 | Artificial intelligence energy-saving management method and system based on big data |
Non-Patent Citations (53)
Title |
---|
Amelard Robert et al. "Illumination-compensated non-contact imaging photoplethysmography via dual-mode temporally coded illumination". Progress in Biomedical Optics and Imaging, SPIE-International Society for Optical Engineering, Bellingham, WA, US., vol. 9316, Mar. 5, 2015. |
Arindam Sikdar et al, "Computer-Vision-Guided Human Pulse Rate Estimation: A Review", IEEE Reviews in Biomedical Engineering, vol. 9, Sep. 16, 2016 (Sep. 16, 2016), pp. 91-105. |
Beleznai et al, "Multiple Object Tracking Using Local PCA", 2006, The 18th International Conference on Pattern Recognition (ICPR'06), 4 pages (Year: 2006). * |
Blocker Timon et al, "An online PPGI approach for camera based heart rate monitoring using beat-to-beat detection", 2017 IEEE Sensors Applications Symposium (SAS), IEEE, Mar. 13, 2017. |
British Search Report regarding Application No. 1900034.8 dated Jun. 13, 2019. |
European Search Report regarding Application No. EP 19 15 8085 dated Jul. 10, 2019. |
Extended EP Search Report regarding Application No. 19220090.5 dated Feb. 24, 2020. |
Extended European Search Report regarding application No. 18168310.3-1115 dated Oct. 1, 2018. |
Hisato Aota et al, "Extracting objects by clustering of full pixel trajectories", Signal Processing and Multimedia Applications (SIGMAP), Proceedings of the 2010 International Conference on, IEEE, Jul. 26, 2010 (Jul. 26, 2010), pp. 65-72. |
International Preliminary Report on Patentability and Written Opinion regarding Application No. PCT/GB2017/052779 dated Mar. 19, 2019. |
International Preliminary Report on Patentability regarding Application No. PCT/GB2017/053343 dated May 14, 2019. |
International Search Report and Written Opinion for PCT/GB2017/052779, dated Nov. 10, 2017; ISA/EP. |
International Search Report and Written Opinion for PCT/GB2017/053343, dated Jan. 4, 2018; ISA/EP. |
International Search Report for PCT/GB2017/050126, ISA/EP, Rijswijk, NL, dated Apr. 20, 2017. |
International Search Report for PCT/GB2017/050127, ISA/EP, Rijswijk, NL, dated Mar. 28, 2017. |
International Search Report for PCT/GB2017/050128, ISA/EP, Rijswijk, NL, dated Apr. 13, 2017. |
International Search Report for PCT/GB2017/050162, ISA/EP, Rijswijk, NL, dated Jul. 6, 2017. |
Konstantinos Avgerinakis et al, "Activity detection and recognition of daily living events", Proceedings of the 1st ACM International Workshop on Multimedia Indexing and Information Retrieval for Healthcare, MIIRH '13, Oct. 22, 2013 (Oct. 22, 2013), pp. 1-7. |
Kumar, "DistancePPG: Robust non-contact vital signs monitoring using a camera", Optical Society of America (2015). |
Nakajima, Kazuki, Yoshiaki Matsumoto, and Toshiyo Tamura. "Development of real-time image sequence analysis for evaluating posture change and respiratory rate of a subject in bed." Physiological Measurement 22.3 (2001). |
Nathalie M. el Nabbout et al, "Automatically Detecting and Tracking People Walking through a Transparent Door with Vision", Computer and Robot Vision, 2008. CRV '08. Canadian Conference on, IEEE, Piscataway, NJ, USA, May 28, 2008 (May 28, 2008), pp. 171-178. |
Pisani, "Real-time Automated Detection of Clonic Seizures in Newborns", Clinical Neurophysiology 125 (2014) 1533-1540. |
Qiang Zhu et al, "Learning a Sparse, Corner-Based Representation for Time-varying Background Modeling", Computer Vision, 2005. ICCV 2005. Tenth IEEE International Conference on Beijing, China Oct. 17-20, 2005, Piscataway, NJ, USA, IEEE, Los Alamitos, CA, USA, vol. 1, Oct. 17, 2005 (Oct. 17, 2005), pp. 678-685. |
Search Report for GB Application No. 1615899.0, dated Feb. 28, 2017. |
Search Report for GB Application No. 1618828.6, dated Mar. 31, 2017. |
Search Report for Priority Application GB1601140.5, UK IPO, Newport, South Wales, dated Jul. 21, 2016. |
Search Report of UKIPO regarding Application No. GB1900033.0 dated Jun. 13, 2019. |
Search Report regarding United Kingdom Patent Application No. GB1706449.4, dated Oct. 25, 2017. |
Search Report under Section 17(5) for priority application GB1601142.1, UKIPO, Newport, South Wales, dated Jun. 28, 2016. |
Shandong Wu et al, "A hierarchical motion trajectory signature descriptor", 2008 IEEE International Conference on Robotics and Automation. The Half-Day Workshop on: Towards Autonomous Agriculture of Tomorrow, IEEE-Piscataway, NJ, USA, Piscataway, NJ, USA, May 19, 2008 (May 19, 2008), pp. 3070-3075. |
Tarassenko et al, "Non-contact video-based vital sign monitoring using ambient light and auto-regressive models", 2014 Physiol. Meas. 35 807, pp. 807-831. |
Tongchi Zhou et al, "A study of relative motion point trajectories for action recognition", 2015 International Conference on Wireless Communications & Signal Processing (WCSP), IEEE, Oct. 15, 2015 (Oct. 15, 2015), pp. 1-5. |
U.S. Appl. No. 15/961,279, filed Apr. 24, 2018, Nicholas Dunkley Hutchinson. |
U.S. Appl. No. 16/071,542, filed Jul. 20, 2018, Nicholas Dunkley Hutchinson. |
U.S. Appl. No. 16/071,570, filed Jul. 20, 2018, Simon Mark Chave Jones. |
U.S. Appl. No. 16/071,591, filed Jul. 20, 2018, Muhammad Fraz. |
U.S. Appl. No. 16/071,611, filed Jul. 20, 2018, Nicholas Dunkley Hutchinson. |
U.S. Appl. No. 16/291,728, filed Mar. 4, 2019, Nicholas Dunkley Hutchinson. |
U.S. Appl. No. 16/334,211, filed Mar. 18, 2019, Mohamed Elmikaty. |
U.S. Appl. No. 16/732,769, filed Jan. 2, 2020, Nicholas Dunkley Hutchinson. |
U.S. Appl. No. 16/732,979, filed Jan. 2, 2020, Nicholas Dunkley Hutchinson. |
U.S. Appl. No. 16/733,065, filed Jan. 2, 2020, Nicholas Dunkley Hutchinson. |
UK IPO Search Report for GB priority application 1601217.1, Newport, South Wales, dated Jul. 25, 2016. |
UK IPO Search Report under Section 17(5) for priority application GB1061143.9, dated Mar. 30, 2016. |
Verkruysse, "Remote Plethysmographic Imaging using Ambient Light", Optics Express (Dec. 22, 2008) vol. 16, No. 26. |
Written Opinion of the ISA for PCT/GB2017/050126, ISA/EP, Rijswijk, NL, dated Apr. 20, 2017. |
Written Opinion of the ISA for PCT/GB2017/050127, ISA/EP, Rijswijk, NL, dated Mar. 28, 2017. |
Written Opinion of the ISA for PCT/GB2017/050128, ISA/EP, Rijswijk, NL, dated Apr. 13, 2017. |
Written Opinion of the ISA for PCT/GB2017/050162, ISA/EP, Rijswijk, NL, dated Jul. 6, 2017. |
Wu et al, "Eulerian Video Magnification for Revealing Subtle Changes in the World", 2012. |
Yu Sun et al, "Photoplethysmography Revisited: From Contact to Noncontact, From Point to Imaging", IEEE Transactions on Biomedical Engineering, IEEE Service Center, Piscataway, NJ, USA, vol. 63, No. 3, Mar. 1, 2016 (Mar. 1, 2016), pp. 463-477. |
Also Published As
Publication number | Publication date |
---|---|
EP3539082A1 (en) | 2019-09-18 |
WO2018087528A1 (en) | 2018-05-17 |
US20190294888A1 (en) | 2019-09-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10796140B2 (en) | Method and apparatus for health and safety monitoring of a subject in a room | |
US10952683B2 (en) | Method and apparatus for estimating breathing rate | |
EP3405105B1 (en) | Method and apparatus for estimating heart rate | |
US10121062B2 (en) | Device, system and method for automated detection of orientation and/or location of a person | |
Fan et al. | Fall detection via human posture representation and support vector machine | |
US8538063B2 (en) | System and method for ensuring the performance of a video-based fire detection system | |
US11403754B2 (en) | Method and apparatus for monitoring of a human or animal subject | |
US11563920B2 (en) | Method and apparatus for monitoring of a human or animal subject field | |
US10885349B2 (en) | Method and apparatus for image processing | |
US11690536B2 (en) | Method and apparatus for monitoring of a human or animal subject | |
TWI493510B (en) | Falling down detection method | |
CN105027166A (en) | Apparatus and method for detecting subjects on the basis of vital signs | |
KR101357721B1 (en) | Apparatus for detecting of single elderly persons’ abnormal situation using image and method thereof | |
JP2017005485A (en) | Picture monitoring device and picture monitoring method | |
EP3536234B1 (en) | Improvements in or relating to video monitoring | |
US11182910B2 (en) | Method and apparatus for image processing | |
Mirmahboub et al. | View-invariant fall detection system based on silhouette area and orientation | |
Goel et al. | Human-Fall detection using Video surveillance in an Indoor Domain | |
Mokashi et al. | Deep Learning Approach for Drowsiness Detection Using Facial Features | |
Pippa et al. | Investigation of Sensor Placement for Accurate Fall |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: OXEHEALTH LIMITED, UNITED KINGDOM Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JONES, SIMON MARK CHAVE;HUTCHINSON, NICHOLAS DUNKLEY;REEL/FRAME:049099/0209 Effective date: 20190409 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 4 |