US7672538B2 - Generation of still image from a plurality of frame images - Google Patents
- Publication number
- US7672538B2 (Application No. US10/541,479, US54147905A)
- Authority
- US
- United States
- Prior art keywords
- frame image
- image
- image area
- area
- synthesis
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
Definitions
- the present invention relates to a technique for synthesizing a plurality of frame images included in video to create a still image.
- a still image is created by capturing a single scene from video shot with a digital video camera or the like so that the still image has higher resolution than the frame images.
- Such a still image is created through a synthesis process that involves overlapping several frames contained in the video image.
- JP2000-244851A discloses a technique wherein a single frame image is selected as a reference image from among a number (n+1) of successive images, motion vectors for the other n frame images (processing-object-images) with respect to this reference image are calculated, and on the basis of the motion vectors, the (n+1) frame images are synthesized to produce a still image.
- JP06-350974A discloses another technique for creating a still image from interlaced video images, wherein one field is selected as a reference image from among a pair of interlaced fields and the other field is selected as a processing-object-image; it is then determined on a field-by-field basis whether the processing-object-image is suitable for synthesis, and the two are synthesized where determined to be appropriate.
- FIG. 1 is an illustration of a method for synthesizing a reference image and an object image for synthesis.
- At top in FIG. 1 are shown a reference image and an object image for synthesis, positioned so as to compensate for image shift.
- At bottom in FIG. 1 are shown the positional relationships among pixels of the reference image, the object image for synthesis, and the synthesized image; “○” symbols denote pixels of the reference image.
- “•” symbols denote pixels of the object image for synthesis.
- the hatched circles shown on the broken gridlines denote pixels of the synthesized image.
- resolutions of the reference image and the object image for synthesis are shown as being the same, with frame image resolution being increased by 1.5× in the x axis direction and the y axis direction.
- pixel g 1 of the synthesized image will be considered. This pixel g 1 coincides with pixel t 1 of the reference image.
- on the basis of tone values of the pixels of the object image for synthesis surrounding the position of pixel g 1 , a tone value at that position is calculated by a bilinear method, and this tone value is then averaged with the tone value of pixel t 1 of the reference image to obtain a tone value for pixel g 1 .
- a tone value for pixel g 2 of the synthesized image is determined by the following procedure. On the basis of tone values of the four pixels t 2 -t 5 of the reference image surrounding pixel g 2 , a tone value at the position of pixel g 2 is calculated by a bilinear method. Next, on the basis of tone values of the four pixels s 4 -s 7 of the object image for synthesis surrounding pixel g 2 , a tone value at the position of pixel g 2 is calculated by a bilinear method. The two are then averaged to obtain a tone value for pixel g 2 .
- Tone values for other pixels can be determined in the manner described above.
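The procedure above can be sketched in Python (a minimal illustration under simplifying assumptions: grayscale NumPy arrays and a zero or known shift; `bilinear_sample` and `synthesized_tone` are hypothetical names, as the patent prescribes no implementation):

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Interpolate a tone value at fractional position (x, y) from the
    four surrounding pixels, weighting each by its proximity."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)
    y1 = min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
    bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def synthesized_tone(ref, obj, x, y, shift=(0.0, 0.0)):
    """Average the bilinear estimates from the reference image and the
    shift-compensated object image for synthesis, as in FIG. 1."""
    u, v = shift
    return 0.5 * (bilinear_sample(ref, x, y) + bilinear_sample(obj, x - u, y - v))
```

With zero shift and identical images, the average equals the single-image estimate, which is exactly why such frames contribute nothing to picture quality.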
- the description assumes that resolution of the reference image and processing-object-image is the same; however, where reference image and processing-object-image resolutions are different, a similar process may be carried out after enlargement or reduction, as appropriate.
- FIG. 2 is an illustration showing a synthesis method for use where there is zero image shift between a reference image and an object image for synthesis.
- a reference image and an object image for synthesis are shown at top in FIG. 2 .
- positional relationships among pixels of the reference image, object image for synthesis, and synthesized image are shown at bottom in FIG. 2 . Since the reference image and object image for synthesis overlap in their entirety, the pixels of the reference image and those of the object image for synthesis are situated at the same locations.
- a tone value for pixel g 2 of the synthesized image is determined by the following procedure. First, on the basis of tone values of the four pixels t 2 -t 5 of the reference image surrounding pixel g 2 , a tone value at the position of pixel g 2 is calculated by a bilinear method. Next, on the basis of tone values of the four pixels s 2 -s 5 of the object image for synthesis surrounding pixel g 2 , a tone value at the position of pixel g 2 is calculated by a bilinear method. The two are then averaged to obtain a tone value for pixel g 2 .
- since the tone values of pixels t 2 -t 5 and the tone values of pixels s 2 -s 5 are the same, the tone value at the position of pixel g 2 calculated by the bilinear method on the basis of pixels t 2 -t 5 and the tone value calculated on the basis of pixels s 2 -s 5 are identical. That is, their average will also be the same as the tone value at the position of pixel g 2 calculated by the bilinear method on the basis of pixels t 2 -t 5 alone, or of pixels s 2 -s 5 alone.
- the image generating device of the invention is an image generating device for generating a still image from a plurality of frame images contained in a video image, characterized by comprising: a synthesis object setting module for setting, from among areas included in frame images other than a reference frame image selected from among the plurality of frame images, one or more areas as object frame image areas for synthesis, the object frame image areas being selected according to a predetermined rule relating to a reference frame image area within the reference frame image; a comparison reference extracting module for extracting one comparison reference frame image area from among the reference frame image area and the object frame image areas for synthesis; a target extracting module for extracting one target frame image area from among the object frame image areas for synthesis other than the comparison reference frame image area; a comparing module for comparing the comparison reference frame image area with the selected target frame image area to calculate a pre-selected parameter; and an excluding module that, in the event that the parameter does not meet a predetermined criterion, excludes the target frame image area from the object frame image areas for synthesis.
- frame image refers to a still image displayable in progressive format (also known as non-interlaced format). Accordingly, in interlaced format, the frame image of the invention would correspond to an image composed of a number of field images (odd-numbered fields and even-numbered fields) with different raster lines.
- a frame image area not meeting a predetermined criterion is excluded as an object of synthesis, and the number of frame image areas to be synthesized is maintained at or above a predetermined number, whereby picture quality of the synthesized image can be improved efficiently.
- the predetermined criterion may be whether image shift is above a threshold value. A more detailed description will be made later.
- the predetermined number of frame image areas can be set arbitrarily, but in preferred practice will be 2 or greater.
- the synthesis object setting module may set areas of plural frame images that succeed in a time series immediately after the reference frame image in the video, as the object frame image areas for synthesis, or set the object frame image areas for synthesis at an interval of several frames.
- in the image synthesis process performed by the synthesized image generating module there may be employed a nearest neighbor method, bilinear method, bi-cubic method, or any other of various known methods for image interpolation.
- methods enabling processing at higher speeds involve simpler procedures, and thus have poorer interpolation accuracy and picture quality than do methods involving more complicated procedures.
- the nearest neighbor method, bilinear method, and bi-cubic method are increasingly complex in that order, and have correspondingly longer processing times.
- interpolation accuracy is high and picture quality is improved.
- the synthesized image generating module may perform a fast synthesis process by means of the nearest neighbor method or the like; or in the event of a smaller number it may carry out a synthesis process with a high degree of interpolation accuracy by means of the bi-cubic method or the like.
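The trade-off above can be expressed as a small selection rule; the threshold values below are illustrative assumptions, not taken from the patent:

```python
def choose_interpolation(num_frames, threshold=4):
    """Pick an interpolation method from the number of frames to be
    synthesized: with many frames the averaging itself smooths the
    result, so a fast method suffices; with few frames a slower, more
    accurate method is preferable. The threshold of 4 is illustrative."""
    if num_frames >= threshold:
        return "nearest"   # fastest, least accurate
    elif num_frames >= 2:
        return "bilinear"  # middle ground in speed and accuracy
    else:
        return "bicubic"   # slowest, most accurate
```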
- the image generating device may additionally comprise a setting module for setting as the reference frame image area an area within the reference frame image, to serve as a reference for synthesis; and a frame number controlling module for repeating the processes of the synthesis object setting module, the comparison reference extracting module, the target extracting module, the comparing module and the excluding module, until the total number of the reference frame image area and the object frame image areas for synthesis meeting the criterion reaches a predetermined number or greater.
- the image generating device of the invention may comprise a specification receiving module for receiving specification of the reference frame image, with the setting module setting the specified frame image as the reference frame image.
- the user will be able to select from a video a frame image from which to make a still image, and to designate it as the reference frame image.
- the comparison reference extracting module may set the reference frame image area as the comparison reference frame image area.
- the reference frame image area is an area of the image that will serve as reference for synthesis, it is preferable in the first instance to use the reference frame image area as the comparison reference frame image area.
- on the basis of the reference frame image area, it may be decided whether an object frame image area for synthesis qualifies for synthesis; one area selected from those that have qualified can then be used as the next comparison reference frame image area.
- frame image area 1 is the reference frame image area
- frame image area 2 and frame image area 3 are object frame image areas for synthesis.
- the shift amount of frame image area 2 and the shift amount of frame image area 3 are calculated. If each shift amount calculated in this manner is equal to or greater than a predetermined value, frame image area 2 is then used as the comparison reference frame image area.
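The cascading qualification described above might be sketched as follows; `shift_fn` is a hypothetical helper returning the shift amount between two areas, and the rule that a candidate must clear the threshold against every previously qualified area is an assumption drawn from this example:

```python
def qualify_areas(areas, shift_fn, threshold):
    """areas[0] is the reference frame image area. Each candidate must
    show sufficient shift with respect to EVERY already-qualified
    comparison reference area; otherwise it is excluded from synthesis."""
    qualified = [areas[0]]
    for candidate in areas[1:]:
        if all(shift_fn(ref, candidate) >= threshold for ref in qualified):
            qualified.append(candidate)
        # otherwise the candidate is excluded as a synthesis object
    return qualified
```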
- the image generating device herein may comprise an eliminating module for eliminating, from among the object frame image areas for synthesis, an area of a frame image for which a characteristic of the frame image area meets a predetermined condition.
- the predetermined condition may be, for example, a high level of noise, an out-of-focus condition, an abnormal color tone due to a hand covering the lens, or the like.
- the eliminating module is able in advance to eliminate such frame image areas from synthesis objects.
- the parameter may be an image shift amount.
- the image shift amount is due, for example, to shaking or turning of the camera. If the image shift amount is too small, the object frame image area for synthesis will be substantially unable to raise picture quality of a synthesized image area. In the present invention, object frame image areas for synthesis that are not very useful in terms of improving picture quality of a synthesized image area can be excluded as synthesis targets.
- the image shift amount may include at least either an amount of translational shift or an amount of rotational shift. Translational shift can be detected by any of various methods such as a block matching or gradient-based method, or a method which is a combination of these. Rotational shift can be detected by means of geometric calculations. Where the parameter is an image shift amount, the predetermined criterion mentioned earlier may be whether the image shift amount exceeds a threshold value.
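As one concrete (and deliberately naive) instance of block matching, an exhaustive integer search minimizing the mean absolute difference might look like the sketch below; real detectors combine such a search with gradient-based sub-pixel refinement:

```python
import numpy as np

def block_match_shift(ref, tgt, max_shift=3):
    """Try every integer translation (u, v) within +/-max_shift and keep
    the one minimizing the mean absolute difference over the overlapping
    region of ref and tgt. Returns the best (u, v)."""
    best, best_err = (0, 0), np.inf
    h, w = ref.shape
    for v in range(-max_shift, max_shift + 1):
        for u in range(-max_shift, max_shift + 1):
            # overlapping regions of ref and of tgt shifted by (u, v)
            r = ref[max(0, -v):h + min(0, -v), max(0, -u):w + min(0, -u)]
            t = tgt[max(0, v):h + min(0, v), max(0, u):w + min(0, u)]
            err = np.abs(r.astype(float) - t.astype(float)).mean()
            if err < best_err:
                best, best_err = (u, v), err
    return best
```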
- the comparing module may comprise a frame shift calculating module for calculating the image shift amount of a target frame image containing the target frame image area, with respect to a comparison reference frame image containing the comparison reference frame image area; and an area shift calculating module for calculating the image shift amount of the target frame image area with respect to the comparison reference frame image area, on the basis of the image shift amount calculated by the frame shift calculating module.
- the shift amount of an area can be calculated easily from the image shift amount of a frame image.
- shift amount of individual areas can be approximated by translational shift. Even an image that, taken as a whole frame, does not qualify for synthesis may be usable for synthesis if divided into areas.
- the amount of area shift can be calculated directly, without calculating the image shift amount of the frame image.
- the parameter may be an image difference derived from a comparison of pixel characteristic values at identical locations in the target frame image area and the comparison reference frame image area.
- This characteristic value may be color tone or luminance.
- through comparison with a comparison reference frame image area, it is possible to exclude as synthesis objects frame image areas having substantially no image difference. Since synthesizing together frame image areas of the same content simply gives a frame image area also having the same content, picture quality will not be improved; therefore, frame image areas having the same content as the comparison reference frame image area may be deliberately excluded from the object frame image areas for synthesis.
- the invention is particularly effective for excluding, as synthesis objects, frame image areas having identical content, for instance where the frame rate of the video has been converted or in other cases producing a sequence of identical frames. In such instances, it is possible to determine whether a frame image area qualifies for synthesis simply by determining whether there is a substantial image difference, without having to calculate the shift amount. Since it is sufficient merely to calculate an image difference, the procedure is simple. Where the parameter is an image difference, the predetermined criterion mentioned earlier may be whether the image difference is non-zero.
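A duplicate-frame filter of this kind reduces to a per-pixel comparison; the sketch below (with an assumed tolerance parameter and grayscale NumPy arrays) keeps only candidate areas showing a substantial difference:

```python
import numpy as np

def exclude_duplicates(comparison_ref, candidates, tol=0.0):
    """Keep only candidate areas whose pixel values at identical
    locations differ substantially from the comparison reference frame
    image area; duplicates (e.g. repeated frames after a frame-rate
    conversion) add nothing to the synthesized image."""
    return [c for c in candidates
            if np.abs(c.astype(float) - comparison_ref.astype(float)).max() > tol]
```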
- the parameter may be a correlation of average values of pixel characteristic values in the object frame image area and in the comparison reference frame image area.
- the synthesized frame image will also assume abnormal picture quality, and thus frame image areas that are clearly abnormal are deliberately excluded from the object frame image area for synthesis.
- This approach may be particularly effective, for example, in an instance where a dark-toned frame image area is excluded as synthesis object when synthesizing a still image area of a bright-toned scene.
- where the parameter is the difference in averages of pixel characteristic values between the target frame image area and the comparison reference frame image area, the predetermined criterion mentioned earlier may be whether that difference is large.
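An average-based check of this kind could be sketched as follows; the 64-level gap on an 8-bit luminance scale is purely an illustrative threshold, not a value from the patent:

```python
import numpy as np

def is_tone_abnormal(candidate, comparison_ref, max_mean_gap=64.0):
    """Flag a candidate area whose average luminance departs sharply
    from that of the comparison reference area, e.g. a dark-toned area
    amid a bright-toned scene; such areas would degrade the synthesis."""
    return abs(float(candidate.mean()) - float(comparison_ref.mean())) > max_mean_gap
```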
- the reference frame image area and the object frame image areas for synthesis may be areas derived by dividing each frame image in an identical manner; and the target extracting module may extract a target frame image area at the location corresponding to the comparison reference frame image area.
- the determination as to whether to make an area an object of synthesis can be carried out on an area-by-area basis for areas of frame images divided into identical shapes.
- by making the determination on an area-by-area basis, even frame images that have been uniformly excluded as targets for synthesis may, in certain areas thereof, be designated as objects for synthesis. As a result, picture quality of the synthesized image can be improved.
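Dividing every frame identically is straightforward; the sketch below assumes image dimensions divisible by the block counts and grayscale NumPy arrays:

```python
import numpy as np

def divide_into_blocks(img, rows, cols):
    """Divide a frame image into rows x cols equal blocks. Applying the
    same division to every frame lets qualification for synthesis
    proceed area by area, at identical locations across frames."""
    h, w = img.shape[0] // rows, img.shape[1] // cols
    return [[img[r * h:(r + 1) * h, c * w:(c + 1) * w] for c in range(cols)]
            for r in range(rows)]
```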
- the present invention in another aspect thereof may take the form of an image generating method invention.
- the invention may also be realized in various other aspects, for example, a computer program or a recording medium having such a computer program recorded thereon.
- the various additional elements described hereinabove may be implemented in any of these aspects.
- the invention is provided as a computer program or a recording medium having such a computer program recorded thereon, it may be provided as an entire program for controlling operation of an image generating device, or as an arrangement of modules for carrying out the functions of the invention only.
- recording media there could be employed any of various kinds of computer-readable media, such as a flexible disk, CD-ROM, DVD-ROM, magneto-optical disk, IC card, ROM cartridge, punch card, printed matter having a bar code or other symbols imprinted thereon, a computer internal storage device (RAM, ROM or other such memory), or an external storage device.
- FIG. 1 is an illustration of a method for synthesizing a reference image and an object image for synthesis.
- FIG. 2 is an illustration showing a synthesis method for use in a case of zero shift between a reference image and an object image for synthesis.
- FIG. 3 is an illustration of a simplified arrangement of an image generating device 100 as Embodiment 1 of the invention.
- FIG. 4 is a conceptual illustration showing a plurality of frame images being synthesized to create a still image in Embodiment 1.
- FIG. 5 is an illustration of the shift amount between a comparison reference image and a processing-object-image.
- FIGS. 6( a ) and 6 ( b ) illustrate a method for calculating translational shift by means of the gradient-based method.
- FIG. 7 is an illustration of a method for calculating rotational shift.
- FIG. 8 is a flowchart showing the flow of a still image generating process in Embodiment 1.
- FIG. 9 is a flowchart showing a frame image input process.
- FIG. 10 is an illustration of a simplified arrangement of an image generating device 100 A as Embodiment 2 of the invention.
- FIG. 11 is an illustration of a block shift amount between a comparison reference image and a processing-object-image.
- FIG. 12 is an illustration depicting a frame image divided into blocks.
- FIG. 13 is a flowchart showing the flow of a still image generating process in Embodiment 2.
- FIG. 14 is an illustration depicting generation of a panorama image.
- FIG. 3 is an illustration of a simplified arrangement of an image generating device 100 as Embodiment 1 of the invention.
- This image generating device 100 is a device that synthesizes a plurality of frame images contained in a video to create a still image of higher resolution than the frame images.
- Image generating device 100 is composed of predetermined application software installed on a general-purpose personal computer; the illustrated functional blocks are implemented by means of software.
- the personal computer comprises a CPU, ROM, and RAM, as well as a hard disk and an interface for input of motion video from a DVD-ROM, memory card, or other recording medium.
- a function for playback of input video is provided as well.
- a frame image input module 20 inputs or obtains frame images contained in the video.
- frame image input module 20 inputs four successive frame images in a time series starting at input timing.
- the number of frame images input is the number of frame images to be used to synthesize a still image.
- frame image input module 20 While inputting four frame images, frame image input module 20 also inputs 20 frame images succeeding these in the time series, and stores them separately in a frame image storage module 30 . These 20 frame images serve as backup frame images for use as candidates for new synthesis in the event that the preceding four frame images are unsuitable for synthesizing a still image. Hereinafter these 20 frame images will be referred to as “backup frame images.” The preceding 4 frame images will be referred to as “selected frame images.” The frame image input module 20 may perform a process to convert backup frame images into selected frame images.
- Input frame images need not be sequential in a time series. Frame images may be input at the timings of input instructions to the command input module so that the input images constitute the second frame image, the third frame image, and so on in a time series.
- the frame image storage module 30 stores a plurality of frame images input from the frame image input module 20 . From among the selected frame images stored in the frame image storage module 30 , an eliminating module 50 eliminates those frame images that are deemed abnormal when each frame image is evaluated. For example, frame images with high noise, out-of-focus frame images, or selected frame images with abnormal color tone due to a hand covering the lens or the like are eliminated.
- the frame image input module 20 converts a backup frame image to a new selected frame image.
- the converted backup frame image will be the backup frame image that succeeds the previous selected frame image in the time series.
- the eliminating module 50 examines the image newly set as the selected frame image, and excludes any abnormal selected frame image. The process of excluding a selected frame image and converting a backup frame image into a new selected frame image is repeated until the number of selected frame images determined by the eliminating module 50 to be normal finally reaches four.
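The eliminate-and-replenish loop can be sketched as follows; `is_abnormal` stands in for the noise, focus, and color-tone checks of the eliminating module 50, and the frame representation is left abstract:

```python
def select_normal_frames(frames, backups, is_abnormal, needed=4):
    """Drop abnormal selected frames and replace them from the backup
    queue (in time-series order) until `needed` normal frames remain,
    or the backups are exhausted."""
    selected = [f for f in frames if not is_abnormal(f)]
    queue = list(backups)
    while len(selected) < needed and queue:
        candidate = queue.pop(0)   # next backup frame in the time series
        if not is_abnormal(candidate):
            selected.append(candidate)
    return selected
```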
- a reference image specification receiving module 25 displays the selected frame images on the monitor. The user can then specify, from among the displayed selected frame images, a frame image to serve as a reference image. The reference image specification receiving module 25 receives this specification. A reference image setting module 40 sets the selected frame image received by the reference image specification receiving module 25 as the reference image.
- the first selected frame image input by the frame image input module 20 may be designated as the reference image.
- the image generating device 100 there may also be provided a functional block for carrying out analysis of a characteristic value (for example, edge amount) for each selected frame image, and setting a reference image on the basis of the analysis.
- a comparison target setting module 45 sets the selected frame images other than the reference image as comparison target images.
- a comparison reference image setting module 90 sets the reference image or one of the comparison target images as a comparison reference image. Initially, the reference image is set as the comparison reference image.
- a comparison target resetting module 85 updates the comparison target images by excluding the comparison reference image from the comparison target images.
- An object image setting module 65 sets one of the comparison target images as a processing-object-image, by way of a target for detecting the shift amount from the comparison reference image. In this Embodiment, as will be described later, comparison target images are set as processing-object-images in the order in which they were input or converted by the frame image input module 20 .
- a shift detecting module 60 detects the shift amount of a processing-object-image with respect to the comparison reference image. In this Embodiment, the amount of translational shift is detected. Detection of shift will be described later.
- An excluding module 80 excludes a processing-object-image from the comparison target images if the shift amount detected by shift detecting module 60 does not meet a predetermined criterion.
- a decision module 70 decides whether the comparison reference image and comparison target images total four.
- a synthesized image generating module 75 carries out resolution conversion and, while correcting for the shift amount detected by the shift detecting module 60 , synthesizes the reference image and comparison target images to generate the synthesized image.
- the reference image serves as the basis for synthesis; the synthesis method is as described previously. However, since four images are being synthesized, the average of four tone values is calculated for each pixel of the synthesized image. In the event that there are not four images, the frame image input module will again convert a backup frame image to a selected frame image.
- FIG. 4 is a conceptual illustration showing a plurality of frame images being synthesized to create a still image in Embodiment 1. As described previously, in this Embodiment successive frame images in a time series are used to generate a still image.
- the lead frame image 1 is the reference image set by the reference image setting module 40
- frame image 2 to frame image 4 are comparison target images set by comparison target setting module 45 . None of these frame images were eliminated by the eliminating module 50 .
- the comparison reference image setting module 90 first sets the reference image, namely frame image 1 , as the comparison reference image.
- the comparison target resetting module 85 resets the residual comparison target images other than the comparison reference image, i.e. frame image 2 to frame image 4 , to the comparison target images.
- the object image setting module 65 first selects frame image 2 as the processing-object-image.
- the shift detecting module 60 detects the shift amount between the comparison reference image (frame image 1 ) and the processing-object-image (frame image 2 ).
- the excluding module 80 determines whether the shift amount meets a predetermined criterion. Here, it does not meet the predetermined criterion, and is accordingly shown with a mark “x”. That is, frame image 2 is excluded from the comparison target images by the excluding module 80 .
- the object image setting module 65 sets the frame image 3 as the processing-object-image.
- the shift amount between the comparison reference image (frame image 1 ) and the processing-object-image (frame image 3 ) is detected, and determination is made as to whether the shift amount meets the predetermined criterion.
- frame image 3 meets the predetermined criterion and is accordingly shown with a mark “o”. That is, frame image 3 is not excluded from the comparison target images by the excluding module 80 .
- frame image 4 is selected as the processing-object-image, and the shift amount between the comparison reference image (frame image 1 ) and the processing-object-image (frame image 4 ) is detected. Since the shift amount meets the predetermined criterion it is accordingly shown with a mark “o”.
- Step 2 is now described. Step 2 is the next process carried out after Step 1 .
- the comparison reference image setting module 90 establishes one of these (frame image 3 ) as the comparison reference image. Since the reference image (frame image 1 ) previously served as the comparison reference image in Step 1 , it is not used here as the comparison reference image. In order to distinguish between them, the previous frame image 1 will be termed “comparison reference image 1 ”, and frame image 3 termed “comparison reference image 2 .”
- the comparison target resetting module 85 then newly sets the residual comparison target images (frame image 4 ) other than the comparison reference image (frame image 3 ) to the comparison target image(s).
- the object image setting module 65 sets one of the comparison target image(s) as the processing-object-image.
- frame image 4 will be the processing-object-image.
- the shift detecting module 60 detects the shift amount between the comparison reference image 2 (frame image 3 ) and the processing-object-image (frame image 4 ).
- the excluding module 80 determines whether the shift amount meets the predetermined criterion.
- frame image 4 does not meet the predetermined criterion, and is accordingly shown with an “x”. That is, frame image 4 is excluded from the comparison target image(s) by the excluding module 80 .
- the decision module 70 decides whether the comparison reference images and the comparison target images total four.
- the decision module 70 carries out the decision when, after shift detection, the number of comparison target images reaches one or fewer. Since there are two comparison reference images, namely frame image 1 and frame image 3 , and no comparison target images, the total is 2. Since this number is not 4, the frame image input module 20 converts backup frame images to selected frame images so that the total number of frame images is 4. That is, two backup frame images are converted to selected frame images. Of the two, if even one is eliminated by the eliminating module 50 , conversion of a frame image is performed again.
- Step 3 is the next process carried out after Step 2 .
- the comparison target setting module 45 sets frame image 5 and frame image 6 as comparison target images.
- the comparison reference image setting module 90 sets frame image 1 as the comparison reference image 1
- the comparison target resetting module 85 sets frame image 3 , frame image 5 , and frame image 6 as the comparison target images.
- Each of Frame image 3 , frame image 5 , and frame image 6 is then selected in that order as the processing-object-image, for detecting the shift amount.
- the shift amount meets the predetermined criterion, and is accordingly shown with a mark “o”. Since the results of shift detection for frame image 1 and frame image 3 are shown in Step 1 , these are omitted from the illustration in Step 3 .
- one (frame image 3 ) of the comparison target images (frame image 3 , frame image 5 , frame image 6 ) is selected as the comparison reference image 2 , with the remaining comparison target images (frame image 5 , frame image 6 ) being set as comparison target images.
- Each of frame image 5 and frame image 6 is then selected in that order as the processing-object-image, for detecting the shift amount.
- the shift amount meets the predetermined criterion, and is accordingly shown with a mark “o”.
- one (frame image 5 ) of the comparison target images (frame image 5 , frame image 6 ) is set as the comparison reference image 3 , with the remaining comparison target image (frame image 6 ) being set as the comparison target image.
- Frame image 6 is then set as the processing-object-image, for detecting the shift amount.
- the shift amount of frame image 6 meets the predetermined criterion, and is accordingly shown with a mark “o”.
- the decision module 70 carries out the decision.
- there are three comparison reference images (frame image 1 , frame image 3 , frame image 5 ) and one comparison target image (frame image 6 ), for a total of four. Since the total number of comparison reference images and comparison target images has reached four, the synthesized image generating module 75 performs resolution conversion and, while compensating for the shift amounts detected by the shift detecting module 60 , synthesizes the comparison reference images and comparison target images to generate the synthesized image.
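As a rough illustration of this synthesis step, the sketch below upscales the output grid, compensates each frame's detected (u, v) shift, and averages the contributions that land on each fine-grid pixel. This is a simplified, hypothetical rendering (plain nested lists for images, nearest-neighbour placement, no rotation compensation), not the patent's actual resolution-conversion method:

```python
def synthesize(frames, shifts, scale=2):
    """Naive synthesis sketch: place each frame's pixels onto a `scale`-times
    finer grid after compensating its (u, v) shift, then average the
    contributions landing on each output pixel."""
    h, w = len(frames[0]), len(frames[0][0])
    H, W = h * scale, w * scale
    acc = [[0.0] * W for _ in range(H)]   # accumulated luminance
    cnt = [[0] * W for _ in range(H)]     # number of contributions
    for frame, (u, v) in zip(frames, shifts):
        for i in range(h):
            for j in range(w):
                # compensate the detected shift, then map onto the fine grid
                X = int(round((j - u) * scale))
                Y = int(round((i - v) * scale))
                if 0 <= X < W and 0 <= Y < H:
                    acc[Y][X] += frame[i][j]
                    cnt[Y][X] += 1
    return [[acc[i][j] / cnt[i][j] if cnt[i][j] else 0.0
             for j in range(W)] for i in range(H)]
```

Sub-pixel shifts between frames are what make this useful: a frame shifted by half a pixel fills fine-grid positions that the reference frame alone cannot.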
- a description of detection of the shift amount between a comparison reference image and a processing-object-image follows.
- FIG. 5 is an illustration of the shift amount between a comparison reference image and a processing-object-image. It is assumed that the comparison reference image coordinates (x 1 , y 1 ) are shifted away from the processing-object-image coordinates (x 2 , y 2 ). Here, the extent of translational shift (u, v) and of rotational shift δ are used as the shift amount.
- FIGS. 6( a ) and 6 ( b ) illustrate a method for calculating translational shift by means of the gradient-based method.
- In FIG. 6( a ), the pixels of a comparison reference image and of a processing-object-image are shown, together with their luminance values.
- (x 1 i , y 1 i ) represents the coordinates of a pixel of the comparison reference image
- B 1 (x 1 i , y 1 i ) represents the luminance at those coordinates.
- the general principle of the gradient-based method is illustrated in FIG. 6( b ).
- the pixel at coordinates (x 2 i , y 2 i ) of the processing-object-image is situated between pixels (x 1 i to x 1 i +1, y 1 i to y 1 i +1) of the comparison reference image, that is, at inter-pixel coordinates (x 1 i +Δx, y 1 i +Δy).
- Δx and Δy are calculated for each pixel, and the average is taken over the whole image.
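A minimal sketch of this gradient-based estimation follows. It assumes images are plain 2-D lists of luminance values, uses forward differences for the gradients, and solves for the two unknowns from least-squares normal equations accumulated over the whole image (a standard formulation of the gradient-based method, not necessarily the patent's exact averaging):

```python
def estimate_translation(ref, tgt):
    """Gradient-based estimate of the translational shift (dx, dy) of the
    processing-object-image `tgt` relative to the comparison reference
    image `ref`, solved by least squares over all pixels."""
    h, w = len(ref), len(ref[0])
    sxx = sxy = syy = sxb = syb = 0.0
    for i in range(h - 1):
        for j in range(w - 1):
            px = ref[i][j + 1] - ref[i][j]   # luminance gradient along x
            py = ref[i + 1][j] - ref[i][j]   # luminance gradient along y
            db = tgt[i][j] - ref[i][j]       # B2 - B1 at this pixel
            sxx += px * px
            sxy += px * py
            syy += py * py
            sxb += px * db
            syb += py * db
    # solve the 2x2 normal equations for (dx, dy)
    det = sxx * syy - sxy * sxy
    dx = (syy * sxb - sxy * syb) / det
    dy = (sxx * syb - sxy * sxb) / det
    return dx, dy
```

On smooth synthetic images this recovers sub-pixel shifts to within a few percent, provided the shift is small (roughly a pixel or less), which is the regime the gradient-based method assumes.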
- FIG. 7 is an illustration of a method for calculating rotational shift. Here, it is assumed that translational shift between the comparison reference image and processing-object-image has been compensated.
- FIG. 8 is a flowchart showing the flow of a still image generating process in Embodiment 1.
- the frame image input module 20 inputs or obtains a frame image (Step S 20 ).
- FIG. 9 is a flowchart showing a frame image input process.
- the frame image input module 20 first inputs from the video image four selected frame images and twenty backup frame images, and stores these in the frame image storage module 30 (Step S 21 ).
- the eliminating module 50 determines whether any frame among the selected frame images has an abnormality such as a high level of noise, being out-of-focus, abnormal color tone due to a hand covering the lens, or the like (Step S 23 ). If there is an abnormality (Step S 23 ), the selected frame image is eliminated from the frame-image storage module 30 (Step S 24 ), and one backup frame image is converted to a selected frame image (Step S 25 ). The routine of Step S 23 to Step S 25 is repeated until it has been determined that all selected frame images are normal.
- the reference image specification receiving module 25 displays all of the selected frame images on the monitor (Step S 27 ), and receives specification of a reference image by the user (Step S 28 ). Specification of a reference image is limited to one image.
- the reference image setting module 40 sets the one selected frame image specified by the user as the reference image (Step S 29 ).
- the comparison target setting module 45 then sets the residual selected frame images other than the reference image as comparison target images (Step S 30 ). This completes the description of the frame image input process; referring back to FIG. 8 , the description now turns to the flow of the still image generating process.
- the reference image or one of the comparison target images is set as the comparison reference image (Step S 35 ), and the residual comparison target images other than the comparison reference image are reset to comparison target images (Step S 40 ). Then, one of the comparison target images is set as the processing-object-image, and the shift amount of the processing-object-image with respect to the comparison reference image is detected (Step S 50 ).
- For example, a detected shift amount of (0.3, 0.2) exceeds the threshold value, so that processing-object-image remains a candidate for synthesis.
- If the shift amount (Δu, Δv) is equal to or less than the threshold value (0.1, 0.1) (Step S 55 ), the processing-object-image is deemed unsuitable for synthesis, and is excluded from the comparison target images (Step S 60 ). If detection of the shift amount from the comparison reference image has not been completed for all comparison target images (Step S 65 ), the next comparison target image is set as the processing-object-image, and the routine from Step S 45 to Step S 60 is repeated.
- Once shift detection has been completed for all comparison target images (Step S 65 ), a decision is made as to whether the number of comparison target images is now one or fewer (Step S 70 ). If not yet one or fewer, a comparison target image is set as the new comparison reference image, and the routine from Step S 35 to Step S 65 is repeated.
- If the number is one or fewer (Step S 70 ), a decision is made as to whether the comparison reference images and comparison target images total four (Step S 75 ). If they do not yet total four, backup frame images in a number corresponding to the deficit are converted to selected frame images (Step S 85 ), and the elimination process is carried out for the newly added selected frame images (Step S 31 in FIG. 9 ). Each newly added selected frame image is then set as a comparison target image (Step S 86 ). If they do total four, the comparison reference images and comparison target images are synthesized to create a synthesized image (Step S 80 ).
- FIG. 10 is an illustration of a simplified arrangement of an image generating device 100 A as Embodiment 2 of the invention.
- the arrangement of image generating device 100 A is substantially identical to that of the image generating device 100 of Embodiment 1.
- a processing-object-image is divided into a plurality of blocks, and shift amount with respect to a comparison reference image is calculated for each block. Blocks having only small shift with respect to the comparison reference image are excluded from targets for synthesis.
- Dividing module 95 divides all selected frame images into blocks of 16×16 pixels each.
- FIG. 11 is an illustration of the amount of block shift between a comparison reference image and a processing-object-image. It is assumed that the comparison reference image coordinates (x 1 , y 1 ) are shifted away from the processing-object-image coordinates (x 2 , y 2 ).
- shift amount of a frame image is composed of the three parameters of translational shift (u, v) and rotational shift δ.
- FIG. 12 is an illustration depicting a frame image divided into blocks.
- the frame image in the drawing is divided into 5×8 blocks. Even where the frame image as a whole is rotated in the direction indicated by the arrows, the shift amount of individual blocks can be represented in terms of translational shift (u, v) only.
- the shift amount of each block can be calculated from the translational shift and rotational shift of the frame image. Although in this embodiment the shift amount of each block is calculated from translational shift and rotational shift of a frame image, the shift amount of each block may instead be detected directly.
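This derivation can be sketched directly: for a small rotation δ about the image center, a block centered at (x, y) translates by approximately (u − δ·y, v + δ·x). The rotation center and the block-center evaluation point are assumptions of this sketch, not specified details of the patent:

```python
def block_shifts(width, height, u, v, delta, block=16):
    """Per-block translational shift derived from the frame-level shift
    (u, v) and small rotation delta (radians), using the small-angle
    relations dx = u - delta*y, dy = v + delta*x evaluated at each
    block's center, with (x, y) measured from the image center."""
    cx, cy = width / 2.0, height / 2.0
    shifts = {}
    for by in range(height // block):
        for bx in range(width // block):
            x = bx * block + block / 2.0 - cx   # block center, x
            y = by * block + block / 2.0 - cy   # block center, y
            shifts[(bx, by)] = (u - delta * y, v + delta * x)
    return shifts
```

Because each block then carries only a translational shift, the two-parameter gradient-based detection can be applied per block instead of a full three-parameter fit per frame.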
- FIG. 13 is a flowchart showing the flow of a still image generating process in Embodiment 2. The process is executed by the CPU of image generating device 100 A.
- First, all selected frame images are divided into blocks of 16×16 pixels each.
- A process similar to that of Embodiment 1 is then performed on each block (Step S 95 -Step S 150 ). That is, processing is carried out on each block treated as if it were an independent frame image. At this time, processing is carried out on blocks at identical locations within the selected frame images (block 1 of selected frame image 1 , block 1 of selected frame image 2 , block 1 of selected frame image 3 , block 1 of selected frame image 4 ).
- The routine of Step S 95 to Step S 150 is repeated in the same manner for block 2 , block 3 , and so on, for all blocks (Step S 155 ).
- Once a backup frame image has been converted to a selected frame image (Step S 155 ), subjected to the elimination process (Step S 30 ), and set as the comparison target image (Step S 156 ), the comparison target image is divided into blocks of 16×16 pixels each (Step S 158 ).
- shift amount of blocks can be calculated easily on the basis of the shift amount of frame images. Dividing a frame image into blocks allows even a frame image that, taken as a whole, cannot be used in synthesis to be used in synthesis in part.
- While the invention has been described hereinabove in terms of a number of embodiments, the invention is not limited to these embodiments, and may be reduced to practice in various ways without departing from the scope and spirit thereof.
- various settings are possible for the number of frame images to be synthesized, or the threshold value for shift.
- the criterion for deciding whether a processing-object-image will be used in synthesis may instead be an image difference, that is, a summation of differences in pixel characteristic values at identical locations in a comparison reference image and a processing-object-image, or a difference in average pixel characteristic values. Variations such as the following are also possible.
- the image generating device 100 of Embodiment 1 can also generate panorama images.
- FIG. 14 is an illustration depicting generation of a panorama image.
- frame image 1 is made the reference image
- with frame image 1 as the sole reference image, generation of a synthesized image was not possible, since there is no area of overlap with frame image 5 .
- In the image generating device 100 of Embodiment 1, by switching the comparison reference image in the order frame image 1 → frame image 2 → frame image 3 → frame image 4 , it becomes possible to synthesize more frame images and thereby generate a panorama image.
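The benefit of switching the comparison reference image can be sketched as offset chaining: each frame need only overlap its immediate neighbour, and positions relative to frame image 1 are obtained by summing the pairwise shifts. A hypothetical illustration for translational shift only:

```python
def accumulate_offsets(pairwise):
    """Chain pairwise translational offsets into positions relative to
    frame 1.  pairwise[i] is the (u, v) shift of frame i+2 measured against
    frame i+1, i.e. the comparison reference is switched frame by frame."""
    positions = [(0.0, 0.0)]        # frame 1 is the reference
    for du, dv in pairwise:
        u, v = positions[-1]
        positions.append((u + du, v + dv))
    return positions
```

With these accumulated positions, even a frame with no overlap against frame image 1 can still be placed on the panorama canvas.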
- the invention is applicable to devices that synthesize a plurality of frame images of video or still images.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
Abstract
Description
Px=B1(x1i+1, y1i)−B1(x1i, y1i) (1)
Py=B1(x1i, y1i+1)−B1(x1i, y1i) (2)
where
B1=B1(x1i, y1i) (3)
B2=B2(x1i, y1i) (4)
The following relationships are true:
Px·Δx=B2−B1 (5)
Py·Δy=B2−B1 (6)
Therefore, Δx and Δy can be calculated so as to fulfill the expressions:
{Px·Δx−(B2−B1)}² = 0 (7)
{Py·Δy−(B2−B1)}² = 0 (8)
S² = Σ{Px·Δx+Py·Δy−(B2−B1)}² (9)
r = (x1² + y1²)^(1/2) (10)
θ = tan⁻¹(y1/x1) (11)
x2−x1 ≈ −r·δ·sin θ = −δ·y1 (12)
y2−y1 ≈ r·δ·cos θ = δ·x1 (13)
Δx = u − δ·y1 (14)
Δy = v + δ·x1 (15)
S² = Σ{Px·(u−δ·y1)+Py·(v+δ·x1)−(B2−B1)}² (16)
S² = Σ{Px·u+Py·v−(B2−B1)}² (17)
In this way, the amount of translational shift between a comparison reference image and a processing-object-image is detected with an error of less than one pixel.
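Equation (16) is linear in the three unknowns (u, v, δ), so minimizing S² reduces to a 3×3 linear least-squares system. The sketch below assumes plain 2-D lists of luminance values, forward-difference gradients, and rotation about the image center (the rotation center is an assumption of this sketch):

```python
def estimate_shift_with_rotation(ref, tgt):
    """Estimate (u, v, delta) minimizing equation (16):
    S^2 = sum{ Px*(u - delta*y) + Py*(v + delta*x) - (B2 - B1) }^2,
    by accumulating and solving the 3x3 normal equations."""
    h, w = len(ref), len(ref[0])
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for i in range(h - 1):
        for j in range(w - 1):
            px = ref[i][j + 1] - ref[i][j]   # Px, equation (1)
            py = ref[i + 1][j] - ref[i][j]   # Py, equation (2)
            db = tgt[i][j] - ref[i][j]       # B2 - B1
            x, y = j - cx, i - cy            # coordinates from image center
            c = (px, py, -px * y + py * x)   # coefficients of (u, v, delta)
            for a in range(3):
                for m in range(3):
                    A[a][m] += c[a] * c[m]
                b[a] += c[a] * db
    # solve A p = b by Gaussian elimination (A is symmetric positive definite)
    for k in range(3):
        for a in range(k + 1, 3):
            f = A[a][k] / A[k][k]
            for m in range(k, 3):
                A[a][m] -= f * A[k][m]
            b[a] -= f * b[k]
    p = [0.0, 0.0, 0.0]
    for k in (2, 1, 0):
        p[k] = (b[k] - sum(A[k][m] * p[m] for m in range(k + 1, 3))) / A[k][k]
    return tuple(p)
```

When δ is known to be zero this collapses to equation (17), i.e. the purely translational estimate.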
Claims (18)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003112392 | 2003-04-17 | ||
JP2003-112392 | 2003-04-17 | ||
PCT/JP2004/005514 WO2004093011A1 (en) | 2003-04-17 | 2004-04-16 | Generation of still image from a plurality of frame images |
Publications (2)
Publication Number | Publication Date |
---|---|
US20060171687A1 US20060171687A1 (en) | 2006-08-03 |
US7672538B2 true US7672538B2 (en) | 2010-03-02 |
Family
ID=33296055
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/541,479 Expired - Fee Related US7672538B2 (en) | 2003-04-17 | 2004-04-16 | Generation of still image from a plurality of frame images |
Country Status (5)
Country | Link |
---|---|
US (1) | US7672538B2 (en) |
EP (1) | EP1538562A4 (en) |
JP (1) | JP4120677B2 (en) |
CN (1) | CN100351868C (en) |
WO (1) | WO2004093011A1 (en) |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070248240A1 (en) * | 2004-05-05 | 2007-10-25 | Koninklijke Philips Electronics, N.V. | Selective Video Blanking |
JP4497001B2 (en) * | 2005-03-22 | 2010-07-07 | 株式会社ニコン | Image processing apparatus, electronic camera, and image processing program |
JP4496537B2 (en) * | 2005-03-24 | 2010-07-07 | カシオ計算機株式会社 | Image composition apparatus and image composition processing program |
EP1744278B1 (en) * | 2005-07-13 | 2008-05-21 | LaserSoft Imaging AG | Providing a digital copy of a source image |
US20090119596A1 (en) * | 2005-11-10 | 2009-05-07 | Yuji Iwahara | Viewer device, slide show display method in viewer device, and program |
US20080043259A1 (en) * | 2006-08-18 | 2008-02-21 | Roger Lee Triplett | Method and system for hardcopy output of snapshots and video |
JP4215267B2 (en) * | 2006-10-19 | 2009-01-28 | パナソニック株式会社 | Image generating apparatus and image generating method |
US8717412B2 (en) * | 2007-07-18 | 2014-05-06 | Samsung Electronics Co., Ltd. | Panoramic image production |
JP4480760B2 (en) * | 2007-12-29 | 2010-06-16 | 株式会社モルフォ | Image data processing method and image processing apparatus |
US20090244301A1 (en) * | 2008-04-01 | 2009-10-01 | Border John N | Controlling multiple-image capture |
JP2010034964A (en) * | 2008-07-30 | 2010-02-12 | Sharp Corp | Image composition apparatus, image composition method and image composition program |
TWI401617B (en) * | 2009-05-19 | 2013-07-11 | Ipanel Technologies Ltd | Method and device for obtaining deviation position of picture |
EP2309452A1 (en) | 2009-09-28 | 2011-04-13 | Alcatel Lucent | Method and arrangement for distance parameter calculation between images |
WO2011040864A1 (en) | 2009-10-01 | 2011-04-07 | Scalado Ab | Method relating to digital images |
US8558913B2 (en) | 2010-02-08 | 2013-10-15 | Apple Inc. | Capture condition selection from brightness and motion |
SE534551C2 (en) | 2010-02-15 | 2011-10-04 | Scalado Ab | Digital image manipulation including identification of a target area in a target image and seamless replacement of image information from a source image |
JP5424930B2 (en) * | 2010-02-19 | 2014-02-26 | キヤノン株式会社 | Image editing apparatus, control method thereof, and program |
CN102075679A (en) * | 2010-11-18 | 2011-05-25 | 无锡中星微电子有限公司 | Method and device for acquiring image |
SE1150505A1 (en) | 2011-05-31 | 2012-12-01 | Mobile Imaging In Sweden Ab | Method and apparatus for taking pictures |
WO2013012370A1 (en) | 2011-07-15 | 2013-01-24 | Scalado Ab | Method of providing an adjusted digital image representation of a view, and an apparatus |
AU2011253779A1 (en) * | 2011-12-01 | 2013-06-20 | Canon Kabushiki Kaisha | Estimation of shift and small image distortion |
WO2014010324A1 (en) * | 2012-07-13 | 2014-01-16 | 富士フイルム株式会社 | Image deformation device and method for controlling actuation of same |
JP6249596B2 (en) * | 2012-12-05 | 2017-12-20 | 三星電子株式会社Samsung Electronics Co.,Ltd. | Imaging apparatus and imaging method |
US9304089B2 (en) * | 2013-04-05 | 2016-04-05 | Mitutoyo Corporation | System and method for obtaining images with offset utilized for enhanced edge resolution |
CN103699897A (en) * | 2013-12-10 | 2014-04-02 | 深圳先进技术研究院 | Robust face alignment method and device |
CN105865451B (en) * | 2016-04-19 | 2019-10-01 | 深圳市神州云海智能科技有限公司 | Method and apparatus for mobile robot indoor positioning |
CN107483839B (en) | 2016-07-29 | 2020-08-07 | Oppo广东移动通信有限公司 | Multi-frame image synthesis method and device |
US9940695B2 (en) * | 2016-08-26 | 2018-04-10 | Multimedia Image Solution Limited | Method for ensuring perfect stitching of a subject's images in a real-site image stitching operation |
CN106303292B (en) * | 2016-09-30 | 2019-05-03 | 努比亚技术有限公司 | A kind of generation method and terminal of video data |
CN108063920A (en) * | 2017-12-26 | 2018-05-22 | 深圳开立生物医疗科技股份有限公司 | A kind of freeze frame method, apparatus, equipment and computer readable storage medium |
CN111131688B (en) * | 2018-10-31 | 2021-04-23 | Tcl科技集团股份有限公司 | Image processing method and device and mobile terminal |
JP7247682B2 (en) * | 2019-03-18 | 2023-03-29 | 株式会社リコー | Image processing device, image processing method, image processing program, electronic device, and photographing device |
JP6562492B1 (en) * | 2019-05-16 | 2019-08-21 | 株式会社モルフォ | Image processing apparatus, image processing method, and program |
-
2004
- 2004-04-16 US US10/541,479 patent/US7672538B2/en not_active Expired - Fee Related
- 2004-04-16 WO PCT/JP2004/005514 patent/WO2004093011A1/en active Application Filing
- 2004-04-16 CN CNB2004800016127A patent/CN100351868C/en not_active Expired - Fee Related
- 2004-04-16 EP EP04728040A patent/EP1538562A4/en not_active Withdrawn
- 2004-04-16 JP JP2005505487A patent/JP4120677B2/en not_active Expired - Fee Related
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0210680A (en) | 1988-06-28 | 1990-01-16 | Sanyo Electric Co Ltd | Electric foot warmer |
JPH0210680U (en) | 1988-07-01 | 1990-01-23 | ||
JPH06350974A (en) | 1993-04-13 | 1994-12-22 | Matsushita Electric Ind Co Ltd | Frame still picture generator |
US6075905A (en) * | 1996-07-17 | 2000-06-13 | Sarnoff Corporation | Method and apparatus for mosaic image construction |
WO1998002844A1 (en) | 1996-07-17 | 1998-01-22 | Sarnoff Corporation | Method and apparatus for mosaic image construction |
JPH1069537A (en) | 1996-08-28 | 1998-03-10 | Nec Corp | Image synthesis method and image synthesizer |
US5987164A (en) * | 1997-08-01 | 1999-11-16 | Microsoft Corporation | Block adjustment method and apparatus for construction of image mosaics |
JP2000152250A (en) | 1998-11-10 | 2000-05-30 | Canon Inc | Image processing unit, method and computer readable storage medium |
EP1008956A1 (en) | 1998-12-08 | 2000-06-14 | Synoptics Limited | Automatic image montage system |
JP2000244851A (en) | 1999-02-18 | 2000-09-08 | Canon Inc | Picture processor and method and computer readable storage medium |
US20020126913A1 (en) * | 2001-03-07 | 2002-09-12 | Daisuke Kotake | Image processing apparatus and method |
US20050013466A1 (en) * | 2001-11-14 | 2005-01-20 | Beun Robbie Daniel Pieter | Determination of a motion of a background in a series of images |
US7376249B2 (en) * | 2001-11-14 | 2008-05-20 | Nederlandse Organisatie Voortoegepast-Natuurwetenschappeljk, Onderzoek Tno | Determination of a motion of a background in a series of images |
Non-Patent Citations (6)
Title |
---|
Abstract of Japanese Patent Publication No. 02-010680, Pub. Date: Jan. 16, 1990, Patent Abstracts of Japan. |
Abstract of Japanese Patent Publication No. 06-350974, Pub. Date: Dec. 22, 1994, Patent Abstracts of Japan. |
Abstract of Japanese Patent Publication No. 10-069537, Pub. Date: Mar. 10, 1998, Patent Abstracts of Japan. |
Abstract of Japanese Patent Publication No. 2000-152250, Pub. Date: May 30, 2000, Patent Abstracts of Japan. |
Abstract of Japanese Patent Publication No. 2000-244851, Pub. Date: Sep. 8, 2000, Patent Abstracts of Japan. |
S. Mann and R. Picard, "Video Orbits of the Projective Group: A Simple Approach to Featureless Estimation of Parameters," IEEE Trans. on Image Processing, vol. 6, No. 9, Sep. 1997, pp. 1281-1295. |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090129704A1 (en) * | 2006-05-31 | 2009-05-21 | Nec Corporation | Method, apparatus and program for enhancement of image resolution |
US8374464B2 (en) * | 2006-05-31 | 2013-02-12 | Nec Corporation | Method, apparatus and program for enhancement of image resolution |
US20110299795A1 (en) * | 2009-02-19 | 2011-12-08 | Nec Corporation | Image processing system, image processing method, and image processing program |
US8903195B2 (en) * | 2009-02-19 | 2014-12-02 | Nec Corporation | Specification of an area where a relationship of pixels between images becomes inappropriate |
US20100265314A1 (en) * | 2009-04-16 | 2010-10-21 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method capable of transmission/reception and recording of image file obtained by panoramic image shot |
US8442354B2 (en) * | 2009-04-16 | 2013-05-14 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method capable of transmission/reception and recording of image file obtained by panoramic image shot |
US20140152765A1 (en) * | 2012-12-05 | 2014-06-05 | Samsung Electronics Co., Ltd. | Imaging device and method |
US11423224B2 (en) | 2019-06-14 | 2022-08-23 | Kyocera Document Solutions Inc. | Image-to-text recognition for a sequence of images |
Also Published As
Publication number | Publication date |
---|---|
US20060171687A1 (en) | 2006-08-03 |
CN1717702A (en) | 2006-01-04 |
EP1538562A1 (en) | 2005-06-08 |
CN100351868C (en) | 2007-11-28 |
WO2004093011A1 (en) | 2004-10-28 |
JP4120677B2 (en) | 2008-07-16 |
JPWO2004093011A1 (en) | 2006-07-06 |
EP1538562A4 (en) | 2005-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7672538B2 (en) | Generation of still image from a plurality of frame images | |
CN105165002B (en) | Image processing apparatus and image processing method | |
JP4489033B2 (en) | Frame rate conversion device, pan / tilt determination device and video device | |
CN103493473B (en) | Image processing apparatus, image processing method, image processing program and recording medium | |
KR101692227B1 (en) | A panorama image generation method using FAST algorithm | |
JP5359783B2 (en) | Image processing apparatus and method, and program | |
JP4480760B2 (en) | Image data processing method and image processing apparatus | |
US7834907B2 (en) | Image-taking apparatus and image processing method | |
JPH1091765A (en) | Device for synthesizing picture and method therefor | |
US20100123792A1 (en) | Image processing device, image processing method and program | |
WO1998012866A1 (en) | Image synthesizing device and method | |
CN109328454B (en) | Image processing apparatus | |
US6784927B1 (en) | Image processing apparatus and image processing method, and storage medium | |
JPH10178564A (en) | Panorama image generator and recording medium | |
US20120269444A1 (en) | Image compositing apparatus, image compositing method and program recording device | |
US8077982B2 (en) | Image match-point detection apparatus, image match-point detection method and storage medium | |
CN103024263B (en) | Image processing apparatus and image processing method | |
JPH0993430A (en) | Image synthesis method and image synthesizer | |
JPWO2007074605A1 (en) | Image processing method, image processing program, image processing apparatus, and imaging apparatus | |
US20080107357A1 (en) | Image Processing Apparatus, Image Processing Method, and Computer Program | |
JP2004272751A (en) | Generating still images from multiple frame images | |
JP2006287589A (en) | Image processing method and image processing apparatus | |
US20100182460A1 (en) | Image processing apparatus | |
JP2003078808A (en) | Device and method for detecting motion vector, device and method for correcting camera shake and imaging apparatus | |
JP2006033232A (en) | Image processing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SEIKO EPSON CORPORATION,JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AISO, SEIJI;REEL/FRAME:017611/0693 Effective date: 20041116 Owner name: SEIKO EPSON CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AISO, SEIJI;REEL/FRAME:017611/0693 Effective date: 20041116 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: CREDIT SUISSE AG, NEW YORK Free format text: SECURITY INTEREST;ASSIGNOR:ALCATEL-LUCENT USA INC.;REEL/FRAME:030510/0627 Effective date: 20130130 |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.) |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.) |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20180302 |