US9241111B1 - Array of cameras with various focal distances - Google Patents
- Publication number
- US9241111B1 (application US13/905,938)
- Authority
- US
- United States
- Prior art keywords
- image
- camera
- cameras
- array
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2621—Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
Definitions
- Cameras are now a standard feature on many electronic devices. Many devices have more than one camera, such as a front-facing camera and a rear-facing camera.
- FIG. 1 illustrates a system for taking photographs using an array of cameras set at different focal distances.
- FIG. 2 is a logarithmic graph showing depths of field that may be obtained for an example set of cameras set at four different focal distances.
- FIG. 3 illustrates the relationship between the distance between a lens-sensor pair and the resulting depth of field around the focal distance.
- FIGS. 4 to 8 illustrate an example camera assembly and its constituent components, including a monolithic lens array providing lenses with different depths of field and an array of image sensors fabricated on a single semiconductor wafer.
- FIG. 9 illustrates a color filter that may be interposed between the image sensors and lens array.
- FIGS. 10 to 12 illustrate examples of different arrangements for placement of various lenses within the array so as to minimize the computational complexity of corrections needed when images are combined.
- FIGS. 13 to 16 illustrate examples of a camera assembly where the difference in focal distance is set by positioning the image sensors at different distances from the lens array.
- FIGS. 17 to 20 illustrate examples of camera modules where each module forms a different camera group having a different focal distance.
- FIGS. 21 to 23 illustrate another example of camera modules where each module forms a different camera group having a different focal distance.
- FIG. 24 illustrates a method for capturing images using an array of lenses with different focal distances.
- FIG. 25 is a block diagram conceptually illustrating components of a device for composing a photograph from multiple images taken by an array of cameras set at different focal distances.
- FIG. 26 illustrates an example of a computer network for use with devices described in the present application.
- the rear-facing camera may be the thickest part of the whole device.
- Such cameras take up critical space in the device, as they must often be positioned in the bezel to avoid overlaying the main display screen or other components.
- a camera may include a lens and a sensor.
- An array camera may include a number of cameras arranged together to capture one or more images.
- Array cameras come in several varieties. In some versions of array cameras, the cameras are all on one piece of silicon with an array lens over it. In other versions, an array of individual camera modules is assembled. In one type of array camera, all of the cameras are identical RGB (Red-Green-Blue) cameras, each taking some image (e.g., a one-megapixel image) and each being pointed in the same direction.
- each camera can be corrected and the images combined to make a new image that has better resolution and less noise than the image from any individual camera.
- the cameras may be set to different exposures to get a high dynamic range effect in the combined image.
- one camera is red, another is blue, another is green, and perhaps a fourth is panchromatic. This pattern may be repeated, for example, 3 or 4 times, resulting in 12 or 16 cameras total.
- Each camera of the array of cameras may be made up of a blue, a green, and a red channel, each channel corresponding to a dedicated image sensor and lens pair.
- a panchromatic grayscale channel may also be included in each camera to improve overall sensitivity.
- the images from all the different cameras will have slight differences in perspective. They are combined by piece-wise realigning all of the images before adding them to form a composite. If each camera has a different color for an individual pixel, it is necessary to go through and assign a color to each composite pixel.
- Array cameras can be a fraction of the thickness of conventional designs.
- the individual cameras are individually smaller than a conventional camera (e.g., half as tall).
- although each individual image sensor may be of lower resolution than an equivalent conventional camera, the images captured may be assembled into a high-resolution image.
- each image sensor may be one megapixel; by combining images from an array of 4, 9, or 16 cameras, the combined image may be 5, 8, or 13 megapixels (for example). That is, by assembling many images in combination, a significant portion of the total resolution of all the cameras is retained, although there are some losses due to image redundancy (e.g., sixteen one-megapixel cameras might yield only a 12-megapixel combined image).
- the noise may be reduced by a factor of the square root of N, where N is the number of cameras.
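As a quick sanity check of the square-root-of-N claim, the following sketch (not from the patent; the noise level and array size are arbitrary assumed values) averages sixteen synthetic noisy captures and measures the noise reduction:

```python
import numpy as np

# Illustrative only: averaging N independent noisy captures of the same
# scene reduces additive noise by roughly sqrt(N).
rng = np.random.default_rng(0)
scene = np.full((64, 64), 128.0)                      # flat gray "scene"
N = 16                                                # e.g., a 4x4 camera array
captures = scene + rng.normal(0.0, 10.0, (N, 64, 64)) # sigma = 10 per camera

print(captures[0].std())            # single camera: ~10
print(captures.mean(axis=0).std())  # combined: ~2.5, i.e. sqrt(16) = 4x lower
```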
- Array cameras may be constructed by fabricating the sensors on one piece of silicon and then placing an array of lenses over it, where those lenses are built as an array. Constructed as a monolithic piece of glass or plastic, or as a composite of both (plastic lenses on a glass substrate), individual lenses cannot easily be moved up or down to adjust focus without moving the entire array. Moreover, in view of the scale of the array, a relatively large mechanism would be needed to make such an adjustment, and tilting of the array relative to the semiconductor substrate would need to be avoided.
- an array arrangement may be good for outdoor scenes and landscapes, but produces fuzzy images at distances closer than the near limit of the depth of field.
- each camera or each group of cameras has similar lenses, with different cameras or groups set at different focal distances.
- the techniques discussed herein may be used with any of the above (or other) varieties of array camera and lens construction.
- array cameras typically make do with the large depth of field afforded by the small format of each individual camera of the array.
- while a lens can precisely focus at only one distance at a time, the decrease in sharpness is gradual on each side of the focal distance, so that within the depth of field (DOF) the softened sharpness is imperceptible under normal viewing conditions. Note that for the same field of view and same focal ratio, a smaller camera will have a larger depth of field.
- the lenses on the array cameras may be set so that the lenses are at the hyperfocal distance.
- the lens is set at a focal distance that puts everything from infinity to some near point within the depth of field. That is, for example, everything from 1.5 meters to infinity will appear to be in focus, and the image will degrade, becoming out of focus, as the object distance decreases below the near point (in this example, 1.5 meters).
- the hyperfocal distance is the nearest focal distance at which the depth of field extends to infinity. Focusing beyond the hyperfocal distance does not increase the far depth of field, but does decrease the depth of field as the near edge of the depth of field advances farther away from the camera.
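The standard thin-lens approximations make these relationships concrete. The sketch below computes the hyperfocal distance and the near/far DOF limits; the 3 mm focal length, f/2.4 aperture, and 0.003 mm circle of confusion are assumed example values for a small-format camera, not figures from the patent:

```python
import math

def hyperfocal(f_mm: float, f_number: float, coc_mm: float) -> float:
    """Hyperfocal distance H = f^2 / (N * c) + f, in millimeters."""
    return f_mm ** 2 / (f_number * coc_mm) + f_mm

def dof_limits(f_mm: float, f_number: float, coc_mm: float, focus_mm: float):
    """Near and far depth-of-field limits for a given focus distance."""
    h = hyperfocal(f_mm, f_number, coc_mm)
    near = focus_mm * (h - f_mm) / (h + focus_mm - 2 * f_mm)
    far = math.inf if focus_mm >= h else focus_mm * (h - f_mm) / (h - focus_mm)
    return near, far

h = hyperfocal(3.0, 2.4, 0.003)
print(h)                               # ~1253 mm: focusing here puts ...
print(dof_limits(3.0, 2.4, 0.003, h))  # ... everything from ~H/2 to infinity in focus
```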
- At least one group of cameras is set up with a different focal distance from another group.
- one group may be set at the hyperfocal distance, providing an in-focus depth of field from 1 m to infinity, while a second group is set for a closer focal distance.
- a second group or groups of cameras may then be configured for a much closer focal distance, for example at a point where everything from 3 cm to 6 cm is in focus.
- Such a group might be used for “macro” photography, and would facilitate reading things at close proximity, such as bar codes, etc.
- Additional groups may be included.
- a third group or groups and a fourth group or groups could be set up for intermediate focal distances, to have in-focus depths of field from 5 cm to 20 cm, and from 15 cm to 2 m.
- images may be combined to capture more range.
- edge detection may be used to discriminate between the sharpness of edges, which can then be used to determine the best focus for that part of the image.
- Cameras may be statistically “weighted” as the images are combined based upon edge sharpness. Such statistical combining may produce a low noise image.
- a sharpening algorithm may then be run on the composite image to make an image look even sharper.
- a higher degree of sharpening can be applied to an image obtained from compositing images from across multiple focal distances due to lower noise and more varied data.
- if sharpening is overdone, however, noise can be enhanced and false edges can appear.
- Sharpness may be determined by one of many computational sharpness evaluation techniques such as edge-gradient magnitude or fractal dimension.
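As a concrete illustration of the edge-gradient idea, the sketch below scores a segment by its mean squared gradient magnitude (a Tenengrad-style variant; the metric choice and the test pattern are illustrative assumptions, not the patent's specific algorithm):

```python
import numpy as np

def sharpness(segment: np.ndarray) -> float:
    """Mean squared edge-gradient magnitude of a grayscale segment."""
    gy, gx = np.gradient(segment.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

# A hard edge scores higher than the same edge rendered as a soft ramp:
x = np.linspace(0.0, 1.0, 64)
step = np.tile((x > 0.5).astype(float), (64, 1))             # in focus
ramp = np.tile(np.clip((x - 0.4) / 0.2, 0.0, 1.0), (64, 1))  # defocused
print(sharpness(step) > sharpness(ramp))                     # True
```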
- the images may be combined using statistical techniques.
- the human eye is not as sensitive to detail in color as it is to detail in brightness.
- the weighting for chroma may be more balanced, not giving as much preference to images from the in-focus cameras for the combined image, since the eye is less sensitive to spatial resolution in the chroma channel but is sensitive to noise in the chroma channel. Such noise can be better reduced by statistically combining information from a greater number of cameras, even if the images used are somewhat out-of-focus.
- focal distances may be distributed across the array of cameras so that some are capable of infinity focus and some are capable of very-near macro focus and some are intermediate and the various combinations of focal distance are well distributed across the camera color types.
- FIG. 1 illustrates a system for capturing an image using an array of cameras having a plurality of focal distances.
- the device 100 containing the camera array is a handheld device and the camera array serves as a rear-facing camera.
- the camera array simultaneously photographs a subject 112 at a plurality of focal distances 110 a - d , processing ( 120 ) the multiple images to create a composite image ( 114 ).
- the processing includes capturing the images through lenses set at a plurality of different focal distances ( 122 ).
- the captured images are rectified ( 124 ), correcting differences in disparity and alignment between lenses and groups.
- the sharpness of each rectified image is evaluated ( 126 ), which may be done piece-wise in segments.
- the rectified images are then combined, applying statistical analysis to the sharpness of each segment. While out-of-focus images may contribute less to the final image, each segment of the composite image ( 114 ) may include data from all images.
- Segments may be very small (e.g., four spatially adjacent pixels per channel/image sensor), or may include a large portion of the image (e.g., 1/12 of the image). Analysis of the segments may include a larger area surrounding the segment (i.e., overlapping adjacent segments).
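Putting steps 122 to 128 together, a minimal sketch of the flow might look as follows. The `rectify` callback and the segment grid are hypothetical stand-ins for the device's calibration and segmentation, and the weighting scheme is one plausible choice, not the patent's exact method:

```python
import numpy as np

def sharpness(seg: np.ndarray) -> float:
    gy, gx = np.gradient(seg.astype(float))  # same score as the earlier sketch
    return float((gx ** 2 + gy ** 2).mean())

def compose(images, rectify, segments):
    """Rectify (124), evaluate sharpness piece-wise (126), combine (128)."""
    aligned = [rectify(im).astype(float) for im in images]
    out = np.zeros_like(aligned[0])
    for seg in segments:                     # e.g., seg = (slice(0, 32), slice(0, 32))
        scores = np.array([sharpness(im[seg]) for im in aligned]) + 1e-12
        weights = scores / scores.sum()      # sharper segments contribute more,
        out[seg] = sum(w * im[seg] for w, im in zip(weights, aligned))
    return out                               # but every image contributes something
```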
- the cameras may be arranged to provide a similar field of view and similar magnification, capturing substantially coextensive images at different focuses. "Similar" in this context means "substantially the same," with the differences being those inherent due to the spatial separation, orientation, and manufacturing tolerances of the various lenses in the array and camera components (e.g., image differences caused by parallax when a same scene is captured by separate lenses).
- the focal distances of the cameras are different, but the focal lengths of the lenses of the camera array may be the same (focal length being a measure of how strongly a lens converges or diverges light).
- the logarithmic graph 200 in FIG. 2 illustrates example depths of field when four sets of cameras are used.
- the depth of field of a first lens group ( 201 ) extends from 1.5 m to infinity (i.e., the first group is arranged at the hyperfocal distance).
- the depth of field of a second lens group ( 202 ) extends from 15 cm to 2 m, providing a DOF of 185 cm and 50 cm of overlap with the first lens group.
- the depth of field of a third lens group ( 203 ) extends from 5 cm to 20 cm, providing a DOF of 15 cm and 5 cm of overlap with the second lens group.
- the depth of field of a fourth lens group ( 204 ) extends from 3 cm to 6 cm, providing a DOF of 3 cm and 1 cm of overlap with the third lens group.
- FIG. 3 is an example further illustrating the relationship between lens position and depth of field.
- a semiconductor wafer 350 includes at least three image sensors 321 to 323 .
- Each of these image sensors is aligned with a respective lens 311 to 313 , and each lens may have a same focal length.
- the hyperfocal distance 301 is obtained by positioning the lens 311 near to the sensor 321 , providing the depth of field 201 .
- the next lens 312 is positioned further away from the sensor 322 , producing the intermediate depth of field 202 (the intermediate focal distance 302 being within the depth of field).
- a third lens 313 is arranged even further from the sensor 323 , with a focal distance 303 producing the shallowest depth of field 203 (i.e., shallowest in this illustration).
- FIGS. 4 and 5 illustrate a microlens array 410 where the different focal distances have been engineered directly into the array.
- the lens groups are arranged in columns.
- a first lens group 411 is arranged to provide a farthest focal distance.
- a second lens group 412 provides an intermediate focal distance that is closer than that of the first group.
- a third lens group 413 provides another intermediate focal distance that is between that of the second and fourth lens groups.
- a fourth lens group 414 provides the nearest focal distance.
- the focal lengths of the first, second, third, and fourth lens groups may be the same. Integrating the various focal distances into a monolithic lens array reduces the number of parts to align, thereby reducing a source of optical variations between cameras that can cause misregistration errors when images are combined.
- FIG. 6 illustrates a semiconductor wafer providing a sensor array 620 .
- the sensors are arranged into four camera groups ( 621 to 624 ). Each group has a red sensor, a green sensor, a blue sensor, and a panchromatic sensor, arranged to align with the lens array 410 .
- the sensors are tightly packed, which facilitates integration of the support circuitry for all the imagers and reduces spatial separation between camera groups (simplifying the combining of images). It also simplifies the number of external electrical connections that need to be made between the cameras and other circuitry and connectors. Areas between the image sensors where the lenses do not reach may be used for read-out circuitry, reducing the amount of read-out circuitry around the perimeter of the array. Also, integrating the cameras on a same substrate eliminates misregistration errors that flow from difference in alignment, rotation and angle that may occur when mounting separate cameras.
- a spacer 730 shown in FIG. 7 may be arranged between the sensor array 620 and the lens array 410 to form a camera assembly 800 shown in FIG. 8 .
- the spacer may include walls 732 to reduce optical crosstalk between image sensor-lens pairs.
- the spacer 730 may also serve other purposes, such as setting the focal distances and protecting the underlying sensors from direct contact with the lenses.
- the spacer 730 may or may not be a separate element, as it may be engineered into an underside of the lens array 410 , grown on the surface of the sensor array 620 (e.g., a planar layer of silicate glass grown on the semiconductor substrate, patterned by photolithography), or some combination thereof.
- Optimizing individual sensors for specific color channels may produce a saturated color image (optimized, for example, by adding bandpass coatings to individual image sensors and/or bandgap tuning the sensors for particular wavelength ranges). Also, individually configuring each sensor may reduce the potential for color channel crosstalk, where filtered light of one color bleeds over to a sensor for another.
- a composite filter can be interposed between the lenses and sensors.
- the composite filter includes a filter 960 r that provides the red channel, a filter 960 g that provides the green channel, a filter 960 b that provides the blue channel, and a clear filter 960 p (or no filter) that provides the panchromatic channel.
- the effect of misregistering the superimposed images on each other is an important consideration.
- the distance an image object is from the camera array is consequential to misregistration, as objects far from the camera array exhibit little to no disparity, whereas objects near to the camera array will exhibit high disparity.
- disparity is readily demonstrated by binocular disparity, where the closer an object comes to a person's nose, the more difficult it becomes to reconcile the difference in perspective between left and right eyes.
- the impact of disparity is greatest within camera groups having short focal distances, since their depth of field may capture near-objects in focus, whereas distant objects have little to no disparity (the separation between cameras may be small compared to the distance to the objects).
- disparity has the least impact on camera groups having far focal distances, since their deep depth of field may capture distant objects in focus, whereas the nearest objects in focus are more distant than the near objects of the short focal-distance cameras.
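Under a simple pinhole approximation, disparity in pixels is roughly focal length times baseline, divided by object distance and pixel pitch. The numbers below are assumed, illustrative values for a small array camera, not dimensions from the patent:

```python
# Pinhole approximation: disparity_px ~= f * B / (Z * pixel_pitch).
f_mm, baseline_mm, pitch_mm = 3.0, 10.0, 0.0014   # 1.4 um pixels (assumed)

def disparity_px(distance_mm: float) -> float:
    return f_mm * baseline_mm / (distance_mm * pitch_mm)

print(disparity_px(50))        # macro subject at 5 cm: ~429 px of disparity
print(disparity_px(10_000))    # subject at 10 m:       ~2 px, nearly negligible
```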
- FIGS. 10 and 11 illustrate lens arrays 1010 and 1110 like that in FIG. 4 , but in which the lenses have been arranged to improve composite imaging.
- the near focal distance/DOF lenses 414 have been clustered adjacent to each other, whereas the lenses with the farthest focal distance/DOF have the largest spatial separation of any group of lenses in the array.
- FIG. 12 is a similar example: a lens array 1210 for a camera that has three lens groups of three channels each, demonstrating that the lens-placement optimization concept applies without regard to the particular dimensions of the array.
- disparity between camera groups is also considered, with the spatial distance between camera groups having shorter focal distances (e.g., 413 , 414 ) being minimized, such that disparity is reduced both within the individual camera groups (between channels) and between the camera groups.
- FIGS. 13 to 23 illustrate some other examples of how camera assemblies may be configured.
- each sensor array forms a camera group consisting of a red sensor 621 r , a green sensor 621 g , a blue sensor 621 b , and a panchromatic sensor 621 p .
- a microlens array 1410 with identical lenses 1411 may be used, the different focal distances instead being achieved by positioning the image sensors at different distances from the lens array.
- in the previous example, by contrast, the sensors were planar and the different focal distances were engineered into the lens array.
- a spacer 1530 , which may include walls 1532 to reduce channel crosstalk, is sandwiched between the lens array 1410 and the mount 1350 to form a camera assembly 1600 as shown in FIG. 16 .
- An advantage of camera assembly 1600 is that it enables the use of a generic lens array 1410 and single-camera sensor arrays 1321 , while still providing the different focal distances. In particular, this may be an advantage because both the lens array 1410 and sensor arrays 1321 may be manufactured at higher yields than the comparatively complex lens array 410 and larger-area sensor array 620 used for camera assembly 800 , thereby reducing component costs. Trade-offs include the need for a relatively complex mount 1350 and spacer 1530 , increased potential for image misregistration (e.g., more potential sources of misalignment between cameras), and the additional complexity involved in combining the components to form camera assembly 1600 .
- FIGS. 17 to 20 illustrate another alternative design.
- in this design, as shown in FIG. 17 , separate lens arrays 1711 to 1714 are fabricated, each having a different focal distance. However, the lenses of lens arrays 1711 to 1714 may all have a same focal length.
- Each lens array is aligned with its own sensor array 1820 ( FIG. 18 ), providing red ( 621 r ), green ( 621 g ), blue ( 621 b ), and panchromatic ( 621 p ) channels.
- Sandwiching a spacer 1930 ( FIG. 19 , shown with walls 1932 ) between each lens array and sensor array produces a plurality of modules 2041 to 2044 , as shown in FIG. 20 .
- spacer 1930 may be part of the sensor array 1820 , the lens arrays 1711 to 1714 , or a combination thereof.
- the modules may be mounted on or bonded to a substrate or adhered to a case to form a camera assembly.
- This approach allows groups with different focal distances to be mixed-and-matched at a time of device assembly. Also, this approach enables construction of a sparse array where camera groups are separated across the face of the device (e.g., along the edges). For example, the camera groups could be scattered across the back of a tablet computer with several centimeters between arrays. While this complicates image processing, it provides higher-range depth information and such an arrangement allows device designers to spread out the cameras within the device to locations where there is free or available space.
- The example of FIGS. 21 to 23 is similar to the previous example, but uses a uniform lens array 2210 of identical lenses 2211 ( FIG. 22 ), relying on spacers 2131 to 2134 to set the focal distances.
- Modules 2341 to 2344 may be mounted on or soldered to a substrate or adhered to a case to form a camera assembly. Like the previous example, the modules can be mixed-and-matched and used to make sparse arrays.
- FIG. 24 illustrates a method for capturing images using an array of lenses with different focal distances and expands on the algorithm shown in the system in FIG. 1 .
- the separation produces alignment and parallax errors that may require correction. How large the errors are depends in part on how far away a captured scene is from the camera array. The further away the subject is, the smaller the image registration errors. At infinity, there would be no superposition errors between cameras besides errors produced by manufacturing defects.
- Misregistration errors appear as a fuzziness when the superimposed images are stacked on each other.
- This fuzziness may be mitigated by piece-wise realigning each image. This may be done, for example, by applying an auto-correlation technique, and may be performed in two steps.
- In a first step ( 2440 ), known errors of the camera are corrected. For example, some cameras may be pointing in different directions because the lens array was not perfect or because a monolithic piece of silicon was not used. Also, some cameras may be rotated about the optical axis, which may involve computational correction. (The lack of rotational errors is an advantage of a monolithic sensor array, but even then there may be other small errors.) These errors may be reduced by calibrating the cameras at the time of manufacturing, and remaining errors (such as pointing errors) may be corrected by computationally simple transformations.
- misregistrations are corrected. Misregistrations are dependent on object distance.
- Objects that require no shift or adjustment can be assumed to be at infinity (in terms of distance from the lenses). But at some threshold distance it becomes necessary to start adjusting the image in order to get the different images to align. The objects that require the most shifting may be relatively close to the cameras. Since the separation between the camera groups and individual image sensors is known, the amount of disparity (how much image segments must be shifted to align the images) can be worked backwards: the degree of disparity correlates with distance, which can be used to construct a distance map, sometimes referred to as a "Z map" (where Y is up/down, X is right/left, and Z is distance away).
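One standard way to measure the per-segment shifts this step needs is phase correlation; the following NumPy sketch (an illustrative technique choice, not necessarily the patent's) recovers the integer translation between two image pieces:

```python
import numpy as np

def shift_between(a: np.ndarray, b: np.ndarray):
    """Estimate the integer (dy, dx) translation taking a to b by
    phase correlation, a standard correlation-based alignment method."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = np.conj(A) * B
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:               # map wrapped indices
        dy -= a.shape[0]                   # to signed shifts
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

a = np.zeros((32, 32)); a[10:14, 8:12] = 1.0
b = np.roll(a, (3, -2), axis=(0, 1))       # b is a shifted by (3, -2)
print(shift_between(a, b))                 # -> (3, -2)
```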
- the distance map has uses beyond image correction. For example, in a video conference where the person involved is only one meter away, everything further than two meters may be regarded as unneeded background. Rather than wasting bandwidth to send the portions of the image conveying background, the data corresponding to image segments exceeding a threshold distance (e.g., two meters) may be dropped, or something else, like a solid color or other image, can be "underlaid" behind the person.
- Distance maps can also be used when augmenting the image with additional objects or information and used to obtain the correct perspective and obscuration for integrating the additional objects within the original image.
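Tying the last few points together, this sketch converts per-segment disparity into a coarse Z map and blanks segments beyond a 2 m threshold; the camera geometry and disparity values reuse the assumed numbers from the earlier disparity sketch:

```python
import numpy as np

f_mm, baseline_mm, pitch_mm = 3.0, 10.0, 0.0014   # assumed geometry

def z_map(disparity_px: np.ndarray) -> np.ndarray:
    """Distance in mm from pixel disparity; zero disparity -> infinity."""
    with np.errstate(divide="ignore"):
        return f_mm * baseline_mm / (disparity_px * pitch_mm)

disp = np.array([[30.0, 30.0, 0.5],
                 [30.0, 30.0, 0.5]])   # person in front, wall far behind
z = z_map(disp)                        # ~0.7 m for the person, ~43 m for the wall
frame = np.full(disp.shape, 200.0)     # stand-in image data
frame[z > 2000.0] = 0.0                # drop background beyond 2 m
print(frame)
```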
- sharpness may be evaluated by one of many computational sharpness evaluation techniques, such as edge-gradient magnitude or fractal dimension.
- segments evaluated may be either small or large in terms of pixel count. If small segments are used, additional boundary pixels that overlap and include surrounding segments may be used to avoid undersampling. Overlap may also be used with large segments, but in terms of area evaluated, the overlapping boundary pixels may be a majority of the area evaluated for a small segment, whereas they may be a relatively small addition to a large sample.
- the luminance/brightness is weighted ( 2480 ) by giving greater statistical weight to segments having better sharpness, but when chroma/color is weighted ( 2482 ), the preference given to segments having better sharpness is reduced in comparison to the weighting used for luminance, increasing the chroma contribution of less sharp segments to the composite image. That is to say, sharper segments may still contribute more to chroma than less sharp segments, but the statistical weighting between segments is more balanced than that used for luminance. In the alternative, chroma/color may simply be averaged.
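A minimal sketch of that two-track weighting, assuming per-image sharpness scores for a segment and pre-separated luma/chroma planes; the exponents are arbitrary choices to make luminance weighting steeper than chroma weighting, not values from the patent:

```python
import numpy as np

def combine_luma_chroma(luma_stack, chroma_stack, scores):
    """luma/chroma stacks: (N, H, W) arrays of one segment from N cameras."""
    s = np.asarray(scores, dtype=float)
    w_luma = s ** 2 / (s ** 2).sum()        # strong preference for sharp segments
    w_chroma = s ** 0.5 / (s ** 0.5).sum()  # flatter weighting for color
    luma = np.tensordot(w_luma, luma_stack, axes=1)
    chroma = np.tensordot(w_chroma, chroma_stack, axes=1)
    return luma, chroma

# e.g., sharpness scores [9, 1, 1] give luma weights ~[0.98, 0.01, 0.01]
# but chroma weights ~[0.6, 0.2, 0.2]: color still draws on every camera.
```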
- When aligning edges, some edges will have higher gradients or higher contrast than others. This information can be used to bias the combination so as to preserve the highest gradient of the edge, at least in the luminance space (the "black-and-white" or grayscale brightness part of a picture).
- noise in color benefits from statistical color smoothing: stacking all of the images for color makes the color smooth and true, while keeping the detail in the luminance so that the image looks sharp.
- color detail is an order of magnitude less important than luminance detail.
- An example of a statistical technique that may be used when images are combined ( 128 ) is variable-pixel linear reconstruction.
- the composite image may be upscaled and sharpened ( 2430 ).
- Combining, upscaling, and/or sharpening may include the application of a “superresolution” technique.
- Upscaling/upsampling may be performed as the images are combined, sub-sampling pixels across image segments. This may include application of a superresolution technique, sub-sampling across segments to produce a higher-resolution image.
- An advantage of upscaling by sub-sampling across multiple segments is that it takes advantage of the difference between images to more effectively interpolate pixels.
- upscaling may be performed on the composite image, which may include application of a single-frame superresolution technique. With either approach to upscaling, additional sharpening may be performed on the upscaled composite image.
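In the spirit of variable-pixel linear reconstruction, the sketch below scatters each low-resolution pixel onto a finer grid using its known sub-pixel shift and normalizes by the accumulated weights. It is a simplified illustration under assumed inputs (shifts already recovered by rectification), not the production algorithm:

```python
import numpy as np

def reconstruct(frames, shifts, scale=2):
    """Scatter low-res pixels onto a scale-x finer grid, then normalize."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    wsum = np.zeros_like(acc)
    for frame, (dy, dx) in zip(frames, shifts):
        ys = np.round((np.arange(h)[:, None] + dy) * scale).astype(int)
        xs = np.round((np.arange(w)[None, :] + dx) * scale).astype(int)
        Y, X = np.broadcast_arrays(np.clip(ys, 0, h * scale - 1),
                                   np.clip(xs, 0, w * scale - 1))
        np.add.at(acc, (Y, X), frame)    # add.at accumulates duplicates correctly
        np.add.at(wsum, (Y, X), 1.0)
    return acc / np.maximum(wsum, 1e-9)  # unfilled cells stay 0 (no data)

# e.g., four quarter-pixel-offset captures fill a 2x grid:
frames = [np.random.rand(8, 8) for _ in range(4)]
out = reconstruct(frames, [(0, 0), (0, 0.5), (0.5, 0), (0.5, 0.5)])
print(out.shape)                         # (16, 16)
```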
- FIG. 25 is a block diagram conceptually illustrating a system 2500 including a multi-focal-distance image compositing device 2510 for carrying out the processes shown in FIG. 1 and FIG. 24 . Some or all of the components of system 2500 may be built into the device 100 in FIG. 1 . Aspects of the system 2500 include computer-readable and computer-executable instructions that may reside on the device 2510 .
- FIG. 25 illustrates a number of components that may be included in the system 2500 with the image compositing device 2510 ; however, other non-illustrated components may also be included. Also, some of the illustrated components may not be present in every device capable of employing the general concepts of the system for taking photographs using an array of cameras set at different focal distances. Further, some components that are illustrated in the image compositing device 2510 as a single component may appear multiple times in a single device. For example, the device 2510 may include multiple input/output device interfaces 2502 or multiple controllers/processors 2504 .
- Multiple image compositing devices 2510 may be employed in a system 2500 .
- the image compositing devices 2510 may include different components for performing different aspects of the image compositing process.
- the multiple devices may include overlapping components.
- the image compositing device 2510 as illustrated in FIG. 25 is exemplary, and may be a stand-alone device or may be included, in whole or in part, as a component of a larger device or system.
- the various components illustrated as part of device 2510 may be spread across multiple devices.
- the concepts disclosed herein may be applied within a number of different devices and computer systems, including, for example, digital cameras, cellular phones, personal digital assistants (PDAs), tablet computers, wearable computers with a head-mounted camera and display, other mobile devices, etc.
- the image compositing device 2510 may also be a component of other devices or systems that may provide processing services to a device containing a camera via a network, including general-purpose computing systems, server-client computing systems, mainframe computing systems, telephone computing systems, laptop computers, etc.
- the system 2500 including the image compositing device 2510 may include a plurality of cameras 2520 a and 2520 b , where each camera has a different focal distance.
- Each camera 2520 may include a plurality of image sensors, each sensor being aligned with and paired with a lens.
- the individual sensors may be of any design, such as charge-coupled device (CCD) image sensor or an active-pixel sensor (APS).
- each camera would comprise a red sensor, a green sensor, a blue sensor, and a panchromatic sensor (in other words, in FIG. 6 , a column of sensors). This is also true for FIGS. 13-16 (a row of sensors).
- each camera would be a module. While these examples all include an array of four image sensors in the camera, other arrangements are also possible, such as if each camera includes a single panchromatic sensor, a polychromatic sensor, or an array of three color sensors.
- the system 2500 may also include a display 2512 of any suitable technology, such as a liquid crystal display, an organic light emitting diode display, electronic paper, an electrochromic display, a cathode ray tube display, a field emission display, a pico projector or other suitable components for displaying images and/or video.
- the display 2512 and cameras 2520 a /b may each be integrated with the image compositing device 2510 or may be separate.
- the image compositing device 2510 may also include an address/data bus 2524 for conveying data among components of the image compositing device 2510 .
- Each component within the device 2510 may also be directly connected to other components in addition to (or instead of) being connected to other components across the bus 2524 .
- the image compositing device 2510 may include a controller/processor 2504 that may include one or more central processing units (CPUs) for processing data and computer-readable instructions, and a memory 2506 for storing data and instructions.
- the memory 2506 may include volatile random access memory (RAM), non-volatile read only memory (ROM), and/or other types of memory.
- the image compositing device 2510 may also include a data storage component 2508 , for storing data and instructions.
- the data storage component 2508 may include one or more storage types such as magnetic storage, optical storage, solid-state storage, etc.
- the device 2510 may also be connected to removable or external memory and/or storage (such as a removable memory card, memory key drive, networked storage, etc.) through the input/output device interfaces 2502 .
- Computer instructions for processing by the controller/processor 2504 for operating the device 2510 and its various components may be executed by the controller/processor 2504 and stored in the memory 2506 , storage 2508 , an external device, or in memory/storage included in the image processing module 2530 discussed below.
- some or all of the executable instructions may be embedded in hardware or firmware in addition to or instead of software.
- the systems, processes, and algorithms disclosed herein may be implemented in various combinations of software, firmware, and/or hardware.
- the image compositing device 2510 includes input/output device interfaces 2502 .
- a variety of input/output devices may be included in the device.
- Example input devices include an audio capture device such as a microphone, additional cameras 2520 which are included in the array, and an additional camera unrelated to the array (such as a front-side camera on a handheld device if the rear-side camera is the array).
- Example output devices include the display 2512 and an audio output device such as a speaker.
- the input/output device interfaces 2502 may also include an interface for an external peripheral device connection such as universal serial bus (USB), FireWire, Thunderbolt or other connection protocol.
- the input/output device interfaces 2502 may also include a network connection such as an Ethernet port, modem, etc.
- the input/output device interfaces 2502 may also include a wireless communication device, such as radio frequency (RF), infrared, Bluetooth, wireless local area network (WLAN) (such as WiFi), or wireless network radio, such as a radio capable of communication with a wireless communication network such as a Long Term Evolution (LTE) network, WiMAX network, 3G network, etc.
- the image compositing device 2510 further includes an image processing module 2530 for combining the images from the array of cameras set at different focal distances.
- the image processing module 2530 performs the image processing aspects described in the processes in FIGS. 1 and 24 .
- a rectification engine 2534 rectifies ( 124 ) the various images so that they may be superimposed.
- a sharpness evaluation engine 2536 performs a piece-wise evaluation of the sharpness of the rectified images ( 126 ) by image segment.
- An image compositing engine 2538 applies a statistical technique to combine the images ( 128 ) based on the evaluation of sharpness for each segment. Some or all of the upscaling and sharpening functionality may also be included in the image compositing engine 2538 .
- the image processing module may also include an image sharpening engine (not shown) to sharpen the composite image ( 2430 ).
- multiple devices may contain components of the system 2500 and the devices may be connected over a network 2602 .
- Network 2602 may include a local or private network or may include a wide network such as the internet.
- Devices may be connected to the network 2602 through either wired or wireless connections.
- wireless device 2604 may be connected to the network 2602 through a wireless service provider.
- Other devices, such as computer 2612 may connect to the network 2602 through a wired connection.
- Other devices, such as laptop 2608 or tablet computer 2610 may be capable of connection to the network 2602 using various connection methods including through a wireless service provider, over a WiFi connection, or the like.
- Networked devices may acquire the images from an array of cameras, and separately process and composite those images.
- Input and output devices may be connected to networked devices either through a wired or wireless connection.
- one device may include the array of cameras and another device may composite the images.
- wireless device 2604 , wireless headset 2606 , laptop 2608 , tablet computer 2610 , wired headset 2614 , or camera 2618 might contain the array of cameras
- computer 2612 , computer 2614 , or server 2616 might contain the image processing module 2530 . Because compositing the images may involve significant computational resources, in terms of both storage and processing power, such split configurations may be employed where the image acquisition device has lower processing capabilities than a remote device.
- the microlens arrays may be manufactured using techniques similar to those used in fabricating silicon wafers. For example, a large diameter piece of glass (e.g., 200 or 300 mm) can be used as a substrate, and then the lenses can either be etched out of the glass, grown on the surface, or otherwise deposited thereon. Another approach is to form the array using a mold (e.g., using all plastic, or pressing hot glass, or molding plastic over glass). The spacing between lenses is easily controlled using either approach.
- the lenses of the microlens arrays discussed herein may be one optical element, or each may be a stack of multiple optical elements. Likewise, multiple layers of microlenses may be aligned to form the microlens array as a multi-lens stack. Also, while the figures illustrate the lenses as being arranged in a grid pattern, other array patterns may be used, such as hexagonal arrays (with the associated image sensors arranged in a corresponding pattern).
- sparse arrays may be beneficial when adding cameras to devices where internal space is limited.
- the arrangement of sparse arrays may or may not be in a geometric pattern, and may appear to be random.
- a sparse array is distinguished from other designs by including at least one lens-sensor pair that is not contiguous or adjacent with any other lens of the camera array.
- while the examples have the cameras of the camera array oriented in a same direction (e.g., as when arranged as a back-side camera), the sensors and lenses might also be spread out with different orientations internal to a device, using prisms to redirect light from one or more apertures to the different camera modules.
- polychromatic image sensors might also be used, such as integrated red-green-blue sensors (e.g., Bayer-type sensors), where the different colors share a common lens.
- a camera of the array might be a single-channel image sensor (e.g., panchromatic) paired with a lens.
- the image sensors may have an identical density of pixels, or the density of pixels may be different based on the focusing properties of the corresponding lenses. However, the sensors may still have a same total number of pixels and each lens-sensor pair may still provide the same magnification and field-of-view as the sensor-lens pairs in the other cameras of the array.
- images may be captured simultaneously, or at least, substantially simultaneously, so as to minimize motion-induced misregistration. “Substantially simultaneous” capture may occur if, for example, individual images are sequentially buffered, stored, and/or transmitted as they are acquired (e.g., in a network-distributed system where the capturing device has limited memory and processing power).
- the multi-focal-distance image compositing device of the present disclosure may be implemented as a computer method, a system or as an article of manufacture such as a memory device or non-transitory computer readable storage medium.
- the computer readable storage medium may be readable by a computer and may comprise instructions for causing a computer or other device to perform processes described in the present disclosure.
- the computer readable storage medium may be implemented by a volatile computer memory, non-volatile computer memory, hard drive, solid-state memory, flash drive, removable disk and/or other media.
- the term “a” or “one” may include one or more items unless specifically stated otherwise. Further, the phrase “based on” is intended to mean “based at least in part on” unless specifically stated otherwise.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Studio Devices (AREA)
- Cameras In General (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/905,938 US9241111B1 (en) | 2013-05-30 | 2013-05-30 | Array of cameras with various focal distances |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/905,938 US9241111B1 (en) | 2013-05-30 | 2013-05-30 | Array of cameras with various focal distances |
Publications (1)
Publication Number | Publication Date |
---|---|
US9241111B1 (en) | 2016-01-19 |
Family
ID=55071522
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/905,938 (US9241111B1, Expired - Fee Related) | Array of cameras with various focal distances | 2013-05-30 | 2013-05-30 |
Country Status (1)
Country | Link |
---|---|
US (1) | US9241111B1 (en) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150260965A1 (en) * | 2014-03-17 | 2015-09-17 | Canon Kabushiki Kaisha | Multi-lens optical apparatus |
US20150326801A1 (en) * | 2014-05-06 | 2015-11-12 | Kalpana Seshadrinathan | Rectification techniques for heterogeneous camera arrays |
US20150334309A1 (en) * | 2014-05-16 | 2015-11-19 | Htc Corporation | Handheld electronic apparatus, image capturing apparatus and image capturing method thereof |
US20150358542A1 (en) * | 2014-06-06 | 2015-12-10 | Canon Kabushiki Kaisha | Image processing apparatus, image capturing apparatus, image processing method, image capturing method, and non-transitory computer-readable medium for focus bracketing |
US20160050245A1 (en) * | 2014-08-18 | 2016-02-18 | Cisco Technology, Inc. | Region on Interest Selection |
US20160112650A1 (en) * | 2014-10-17 | 2016-04-21 | The Lightco Inc. | Methods and apparatus for supporting burst modes of camera operation |
WO2017189104A1 (en) * | 2016-04-28 | 2017-11-02 | Qualcomm Incorporated | Parallax mask fusion of color and mono images for macrophotography |
US20180033155A1 (en) * | 2016-07-26 | 2018-02-01 | Qualcomm Incorporated | Systems and methods for compositing images |
US9971130B1 (en) | 2016-12-13 | 2018-05-15 | Industrial Technology Research Institute | Composite array camera lens module |
CN108965686A (en) * | 2017-05-17 | 2018-12-07 | 中兴通讯股份有限公司 | The method and device taken pictures |
US10834310B2 (en) | 2017-08-16 | 2020-11-10 | Qualcomm Incorporated | Multi-camera post-capture image processing |
CN113826376A (en) * | 2019-05-24 | 2021-12-21 | Oppo广东移动通信有限公司 | User equipment and strabismus correction method |
US11205071B2 (en) * | 2018-07-16 | 2021-12-21 | Advanced New Technologies Co., Ltd. | Image acquisition method, apparatus, system, and electronic device |
US11265481B1 (en) * | 2017-06-27 | 2022-03-01 | Amazon Technologies, Inc. | Aligning and blending image data from multiple image sensors |
CN115226417A (en) * | 2021-02-20 | 2022-10-21 | 京东方科技集团股份有限公司 | Image acquisition device, image acquisition apparatus, image acquisition method, and image production method |
US11652653B1 (en) * | 2022-08-11 | 2023-05-16 | Sandipan Subir Chaudhuri | Conferencing between remote participants |
US20230353871A1 (en) * | 2020-05-30 | 2023-11-02 | Corephotonics Ltd. | Systems and methods for obtaining a super macro image |
CN119233065A (en) * | 2024-11-29 | 2024-12-31 | 复旦大学 | Image sensor matrix, electronic device and imaging method |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5682443A (en) * | 1993-03-24 | 1997-10-28 | Crosfield Electronics Limited | Image color modification method and apparatus employing unsharp masking |
US20080298714A1 (en) * | 2007-05-31 | 2008-12-04 | Core Logic, Inc. | Image edge correction apparatus and method |
US20120050562A1 (en) * | 2009-04-22 | 2012-03-01 | Raytrix Gmbh | Digital imaging system, plenoptic optical device and image data processing method |
US20140016016A1 (en) * | 2012-07-16 | 2014-01-16 | Alexander Berestov | System And Method For Effectively Implementing A Lens Array In An Electronic Device |
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10225444B2 (en) * | 2014-03-17 | 2019-03-05 | Canon Kabushiki Kaisha | Multi-lens optical apparatus |
US20150260965A1 (en) * | 2014-03-17 | 2015-09-17 | Canon Kabushiki Kaisha | Multi-lens optical apparatus |
US9501826B2 (en) * | 2014-05-06 | 2016-11-22 | Intel Corporation | Rectification techniques for heterogeneous camera arrays |
US20150326801A1 (en) * | 2014-05-06 | 2015-11-12 | Kalpana Seshadrinathan | Rectification techniques for heterogeneous camera arrays |
US20150334309A1 (en) * | 2014-05-16 | 2015-11-19 | Htc Corporation | Handheld electronic apparatus, image capturing apparatus and image capturing method thereof |
US20150358542A1 (en) * | 2014-06-06 | 2015-12-10 | Canon Kabushiki Kaisha | Image processing apparatus, image capturing apparatus, image processing method, image capturing method, and non-transitory computer-readable medium for focus bracketing |
US9607240B2 (en) * | 2014-06-06 | 2017-03-28 | Canon Kabushiki Kaisha | Image processing apparatus, image capturing apparatus, image processing method, image capturing method, and non-transitory computer-readable medium for focus bracketing |
US9628529B2 (en) * | 2014-08-18 | 2017-04-18 | Cisco Technology, Inc. | Region on interest selection |
US20160050245A1 (en) * | 2014-08-18 | 2016-02-18 | Cisco Technology, Inc. | Region on Interest Selection |
US20160112650A1 (en) * | 2014-10-17 | 2016-04-21 | The Lightco Inc. | Methods and apparatus for supporting burst modes of camera operation |
US10419672B2 (en) * | 2014-10-17 | 2019-09-17 | Light Labs Inc. | Methods and apparatus for supporting burst modes of camera operation |
US9912865B2 (en) * | 2014-10-17 | 2018-03-06 | Light Labs Inc. | Methods and apparatus for supporting burst modes of camera operation |
US20180270419A1 (en) * | 2014-10-17 | 2018-09-20 | Light Labs Inc. | Methods and apparatus for supporting burst modes of camera operation |
US10362205B2 (en) | 2016-04-28 | 2019-07-23 | Qualcomm Incorporated | Performing intensity equalization with respect to mono and color images |
US10341543B2 (en) | 2016-04-28 | 2019-07-02 | Qualcomm Incorporated | Parallax mask fusion of color and mono images for macrophotography |
WO2017189104A1 (en) * | 2016-04-28 | 2017-11-02 | Qualcomm Incorporated | Parallax mask fusion of color and mono images for macrophotography |
US10290111B2 (en) * | 2016-07-26 | 2019-05-14 | Qualcomm Incorporated | Systems and methods for compositing images |
US20180033155A1 (en) * | 2016-07-26 | 2018-02-01 | Qualcomm Incorporated | Systems and methods for compositing images |
US9971130B1 (en) | 2016-12-13 | 2018-05-15 | Industrial Technology Research Institute | Composite array camera lens module |
CN108965686A (en) * | 2017-05-17 | 2018-12-07 | 中兴通讯股份有限公司 | The method and device taken pictures |
US11265481B1 (en) * | 2017-06-27 | 2022-03-01 | Amazon Technologies, Inc. | Aligning and blending image data from multiple image sensors |
US10834310B2 (en) | 2017-08-16 | 2020-11-10 | Qualcomm Incorporated | Multi-camera post-capture image processing |
US11956527B2 (en) | 2017-08-16 | 2024-04-09 | Qualcomm Incorporated | Multi-camera post-capture image processing |
US11233935B2 (en) | 2017-08-16 | 2022-01-25 | Qualcomm Incorporated | Multi-camera post-capture image processing |
US11244158B2 (en) * | 2018-07-16 | 2022-02-08 | Advanced New Technologies Co., Ltd. | Image acquisition method, apparatus, system, and electronic device |
US11205071B2 (en) * | 2018-07-16 | 2021-12-21 | Advanced New Technologies Co., Ltd. | Image acquisition method, apparatus, system, and electronic device |
CN113826376B (en) * | 2019-05-24 | 2023-08-15 | Oppo广东移动通信有限公司 | User equipment and strabismus correction method |
CN113826376A (en) * | 2019-05-24 | 2021-12-21 | Oppo广东移动通信有限公司 | User equipment and strabismus correction method |
US20230353871A1 (en) * | 2020-05-30 | 2023-11-02 | Corephotonics Ltd. | Systems and methods for obtaining a super macro image |
US11962901B2 (en) * | 2020-05-30 | 2024-04-16 | Corephotonics Ltd. | Systems and methods for obtaining a super macro image |
US20240244320A1 (en) * | 2020-05-30 | 2024-07-18 | Corephotonics Ltd. | Systems and methods for obtaining a super macro image |
US12167130B2 (en) * | 2020-05-30 | 2024-12-10 | Corephotonics Ltd. | Systems and methods for obtaining a super macro image |
CN115226417A (en) * | 2021-02-20 | 2022-10-21 | 京东方科技集团股份有限公司 | Image acquisition device, image acquisition apparatus, image acquisition method, and image production method |
US11652653B1 (en) * | 2022-08-11 | 2023-05-16 | Sandipan Subir Chaudhuri | Conferencing between remote participants |
US20240056323A1 (en) * | 2022-08-11 | 2024-02-15 | Sandipan Subir Chaudhuri | Conferencing between remote participants |
CN119233065A (en) * | 2024-11-29 | 2024-12-31 | 复旦大学 | Image sensor matrix, electronic device and imaging method |
CN119233065B (en) * | 2024-11-29 | 2025-03-04 | 复旦大学 | Image sensor matrix, electronic device and imaging method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9241111B1 (en) | Array of cameras with various focal distances | |
US11856291B2 (en) | Thin multi-aperture imaging system with auto-focus and methods for using same | |
US10735635B2 (en) | Capturing and processing of images captured by camera arrays incorporating cameras with telephoto and conventional lenses to generate depth maps | |
US9866810B2 (en) | Optimization of optical systems for improved light field capture and manipulation | |
KR101517704B1 (en) | Image recording device and method for recording an image | |
US8478123B2 (en) | Imaging devices having arrays of image sensors and lenses with multiple aperture sizes | |
JP5399215B2 (en) | Multi-lens camera device and electronic information device | |
US9461083B2 (en) | Solid-state image pickup element and image pickup apparatus | |
US9049411B2 (en) | Camera arrays incorporating 3×3 imager configurations | |
US9055213B2 (en) | Systems and methods for measuring depth using images captured by monolithic camera arrays including at least one bayer camera | |
US9532033B2 (en) | Image sensor and imaging device | |
US20120274811A1 (en) | Imaging devices having arrays of image sensors and precision offset lenses | |
US8791403B2 (en) | Lens array for partitioned image sensor to focus a single image onto N image sensor regions | |
TW201605236A (en) | Compound eye camera | |
CN103842877A (en) | Imaging device and focus parameter value calculation method | |
US9386203B2 (en) | Compact spacer in multi-lens array module | |
CN103843320A (en) | Image sensor and imaging device |
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner name: AMAZON TECHNOLOGIES, INC., NEVADA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: BALDWIN, LEO BENEDICT; REEL/FRAME: 030516/0592. Effective date: 20130530
ZAAA | Notice of allowance and fees due | Free format text: ORIGINAL CODE: NOA
ZAAB | Notice of allowance mailed | Free format text: ORIGINAL CODE: MN/=.
ZAAA | Notice of allowance and fees due | Free format text: ORIGINAL CODE: NOA
STCF | Information on status: patent grant | Free format text: PATENTED CASE
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
FP | Lapsed due to failure to pay maintenance fee | Effective date: 20240119