US20070263099A1 - Ambient Light Rejection In Digital Video Images - Google Patents
Ambient Light Rejection In Digital Video Images
- Publication number
- US20070263099A1 (United States application US 11/460,884)
- Authority
- US
- United States
- Prior art keywords
- image
- digital
- scene
- images
- digital pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
Definitions
- the invention relates to digital video imaging and, in particular, to a method and a system for implementing ambient light rejection in digital video images.
- a digital imaging system for still or motion images uses an image sensor or a photosensitive device that is sensitive to a broad spectrum of light to capture an image of a scene.
- the photosensitive device reacts to light reflected from the scene and can translate the strength of that light into electronic signals that are digitized.
- an image sensor includes a two-dimensional array of light detecting elements, also called pixels, and generates electronic signals, also called pixel data, at each light detecting element that are indicative of the intensity of the light impinging upon each light detecting element.
- the sensor data generated by an image sensor is often represented as a two-dimensional array of pixel data.
- Digital imaging systems are often applied in computer vision or machine vision applications. Many computer vision and machine vision applications require detection of a specific scene and objects in the scene. To ensure detection accuracy, it is crucial that lighting conditions and motion artifacts in the scene do not affect the detection of the image object. However, in many real life applications, lighting changes and motion in the scene are unavoidable. Ambient light rejection has been developed to overcome the effect of lighting condition changes and motion artifacts in video images for improving the detection accuracy of objects in machine vision applications.
- Ambient light rejection refers to an imaging technique whereby active illumination is used to periodically illuminate a scene so as to enable the cancellation of the ambient light in the scene.
- the scene to be captured can have varying lighting conditions, from darkness to artificial light source to sunlight.
- Traditional ambient light rejection uses active illumination together with multiple captures where a scene is captured twice, once under the ambient lighting condition alone and once under the same ambient light plus a controlled external light source. This is often referred to as the “sequential frame ambient light rejection” method and is illustrated in FIG. 1 .
- a first image capture (Frame A) is made when there is no active illumination and a second image capture (Frame B) is made when active illumination, such as from an infrared light source, is provided.
- Frame A and frame B are sequential frames of video images. When the difference between the two image frames is taken, an output image that is illuminated under only the controlled external light source results. The ambient or surrounding light is thereby removed.
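To make the frame differencing concrete, here is a minimal sketch (not taken from the patent) of subtracting the ambient-only frame from the actively lit frame; it assumes 8-bit grayscale frames held as NumPy arrays and clamps negative differences to zero.

```python
import numpy as np

def reject_ambient(frame_ambient: np.ndarray, frame_lit: np.ndarray) -> np.ndarray:
    """Subtract the ambient-only frame (Frame A) from the actively lit frame (Frame B).

    Both inputs are assumed to be 8-bit grayscale images of the same shape.
    The result keeps only the actively illuminated component; negative
    differences (noise, motion) are clamped to zero.
    """
    diff = frame_lit.astype(np.int16) - frame_ambient.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

# Example usage with synthetic data
frame_a = np.random.randint(0, 200, (480, 640), dtype=np.uint8)                  # ambient only
frame_b = np.clip(frame_a.astype(np.int16) + 40, 0, 255).astype(np.uint8)        # ambient + active light
output = reject_ambient(frame_a, frame_b)                                        # 40 everywhere
```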
- the conventional ambient light rejection imaging systems have many disadvantages.
- Traditional imaging systems implementing ambient light rejection are typically limited by the imaging system's capture speed, so that multi-image capture is limited to the frame rate and only sequential frame ambient light rejection is possible.
- The sequential frame ambient light rejection technique can suffer from motion artifacts, especially when there is high speed motion in the scene.
- these imaging systems can be very computationally intensive and often require large amounts of memory to implement.
- the conventional ambient light rejection imaging systems often have a low signal-to-noise ratio, so that a strong active light source is required.
- Using strong IR illumination to overwhelm the ambient light is not practical in some applications because of risk of eye injury and expense.
- An imaging system enabling the implementation of ambient light rejection for a variety of lighting conditions and high speed motion in the scene is desired.
- a method for generating an ambient light rejected output image includes providing a sensor array including a two-dimensional array of digital pixels where the digital pixels output digital signals as digital pixel data representing the image of the scene, capturing a pair of images of a scene within the time period of a video frame using the sensor array where the pair of images includes a first image being illuminated by ambient light and a second image being illuminated by the ambient light and a light source, storing the digital pixel data associated with the first and second images in a data memory, and subtracting the first image from the second image to obtain the ambient light rejected output image.
- FIG. 1 is a timing diagram illustrating the sequential frames ambient light rejection technique.
- FIG. 2 is a block diagram illustrating a digital imaging system implementing ambient light rejection according to one embodiment of the present invention.
- FIG. 3 is a timing diagram illustrating the intra-frame image capture scheme for implementing ambient light rejection according to one embodiment of the present invention.
- FIG. 4 is a timing diagram illustrating a typical image capture operation of a DPS array.
- FIG. 5 is a timing diagram illustrating the image capture operation of a DPS array implementing intra-frame image capture according to one embodiment of the present invention.
- FIG. 6 is a timing diagram illustrating the multiple intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to another embodiment of the present invention.
- FIG. 7 is a timing diagram illustrating the anti-jamming intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to one embodiment of the present invention.
- FIG. 8 is a timing diagram illustrating the negative illumination intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to one embodiment of the present invention.
- FIG. 9 is a block diagram of a digital image sensor as described in U.S. Pat. No. 5,461,425 of Fowler et al.
- FIG. 10 is a functional block diagram of an image sensor as described in U.S. Pat. No. 6,975,355 of Yang et al.
- FIG. 11 is a diagram illustrating the frame differencing method using a look-up table according to one embodiment of the present invention.
- FIG. 12 illustrates the generation of the LUT output data value for each N-bit data input value according to one embodiment of the present invention.
- a digital imaging system implementing ambient light rejection using active illumination utilizes a digital pixel sensor to realize a high speed intra-frame capture rate.
- the active illumination is implemented using an external light source under the control of the digital imaging system to provide synchronized active illumination.
- the digital imaging system performs multiple captures of the scene to capture at least one ambient lighted image and at least one active-and-ambient lighted image within a video frame. The difference of the ambient lighted image and the active-and-ambient lighted image is obtained to generate an output image where the ambient light component of the image is removed and only the active illuminated component of the image remains.
- the digital imaging system provides high quality and high resolution ambient rejected images under a wide range of lighting conditions and fast motions in the scene.
- the digital imaging system of the present invention exploits the massively parallel analog-to-digital conversion capability of a digital pixel sensor to realize a high capture rate. Furthermore, multiple sampling can be applied to improve the dynamic range of the final images. In this manner, the digital imaging system of the present invention generates high resolution ambient light rejected images while avoiding many of the disadvantages of the conventional methods.
- the digital imaging system can capture a pair of ambient lighted and active-and-ambient lighted images in rapid succession so as to minimize the impact of lighting changes or motions in the scene. That is, each ambient lighted or active-and-ambient lighted image can be captured at a high capture speed and the pair of images can be captured as close in time as possible, with minimal delay between the ambient lighted and the active illuminated image captures.
- when the ambient lighted image and the active-and-ambient lighted image are captured in close temporal proximity, the two images can be in registration, or aligned, with each other so that ambient cancellation can be carried out effectively and a high resolution ambient rejected output image is obtained.
- when the digital imaging system is applied in a video imaging application, it can carry out multiple image captures within the standard video frame rate (e.g., 60 frames per second, or about 16.7 ms per frame). By performing multiple intra-frame image captures, each pair of ambient lighted and active-and-ambient lighted images can be taken in close temporal proximity to avoid the impact of lighting changes or motions in the scene.
- the digital imaging system of the present invention is capable of realizing a very high ambient light rejection ratio.
- the amount of light generated by the external light source is limited.
- the digital imaging system of the present invention achieves a high ambient light rejection ratio by minimizing the exposure time while increasing the peak external light source power. Because the digital imaging system is capable of a fast capture rate, it can operate with the shorter exposure time while still providing a high resolution image.
- the duty cycle of the imaging system is shortened to maintain the total power consumption over time.
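This exposure/peak-power trade-off can be checked with a short back-of-the-envelope calculation; the exposure times and power levels below are illustrative assumptions, not values from the patent.

```python
# Illustrative only: halving the exposure time while doubling the peak LED power
# keeps the optical energy delivered per capture (and the average power over a
# frame) roughly constant, while the shorter exposure admits less ambient light.
frame_period_s = 1 / 60          # one video frame

baseline_exposure_s = 1e-3       # assumed 1 ms exposure
baseline_peak_w     = 1.0        # assumed 1 W peak LED power

fast_exposure_s = baseline_exposure_s / 2
fast_peak_w     = baseline_peak_w * 2

for name, t, p in [("baseline", baseline_exposure_s, baseline_peak_w),
                   ("short exposure", fast_exposure_s, fast_peak_w)]:
    energy_per_capture = p * t            # joules of active light per capture
    duty_cycle = t / frame_period_s       # fraction of the frame the LED is on
    avg_power = p * duty_cycle            # average LED power over the frame
    print(f"{name}: {energy_per_capture*1e3:.2f} mJ/capture, "
          f"duty cycle {duty_cycle:.2%}, average power {avg_power*1e3:.1f} mW")
```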
- FIG. 2 is a block diagram illustrating a digital imaging system implementing ambient light rejection according to one embodiment of the present invention.
- a digital imaging system 100 in accordance with the present embodiment is implemented as a video imaging system.
- digital imaging system 100 can be implemented as a still image camera or a motion and still image camera.
- Digital imaging system 100 includes a digital image sensor subsystem 102 and a digital image processor subsystem 104 .
- Digital image sensor subsystem 102 and digital image processor subsystem 104 can be formed on a single integrated circuit or each subsystem can be formed as an individual integrated circuit.
- the digital image sensor subsystem and the digital image processor subsystem can be formed as a multi-chip module whereby each subsystem is formed as separate integrated circuits on a common substrate.
- digital image sensor subsystem 102 and digital image processor subsystem 104 are formed as two separate integrated circuits.
- the terms “digital image sensor 102” and “digital image processor 104” will be used to refer to the respective subsystems of digital imaging system 100 .
- the use of the terms “digital image sensor 102” and “digital image processor 104” is not intended to limit the implementation of the digital imaging system of the present invention to two integrated circuits only.
- Digital image sensor 102 is an operationally “stand-alone” imaging subsystem and is capable of capturing and recording image data independent of digital image processor 104 .
- Digital image sensor 102 operates to collect visual information in the form of light intensity values using an area image sensor, such as sensor array 210 , which includes a two-dimensional array of light detecting elements, also called photodetectors.
- Sensor array 210 collects image data under the control of a data processor 214 .
- image data collected by sensor array 210 are read out of the photodetectors and stored in an image buffer 212 .
- image buffer 212 includes enough memory space to store at least one frame of image data from sensor array 210 .
- data processor 214 of digital image sensor 102 provides a control signal on a bus 224 for controlling a light source 250 external to digital imaging system 100 to provide controlled active illumination. In this manner, data processor 214 asserts the control signal whenever active illumination is desired to enable an image capture in synchronization with the active illumination.
- the external light source is a light emitting diode (LED) and data processor 214 provides an LED control signal to cause LED 250 to turn on when active illumination is required.
- the external light source can be another suitable light source, such as an infrared (IR) source.
- Image data recorded by digital image sensor 102 is transferred through an image sensor interface circuit (IM I/F) 218 to digital image processor 104 .
- digital image sensor 102 and digital image processor 104 communicate over a pixel bus 220 and a serial peripheral interface (SPI) bus 222 .
- Pixel bus 220 is uni-directional and serves to transfer image data from digital image sensor 102 to digital image processor 104 .
- SPI bus 222 is a bi-directional bus for transferring instructions between the digital image sensor and the digital image processor.
- the communication interface between digital image sensor 102 and digital image processor 104 is a purely digital interface. Therefore, pixel bus 220 can implement high speed data transfer, allowing real time display of images captured by digital image sensor 102 .
- pixel bus 220 is implemented as a low-voltage differential signaling (LVDS) data bus. By using a LVDS data bus, very high speed data transfer can be implemented.
- SPI bus 222 is implemented as a four-wire serial communication and serial flash bus. In other embodiments, SPI bus 222 can be implemented as a parallel bi-directional control interface.
- Digital image processor 104 receives image data from digital image sensor 102 on pixel bus 220 .
- the image data is received at an image processor interface circuit (IP I/F) 226 and stored at a frame buffer 228 .
- Digital image processor 104, operating under the control of system processor 240, performs digital signal processing functions on the image data to provide output video signals in a predetermined video format. More specifically, the image data stored in frame buffer 228 is processed into video data in the desired video format through the operation of an image processor 230 .
- image processor 230 is implemented in part in accordance with commonly assigned and copending U.S. patent application Ser. No.
- image processor 230 can be configured to perform vertical interpolation and/or color interpolation (“demosaicing”) to generate full color video data, as described in the '868 application.
- image processor 230 under the command of system processor 240 , also operates to perform ambient light cancellation between a pair of ambient lighted and active-and-ambient lighted images, as will be described in more detail below.
- digital imaging system 100 is implemented using the video imaging system architecture described in commonly assigned and copending U.S. patent application Ser. No. 10/634,302, entitled “Video Imaging System Including A Digital Image Sensor and A Digital Signal Processor,” of Michael Frank et al., filed Aug. 4, 2003, which application is incorporated herein by reference in its entirety.
- the output video signals generated by image processor 230 can be used in any number of ways depending on the application.
- the signals can be provided to a television set for display.
- the output video signals can also be fed to a video recording device to be recorded on a video recording medium.
- when digital imaging system 100 is a video camcorder, the TV signals can be provided to a viewfinder on the camcorder.
- digital imaging system 100 generates video signals in either the NTSC video format or the PAL video format.
- digital imaging system 100 can be configured to support any video format, including digital television, and any number of video formats, as long as image processor 230 is appropriately configured, as described in detail in the aforementioned '868 application.
- Digital imaging system 100 uses a single image sensor to capture video images which are then processed into output video data in the desired video formats.
- digital image sensor 102 includes a sensor array 210 of light detecting elements (also called pixels) and generates digital pixel data as output signals at each pixel location.
- Digital image sensor 102 also includes image buffer 212 for storing at least one frame of digital pixel data from sensor array 210 and data processor 214 for controlling the capture and readout operations of the image sensor.
- Data processor 214 also controls the external light source 250 for providing active illumination.
- Digital image sensor 102 may include other circuitry not shown in FIG. 2 to support the image capture and readout operations of the image sensor.
- sensor array 210 of digital image sensor 102 is implemented as a digital pixel sensor (DPS).
- a digital pixel sensor refers to a CMOS image sensor with pixel level analog-to-digital conversion capabilities.
- a CMOS image sensor with pixel level analog-to-digital conversion is described in U.S. Pat. No. 5,461,425 of B. Fowler et al. (the '425 patent), which patent is incorporated herein by reference in its entirety.
- a digital pixel sensor provides a digital output signal at each pixel element representing the light intensity value detected by that pixel element.
- the combination of a photodetector and an analog-to-digital (A/D) converter in an area image sensor helps enhance detection accuracy, reduce power consumption, and improve overall system performance.
- a digital pixel sensor (DPS) array refers to a digital image sensor having an array of photodetectors where each photodetector produces a digital output signal.
- the DPS array implements the digital pixel sensor architecture illustrated in FIG. 9 and described in the aforementioned '425 patent.
- the DPS array of the '425 patent utilizes pixel level analog-to-digital conversion to provide a digital output signal at each pixel.
- the pixels of a DPS array are sometimes referred to as a “sensor pixel” or a “sensor element” or a “digital pixel,” which terms are used to indicate that each of the photodetectors of a DPS array includes an analog-to-digital conversion (ADC) circuit, as distinguished from a conventional pixel, which includes only a photodetector and produces an analog signal.
- the digital output signals of a DPS array have advantages over conventional analog signals in that the digital signals can be read out at a much higher speed than from a conventional image sensor.
- other schemes for implementing a pixel level A/D conversion in an area image sensor may also be used in the digital image sensor of the present invention.
- each of pixel elements 15 in sensor array 12 includes an ADC circuit.
- the image sensor of the present invention can employ other DPS architectures, including a shared ADC scheme.
- an ADC circuit is shared among a group of neighboring photodetectors. For example, in one embodiment, four neighboring photodetectors may share one ADC circuit situated in the center of the four photodetectors.
- the ADC circuit performs A/D conversion of the output voltage signal from each photodetector by multiplexing between the four photodetectors.
- the shared ADC architecture retains all the benefits of a pixel level analog-to-digital conversion while providing the advantages of consuming a much smaller circuit area, thus reducing manufacturing cost and improving yield. Above all, the shared ADC architecture allows a higher fill factor so that a larger part of the sensor area is available for forming the photodetectors.
- the ADC circuit of each digital pixel or each group of digital pixels is implemented using the Multi-Channel Bit Serial (MCBS) analog-to-digital conversion technique described in U.S. Pat. No. 5,801,657 of B. Fowler et al. (the '657 patent), which patent is incorporated herein by reference in its entirety.
- the MCBS ADC technique of the '657 patent can significantly improve the overall system performance while minimizing the size of the ADC circuit.
- an MCBS ADC has many advantages applicable to image acquisition and, more importantly, facilitates high-speed readout.
- the ADC circuit of each digital pixel or each group of digital pixels implements a thermometer-code analog-to-digital conversion technique with continuous sampling of the input signal for achieving a digital conversion with a high dynamic range.
- a massively parallel thermometer-code analog-to-digital conversion scheme is described in copending and commonly assigned U.S. patent application Ser. No. 10/185,584, entitled “Digital Image Capture having an Ultra-high Dynamic Range,” of Justin Reyneri et al., filed Jun. 26, 2002, which patent application is incorporated herein by reference in its entirety.
- digital image sensor 102 includes image buffer 212 as an on-chip memory for storing at least one frame of pixel data.
- image sensor 102 can also operate with an off-chip memory as the image buffer.
- the use of on-chip memory is not critical to the practice of the present invention.
- the incorporation of an on-chip memory in a DPS sensor alleviates the data transmission bottleneck problem associated with the use of an off-chip memory for storage of the pixel data.
- the integration of a memory with a DPS sensor makes feasible the use of multiple sampling for improving the quality of the captured images.
- Multiple sampling is a technique capable of achieving a wide dynamic range in an image sensor without many of the disadvantages associated with other dynamic range enhancement techniques, such as degradation in signal-to-noise ratio and increased implementation complexity.
- U.S. Pat. No. 6,975,355 entitled “Multiple Sampling via a Time-indexed Method to Achieve Wide Dynamic Ranges,” of David Yang et al., issued Dec. 13, 2005, describes a method for facilitating image multiple sampling using a time-indexed approach.
- the '355 patent is incorporated herein by reference in its entirety.
- FIG. 10 duplicates FIG. 3 of the '355 patent and shows a functional block diagram of an image sensor 300 which may be used to implement digital image sensor 102 in one embodiment.
- Image sensor 300 includes a DPS sensor array 302 which has an N by M array of pixel elements.
- Sensor array 302 employs either the dedicated ADC scheme or the shared ADC scheme and incorporates pixel level analog-to-digital conversion.
- a sense amplifier and latch circuit 304 is coupled to sensor array 302 to facilitate the readout of digital signals from sensor array 302 .
- the digital signals (also referred to as digital pixel data) are stored in digital pixel data memory 310 .
- image sensor 300 also includes a threshold memory 306 and a time index memory 308 coupled to sensor array 302 .
- Threshold memory 306 stores information of each pixel indicating whether the light intensity value measured by each pixel in sensor array 302 has passed a predetermined threshold level. The exposure time indicating when the light intensity measured by each pixel has passed the threshold level is stored in time index memory 308 .
- each pixel element in sensor array 302 can be individually time-stamped by threshold memory 306 and time index memory 308 and stored in digital pixel data memory 310 .
- a DPS image sensor employing multiple sampling is capable of recording 14 to 16 or more bits of dynamic range in the captured image, in contrast with the 10 bits of dynamic range attainable by conventional image sensors.
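A rough software model of this time-indexed multiple sampling is sketched below; the check times, full-scale value, and rescaling step are simplifying assumptions and do not reproduce the actual circuit behavior of the '355 patent.

```python
import numpy as np

def multiple_sampling_readout(photocurrent, t_checks, full_scale=1.0):
    """Toy model of time-indexed multiple sampling for wide dynamic range.

    photocurrent : per-pixel signal accumulation rate (arbitrary units per second)
    t_checks     : increasing exposure times at which each pixel is checked
    A pixel's value is latched at the last check before it would saturate, and the
    stored time index lets the value be rescaled to a common exposure afterwards.
    """
    t_checks = np.asarray(t_checks, dtype=float)
    t_max = t_checks[-1]
    # Latch every pixel at the first (shortest) check; very bright pixels that
    # already saturate here are simply clipped to full scale.
    value = np.minimum(photocurrent * t_checks[0], full_scale)
    t_index = np.full(photocurrent.shape, t_checks[0])
    for t in t_checks[1:]:
        accum = photocurrent * t
        ok = accum < full_scale          # pixel still below threshold: keep integrating
        value[ok] = accum[ok]
        t_index[ok] = t
    # A pixel's latched value and its time index together give a linear estimate
    # of scene radiance over the full exposure, extending the dynamic range.
    return value * (t_max / t_index)
```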
- digital image sensor 102 is a DPS image sensor and is implemented using the architecture of image sensor 300 of FIG. 10 to support multiple sampling for attaining a high dynamic range in image capture.
- digital image sensor 102 implements correlated double sampling for noise reduction.
- Correlated double sampling is an image processing technique employed to reduce kT/C or thermal noise and 1/f noise in an image sensor array.
- CDS can also be employed to compensate for any fixed pattern noise or variable comparator offset.
- the sensor array is reset and the pixel value at each photodetector is measured and stored in specified memory locations in the data memory (image buffer 212 ).
- the pixel values measured at sensor array reset are called “CDS values” or “CDS subtract values.”
- the stored CDS values are subtracted from the measured pixel intensity values to provide normalized pixel data free of errors caused by noise and offset.
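A minimal sketch of applying the stored CDS values is shown below; the array names and integer widths are assumptions made for illustration.

```python
import numpy as np

def apply_cds(raw_pixels: np.ndarray, cds_values: np.ndarray) -> np.ndarray:
    """Correlated double sampling: subtract the per-pixel reset (CDS) values,
    captured right after sensor array reset, from the measured intensity values.
    This cancels per-pixel offsets (kT/C noise, comparator offset, fixed pattern
    noise) that are common to both readings."""
    diff = raw_pixels.astype(np.int32) - cds_values.astype(np.int32)
    return np.clip(diff, 0, None)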
- Digital image processor 104 is a high performance image processor for processing pixel data from digital image sensor 102 into video images in a desired video format.
- digital image processor 104 implements signal processing functions for supporting an entire video signal processing chain.
- the image processing functions of digital image processor 104 include demosaicing, image scaling, and other high-quality video enhancements, including color correction, edge, sharpness, color fidelity, backlight compensation, contrast, and dynamic range extrapolation.
- the image processing operations are carried out at video rates.
- system processor 240 is implemented as an ARM (Advanced RISC Machine) processor.
- Firmware for supporting the operation of system processor 240 can be stored in a memory buffer.
- a portion of frame buffer 228 may be allocated for storing the firmware used by system processor 240 .
- System processor 240 operates to initialize and supervise the functional blocks of image processor 104 .
- digital image processor 104 also facilitates image capture operations for obtaining pairs of ambient lighted and active-and-ambient lighted images and processing the pairs of images to generate output images with ambient light cancellation.
- the operation of digital imaging system 100 for implementing ambient light rejection in the output video signals will be described in more detail below.
- FIG. 3 is a timing diagram illustrating the intra-frame image capture scheme for implementing ambient light rejection according to one embodiment of the present invention.
- a signal line 52 represents the image capture operation of digital imaging system 100 while a signal line 54 represents the active illumination control.
- a logical “low” level of signal lines 52 and 54 indicates respectively no image capture operation or the active illumination being turned off while a logical “high” level indicates respectively an image capture operation or the active illumination being turned on.
- an intra-frame image capture scheme is implemented to obtain a pair of ambient lighted and active-and-ambient lighted images, successively captured in close temporal proximity, within a single video frame image of a scene.
- a first image capture is carried out without active illumination to generate a first image with only ambient lighting.
- a second image capture is carried out with active illumination to generate a second image with active and ambient lighting. Both the first and second captures occur within a single video frame and the fast image capture is made possible by the high capture speed operation of digital image sensor 102 .
- Digital image processor 104 operates to subtract the first image from the second image to provide an ambient rejected output image.
- the image subtraction is performed pixel by pixel.
- an image subtraction method utilizing a look-up table is used to provide a fast and simple way to perform the image frame subtraction, as will be described in more detail below.
- the active illuminated image capture is carried out after the ambient lighted image capture.
- the order of the two image captures is not critical to the practice of the present invention; in other embodiments, the active illuminated image capture can be carried out before the ambient lighted image capture. It is only critical that a pair of ambient lighted and active-and-ambient lighted images is obtained in close temporal proximity to each other.
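The intra-frame sequence just described can be summarized in the following sketch; `sensor` and `led` and their methods are hypothetical placeholders for the sensor and LED control interfaces, not an API defined by the patent.

```python
def capture_intra_frame_pair(sensor, led):
    """One video frame: capture an ambient-only image and an ambient-plus-active
    image back to back, then return their difference (the ambient-rejected image).
    `sensor` and `led` are hypothetical driver objects used for illustration."""
    led.off()
    ambient = sensor.fast_capture()       # first capture: ambient light only

    led.on()                              # active illumination synchronized with
    lit = sensor.fast_capture()           # the second capture
    led.off()

    # Pixel-by-pixel subtraction; negative values are clamped to zero.
    return [[max(b - a, 0) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(ambient, lit)]
```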
- the intra-frame capture scheme provides significant advantages over the conventional sequential frame capture technique.
- Second, by capturing the pair of images in close temporal proximity, image artifacts due to motion in the scene are greatly reduced.
- digital imaging system 100 can provide an increased output frame rate to improve the resolution of the output video signals.
- digital imaging system 100 can be used for continuous video monitoring at a full video rate (e.g., 60 frames per second) even when there is motion in the scene at extreme speed (e.g., over 120 mph or about 200 km/h).
- FIG. 4 is a timing diagram illustrating a typical image capture operation of a DPS array.
- FIG. 5 is a timing diagram illustrating the image capture operation of a DPS array implementing intra-frame image capture according to one embodiment of the present invention.
- in a typical image capture operation, within the time period of a single video frame (1/60 sec. or 16.67 ms), a DPS array first performs a sensor array reset operation and then performs analog-to-digital conversion (ADC) to read out pixel data values for CDS. Then, image integration starts and the DPS array is exposed to the scene for a given exposure time. Each digital pixel integrates incident light impinging upon the DPS array. A mechanical shutter or an electronic shutter can be used to control the exposure time. Following image integration, the DPS array performs analog-to-digital conversion to read out pixel data values representing the image of the scene.
- the CDS data values and the image data values are stored in the image buffer and can be processed on-chip or off-chip to provide the normalized pixel data values.
- the intra-frame image capture scheme of the present invention implements a fast image capture operation where two fast image captures are performed within the time period of a single video frame (1/60 sec.).
- a fast image capture includes a reset operation, an image integration operation and an ADC data readout operation.
- a first fast image capture and a second fast image capture are carried out.
- One of the two fast image captures is synchronized with the activation of the external light source.
- the CDS operation is eliminated. This is because when performing ambient light cancellation, only the difference between the two images is important. Therefore, pixel data errors due to offset or noise will affect both images equally and most errors will be cancelled out when the ambient light component is removed from the final image.
- the capture time for each image capture can be shortened so that two fast image captures can fit within the time period of a single video frame.
- the two fast image captures of FIG. 5 can be carried out with zero or minimal delay between each capture so that the two captures are of essentially the same scene.
- the ADC operation is carried out using a single-capture-bit-serial (SCBS) conversion scheme or the MCBS conversion scheme described above, and an exposure time of 120 μs can be used for each image capture.
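To see that two such captures fit comfortably within one video frame, a rough timing budget can be worked out; only the 120 μs exposure comes from the text, while the reset and readout durations below are illustrative assumptions.

```python
frame_period_us = 1_000_000 / 60   # ≈ 16,667 µs per video frame
exposure_us     = 120              # per-capture exposure time stated in the text
reset_us        = 50               # assumed sensor reset time
readout_us      = 2_000            # assumed bit-serial ADC readout time

per_capture_us = reset_us + exposure_us + readout_us
pair_us = 2 * per_capture_us
print(f"pair of captures: {pair_us:.0f} µs of a {frame_period_us:.0f} µs frame "
      f"({pair_us / frame_period_us:.1%} of the frame period)")
# With these assumptions the two captures use only a fraction of the frame,
# leaving ample margin to take the pair in close temporal proximity.
```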
- the intra-frame capture scheme for ambient light cancellation is carried out by taking a single ambient lighted image and a single active-and-ambient lighted image within a single video frame.
- the intra-frame capture scheme of the present invention is extended to perform multiple intra-frame image capture for ambient light cancellation.
- the multiple intra-frame image capture scheme has the advantage of increasing the total integration time without incorporating undesirable motion artifacts.
- FIG. 6 is a timing diagram illustrating the multiple intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to another embodiment of the present invention.
- multiple sets of image captures are carried out to generate multiple pairs of ambient and active-and-ambient lighted images.
- For each pair, the ambient lighted image is subtracted from the active-and-ambient lighted image to generate a difference output image containing only the active illuminated component of the scene. Then, the difference output images of all image pairs are summed to provide a final ambient rejected output image.
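A sketch of this accumulation, assuming the pairs are available as NumPy arrays, is shown below; differencing each pair before summing is what keeps motion between pairs from smearing the result.

```python
import numpy as np

def multi_pair_ambient_rejection(pairs):
    """`pairs` is a sequence of (ambient, lit) image pairs captured within one
    video frame. Each pair is differenced individually and the differences are
    accumulated, increasing the effective integration of the active component."""
    total = None
    for ambient, lit in pairs:
        diff = np.clip(lit.astype(np.int32) - ambient.astype(np.int32), 0, None)
        total = diff if total is None else total + diff
    return total
```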
- the multiple intra-frame image capture scheme is particularly advantageous when the illumination provided by the external light source is limited so that a longer exposure time is desired to integrate the active illuminated image sufficiently.
- if the integration time alone is extended, the ambient light rejection ratio will be adversely affected and there will be more motion artifacts in the resulting image because the image integration extends over a longer period of time.
- the multiple image capture scheme of the present invention increases the effective integration time without creating motion induced artifacts, as each pair of ambient lighted and active-and-ambient lighted images is taken within close temporal proximity.
- FIG. 7 is a timing diagram illustrating the jamming resistant intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to one embodiment of the present invention.
- a pseudo-random delay is included between the start of each frame and the start of the first image capture. In this manner, the pair of image captures occurs at a pseudo-random delay time from the start of each video frame and the active illumination will also be provided in a pseudo-random manner.
- the intra-frame image capture scheme of the present invention is thereby provided with jamming resistant capability. It is now no longer possible for a jamming device to use a simple phase locked loop to lock in and synchronize to the active illumination.
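A minimal sketch of such pseudo-random scheduling follows; the upper bound on the delay is an assumed parameter chosen so that the capture pair still fits within the frame.

```python
import random

def schedule_capture_offsets(num_frames, frame_period_us=16_667, max_delay_us=4_000):
    """Return a pseudo-random start offset for the capture pair in each frame, so
    the active illumination cannot be tracked by a simple phase-locked loop.
    max_delay_us is an assumed upper bound that leaves room for both captures
    within the frame period."""
    return [random.uniform(0, max_delay_us) for _ in range(num_frames)]
```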
- in some applications, the digital imaging system of the present invention is used to capture a scene with a large field of view. In that case, it is sometimes desirable to block out part of the view, so as to provide privacy or to remove parts of the scene that are not of interest and are “distracting” to human or machine scene interpretation.
- the intra-frame image capture scheme for ambient light rejection is implemented using a second controlled light source to provide a second source of illumination to portions of the scene that are to be blocked out. This second source of illumination, applied to limited portions of the scene, is referred to herein as “negative” illumination.
- when the “negative” illumination is applied in the intra-frame image capture scheme as described below, the portions of the scene illuminated by the negative illumination will appear black in the final ambient rejected output image, and those portions of the scene are thereby blocked out in the final output image.
- FIG. 8 is a timing diagram illustrating the negative illumination intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to one embodiment of the present invention.
- a digital imaging system implements ambient light rejection by taking a first image capture with only ambient lighting and a second image capture with active illumination, with both image captures occurring within a single video frame, in the same manner as described above.
- in addition to the first external light source (such as LED 250 of FIG. 2 ) providing the active illumination to the entire scene, the digital imaging system is equipped with a second external light source (such as LED 252 of FIG. 2 ) providing a second source of illumination to selected portions of the scene.
- the second external light source 252 is also under the control of data processor 214 of digital image sensor 102 of the digital imaging system to be synchronized with the image captures of the image sensor. LED 252 is situated so that the second external light source is directed to only selected portions of the field of view of digital image sensor 102 desired to be blocked out.
- the second external light source 252 is synchronized to the first image capture of digital image sensor 102 .
- the second external light source 252 is activated in synchronization with the first image capture to provide a source of illumination to the selected portions of the scene during the first image capture. Those selected portions of the scene are thus illuminated by the ambient light as well as the “negative” illumination from the second external light source. The remaining portions of the scene are illuminated by the ambient light only. Then, at the second image capture, the first external light source is activated to provide active illumination to the entire scene.
- the first image capture provides a first image with the entire scene being ambient lighted and portions of the scene also lit by the “negative” illumination.
- the second image capture provides a second image with the entire scene being ambient lighted and active illuminated.
- the portions of the scene that are illuminated by the “negative” illumination will appear black and those portions of the scene are thus blocked out in the final ambient rejected output image.
- the level of “negative” illumination should match the ambient cancellation illumination for the selected parts of the scene to black out the image.
- the ambient cancellation illumination refers to the active illumination used to reject the ambient light and used to generate the ambient rejected output image.
- the level of “negative” illumination is determined by using saturated arithmetic in the subtraction process and making sure that the negative illumination component is greater than the active illumination component used for ambient cancellation.
- the level of “negative” illumination is determined by using a feedback loop to ensure that the relevant part of the scene to be blacked out possesses neither positive nor “negative” brightness.
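A small numeric sketch of the saturated subtraction that blacks out the negatively illuminated region is given below; the per-pixel intensity values are invented purely for illustration.

```python
import numpy as np

def ambient_reject_with_blackout(first, second):
    """first  = ambient (+ 'negative' illumination in the blocked region)
       second = ambient + active illumination (entire scene)
    Saturated subtraction clamps at zero, so wherever the negative illumination
    exceeds the active illumination the output is forced to black."""
    diff = second.astype(np.int32) - first.astype(np.int32)
    return np.clip(diff, 0, 255).astype(np.uint8)

ambient, active, negative = 80, 40, 60      # assumed per-pixel intensities
blocked = ambient_reject_with_blackout(np.array([ambient + negative]),
                                        np.array([ambient + active]))    # -> 0 (black)
visible = ambient_reject_with_blackout(np.array([ambient]),
                                       np.array([ambient + active]))     # -> 40
```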
- the digital imaging system of the present invention implementing ambient light rejection using active illumination and high speed intra-frame image capture can be advantageously applied to many applications.
- the digital imaging system is applied in an automobile for driver face monitoring.
- the digital imaging system can be used to capture images of the driver's face to determine if the driver may be falling asleep.
- the driver face monitoring application can also be used to identify the driver where the driver identification can be used to automatically set up the seat or mirror position preferences for the driver.
- the driver face monitoring application can also be extended for use in face recognition or monitoring of users of automatic teller machines.
- the face monitoring application can also be used as biometrics for access control.
- the digital imaging system of the present invention can also be applied to image objects formed using a retro-reflective medium.
- a retro-reflective medium refers to a reflective medium which provides high levels of reflectance along a direction back toward the source of the illuminating radiation.
- Automobile license plates are typically formed with retro-reflective material.
- the digital imaging system of the present invention can be applied to capture automobile license plates or other objects with retro-reflection.
- the digital imaging system can provide ambient-rejected images of the objects regardless of the lighting conditions or the motion the objects are subjected to.
- In the digital imaging system described above, two images are obtained and they need to be subtracted from each other to obtain the ambient rejected output image.
- the operation of subtracting two frames is referred to as frame differencing.
- Typically, frame differencing is performed using dedicated circuitry such as an adder circuit. Adding an adder circuit to the digital imaging system increases the complexity and cost of the system.
- the digital imaging system of the present invention implements frame differencing using a decoder look-up table (LUT).
- the frame differencing can be carried out without additional circuitry for pixel data subtraction.
- linearization of the pixel data can be carried out at the same time as the frame subtraction to further improve the speed of operation of the digital imaging system.
- digital imaging system 100 includes digital image sensor 102 for capturing and storing pixel data indicative of the image of a scene.
- Digital image sensor 102 includes a memory buffer 212 for storing at least one frame of pixel data where each pixel data is allocated N bits of memory space.
- image buffer 212 can include sufficient memory to store at least two frames of pixel data so that the pair of ambient lighted and active-and-ambient lighted images can be stored in image buffer 212 at the same time.
- image buffer 212 allocates N bits for storing pixel data for each pixel and each of the ambient lighted and active-and-ambient lighted images is stored using N/2 bits of memory.
- the combined N bits of pixel data of the ambient light image and active-and-ambient lighted images are used to index a look-up table.
- the output pixel data value from the look-up table is the difference between the ambient light image and active-and-ambient lighted images.
- the output pixel data value may also be linearized so that the linearization and the subtraction steps are combined in one look-up table operation.
- One advantage of the frame differencing scheme of the present invention is that it can be implemented using the memory space required for only one frame of N-bit image data, resulting in space and cost savings. In this manner, the digital imaging system of the present invention can use the same memory space to provide N-bit pixel data when ambient light rejection is not selected and to provide N/2-bit pixel data when ambient light rejection is selected.
- image buffer 212 stores pixel data in 12 bits and each of the ambient lighted and active-and-ambient lighted images is stored in 6 bits.
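The storage arrangement can be illustrated with a short sketch in which the two 6-bit captures of a pixel share one 12-bit word; the bit widths come from the text, while the packing order (first capture in the high bits) is an assumption consistent with the example of FIG. 11 .

```python
def pack_pixel_pair(first_6bit: int, second_6bit: int) -> int:
    """Store the ambient lighted (first) and active-and-ambient lighted (second)
    6-bit samples of one pixel in a single 12-bit word of the image buffer."""
    assert 0 <= first_6bit < 64 and 0 <= second_6bit < 64
    return (first_6bit << 6) | second_6bit

def unpack_pixel_pair(word_12bit: int) -> tuple[int, int]:
    """Recover the two 6-bit samples from the packed 12-bit word."""
    return (word_12bit >> 6) & 0x3F, word_12bit & 0x3F
```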
- FIG. 11 is a diagram illustrating the frame differencing method using a look-up table according to one embodiment of the present invention.
- a look-up table (LUT) 400 is generated where the LUT is indexed by an N-bit data input representing the N/2-bit pixel data associated with the two image captures of a pixel element.
- the LUT provides output data values indicative of the pixel value difference of the two image captures for that pixel element.
- the LUT 400 is generated using linearized pixel data values so that the output data values provided by the LUT represent linearized frame difference pixel data values.
- LUT 400 is indexed by a 12-bit data input.
- the 12-bit data input represents a 12-bit pixel data pair formed by combining the 6-bit pixel data of a pixel element from the first image capture and the 6-bit pixel data of the same pixel element from the second image capture.
- the first and second image captures refer to the ambient lighted and active-and-ambient lighted images.
- LUT 400 includes 4096 entries where each entry is uniquely associated with a 12-bit pixel data pair. The generation of the output data values of LUT 400 for each 12-bit pixel data pair is described with reference to FIG. 12 .
- FIG. 12 illustrates the generation of the LUT output data value for each N-bit data input value according to one embodiment of the present invention.
- the LUT output data value for each 12-bit data input is computed as follows.
- the N/2 bits of pixel data for a pixel element in the first image capture is linearized to provide a first pixel data value.
- the N/2 bits of pixel data for the same pixel element in the second image capture is linearized to provide a second pixel data value.
- the two linearized pixel data values are subtracted and the resulting value is used as the LUT output data value for the N-bit data input value.
- a 6-bit pixel data “010111” from the first image capture is linearized to a first pixel data value of 23 and a 6-bit pixel data “101101” from the second image capture is linearized to a second pixel data value of 45.
- the two linearized pixel data values are subtracted to yield an output data value of 22.
- the data value of 22 is stored in an entry 410 of LUT 400 , where entry 410 is indexed by the 12-bit LUT data input of “010111101101”, as shown in FIG. 11 .
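Putting these steps together, the 4096-entry table might be generated as sketched below. The linearization function is left as an identity placeholder (in the worked example the 6-bit codes map directly to the values 23 and 45), and clamping the difference at zero is an added assumption.

```python
def build_frame_difference_lut(linearize=lambda code: code):
    """Build the 4096-entry LUT indexed by the packed 12-bit pixel pair.
    Each entry holds linearize(second) - linearize(first), i.e. the
    ambient-rejected pixel value, clamped at zero."""
    lut = [0] * 4096
    for first in range(64):
        for second in range(64):
            index = (first << 6) | second
            lut[index] = max(linearize(second) - linearize(first), 0)
    return lut

lut = build_frame_difference_lut()
# Worked example from the text: first capture 0b010111 (23), second 0b101101 (45)
index = (0b010111 << 6) | 0b101101   # 12-bit input "010111101101"
assert lut[index] == 22              # 45 - 23
```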
- the digital imaging system of the present invention can perform ambient light rejection without added computational burden.
- the frame differencing scheme using a look-up table of the present invention allows the digital imaging system of the present invention to perform ambient light rejection at high speed and without complex circuitry.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
Description
- This application claims the benefit of U.S. Provisional Patent Application Ser. No. 60/746,815, filed on May 9, 2006, having the same inventorship hereof, which application is incorporated herein by reference in its entirety.
- The invention relates to digital video imaging and, in particular, to a method and a system for implementing ambient light rejection in digital video images.
- A digital imaging system for still or motion images uses an image sensor or a photosensitive device that is sensitive to a broad spectrum of light to capture an image of a scene. The photosensitive device reacts to light reflected from the scene and can translate the strength of that light into electronic signals that are digitized. Generally, an image sensor includes a two-dimensional array of light detecting elements, also called pixels, and generates electronic signals, also called pixel data, at each light detecting element that are indicative of the intensity of the light impinging upon each light detecting element. Thus, the sensor data generated by an image sensor is often represented as a two-dimensional array of pixel data.
- Digital imaging systems are often applied in computer vision or machine vision applications. Many computer vision and machine vision applications require detection of a specific scene and objects in the scene. To ensure detection accuracy, it is crucial that lighting conditions and motion artifacts in the scene do not affect the detection of the image object. However, in many real life applications, lighting changes and motion in the scene are unavoidable. Ambient light rejection has been developed to overcome the effect of lighting condition changes and motion artifacts in video images for improving the detection accuracy of objects in machine vision applications.
- Ambient light rejection refers to an imaging technique whereby active illumination is used to periodically illuminate a scene so as to enable the cancellation of the ambient light in the scene. The scene to be captured can have varying lighting conditions, from darkness to artificial light sources to sunlight. Traditional ambient light rejection uses active illumination together with multiple captures where a scene is captured twice, once under the ambient lighting condition alone and once under the same ambient light plus a controlled external light source. This is often referred to as the “sequential frame ambient light rejection” method and is illustrated in FIG. 1 . As shown in FIG. 1 , a first image capture (Frame A) is made when there is no active illumination and a second image capture (Frame B) is made when active illumination, such as from an infrared light source, is provided. Frame A and Frame B are sequential frames of video images. When the difference between the two image frames is taken, an output image that is illuminated under only the controlled external light source results. The ambient or surrounding light is thereby removed.
- The conventional ambient light rejection imaging systems have many disadvantages. Traditional imaging systems implementing ambient light rejection are typically limited by the imaging system's capture speed, so that multi-image capture is limited to the frame rate and only sequential frame ambient light rejection is possible. The sequential frame ambient light rejection technique can suffer from motion artifacts, especially when there is high speed motion in the scene. Also, these imaging systems can be very computationally intensive and often require large amounts of memory to implement.
- Furthermore, the conventional ambient light rejection imaging systems often have a low signal-to-noise ratio, so that a strong active light source is required. Using strong IR illumination to overwhelm the ambient light is not practical in some applications because of the risk of eye injury and expense.
- An imaging system enabling the implementation of ambient light rejection for a variety of lighting conditions and high speed motion in the scene is desired.
- According to one embodiment of the present invention, a method for generating an ambient light rejected output image includes providing a sensor array including a two-dimensional array of digital pixels where the digital pixels output digital signals as digital pixel data representing the image of the scene, capturing a pair of images of a scene within the time period of a video frame using the sensor array where the pair of images includes a first image being illuminated by ambient light and a second image being illuminated by the ambient light and a light source, storing the digital pixel data associated with the first and second images in a data memory, and subtracting the first image from the second image to obtain the ambient light rejected output image.
- The present invention is better understood upon consideration of the detailed description below and the accompanying drawings.
- FIG. 1 is a timing diagram illustrating the sequential frames ambient light rejection technique.
- FIG. 2 is a block diagram illustrating a digital imaging system implementing ambient light rejection according to one embodiment of the present invention.
- FIG. 3 is a timing diagram illustrating the intra-frame image capture scheme for implementing ambient light rejection according to one embodiment of the present invention.
- FIG. 4 is a timing diagram illustrating a typical image capture operation of a DPS array.
- FIG. 5 is a timing diagram illustrating the image capture operation of a DPS array implementing intra-frame image capture according to one embodiment of the present invention.
- FIG. 6 is a timing diagram illustrating the multiple intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to another embodiment of the present invention.
- FIG. 7 is a timing diagram illustrating the anti-jamming intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to one embodiment of the present invention.
- FIG. 8 is a timing diagram illustrating the negative illumination intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to one embodiment of the present invention.
- FIG. 9 is a block diagram of a digital image sensor as described in U.S. Pat. No. 5,461,425 of Fowler et al.
- FIG. 10 is a functional block diagram of an image sensor as described in U.S. Pat. No. 6,975,355 of Yang et al.
- FIG. 11 is a diagram illustrating the frame differencing method using a look-up table according to one embodiment of the present invention.
- FIG. 12 illustrates the generation of the LUT output data value for each N-bit data input value according to one embodiment of the present invention.
- In accordance with the principles of the present invention, a digital imaging system implementing ambient light rejection using active illumination utilizes a digital pixel sensor to realize a high speed intra-frame capture rate. The active illumination is implemented using an external light source under the control of the digital imaging system to provide synchronized active illumination. The digital imaging system performs multiple captures of the scene to capture at least one ambient lighted image and at least one active-and-ambient lighted image within a video frame. The difference of the ambient lighted image and the active-and-ambient lighted image is obtained to generate an output image where the ambient light component of the image is removed and only the active illuminated component of the image remains. The digital imaging system provides high quality and high resolution ambient rejected images under a wide range of lighting conditions and fast motions in the scene.
- The digital imaging system of the present invention exploits the massively parallel analog-to-digital conversion capability of a digital pixel sensor to realize a high capture rate. Furthermore, multiple sampling can be applied to improve the dynamic range of the final images. In this manner, the digital imaging system of the present invention generates high resolution ambient light rejected images while avoiding many of the disadvantages of the conventional methods.
- More specifically, by using a digital pixel sensor to realize a very high image capture rate, the digital imaging system can capture a pair of ambient lighted and active-and-ambient lighted images in rapid succession so as to minimize the impact of lighting changes or motions in the scene. That is, each ambient lighted or active-and-ambient lighted image can be captured at a high capture speed and the pair of images can be captured as close in time as possible, with minimal delay between the ambient lighted and the active illuminated image captures. When the ambient lighted image and the active-and-ambient lighted image are captured in close temporal proximity, the two images can be in registration, or aligned, with each other so that ambient cancellation can be carried out effectively and a high resolution ambient rejected output image is obtained.
- In particular, when the digital imaging system is applied in a video imaging application, the digital imaging system can carry out multiple image captures within the standard video frame rate (e.g., 60 frames per second, or about 16.7 ms per frame). By performing multiple intra-frame image captures, each pair of ambient lighted and active-and-ambient lighted images can be taken in close temporal proximity to avoid the impact of lighting changes or motions in the scene.
- Furthermore, the digital imaging system of the present invention is capable of realizing a very high ambient light rejection ratio. In many applications, due to safety and power consumption concerns, the amount of light generated by the external light source is limited. When the light provided by the external light source is weak, it becomes challenging to distinguish the active-illuminated component of an image in the presence of bright ambient light, such as direct sunlight. The digital imaging system of the present invention achieves a high ambient light rejection ratio by minimizing the exposure time while increasing the peak external light source power. Because the digital imaging system is capable of a fast capture rate, it can operate with the shorter exposure time while still providing a high resolution image. When the peak external light source power is increased, the duty cycle of the imaging system is shortened to maintain the total power consumption over time.
- System Overview
-
FIG. 2 is a block diagram illustrating a digital imaging system implementing ambient light rejection according to one embodiment of the present invention. Referring to FIG. 2, a digital imaging system 100 in accordance with the present embodiment is implemented as a video imaging system. In other embodiments, digital imaging system 100 can be implemented as a still image camera or a combined motion and still image camera. Digital imaging system 100 includes a digital image sensor subsystem 102 and a digital image processor subsystem 104. Digital image sensor subsystem 102 and digital image processor subsystem 104 can be formed on a single integrated circuit, or each subsystem can be formed as an individual integrated circuit. In other embodiments, the digital image sensor subsystem and the digital image processor subsystem can be formed as a multi-chip module whereby each subsystem is formed as a separate integrated circuit on a common substrate. In the present embodiment, digital image sensor subsystem 102 and digital image processor subsystem 104 are formed as two separate integrated circuits. In the present description, the terms "digital image sensor 102" and "digital image processor 104" will be used to refer to the respective subsystems of digital imaging system 100. The use of the terms "digital image sensor 102" and "digital image processor 104" is not intended to limit the implementation of the digital imaging system of the present invention to two integrated circuits only. -
Digital image sensor 102 is an operationally "stand-alone" imaging subsystem and is capable of capturing and recording image data independent of digital image processor 104. Digital image sensor 102 operates to collect visual information in the form of light intensity values using an area image sensor, such as sensor array 210, which includes a two-dimensional array of light detecting elements, also called photodetectors. Sensor array 210 collects image data under the control of a data processor 214. At a predefined frame rate, image data collected by sensor array 210 are read out of the photodetectors and stored in an image buffer 212. Typically, image buffer 212 includes enough memory space to store at least one frame of image data from sensor array 210. - In the present embodiment,
data processor 214 of digital image sensor 102 provides a control signal on a bus 224 for controlling a light source 250 external to digital imaging system 100 to provide controlled active illumination. In this manner, data processor 214 asserts the control signal whenever active illumination is desired to enable an image capture synchronized with the active illumination. In the present embodiment, the external light source is a light emitting diode (LED) and data processor 214 provides an LED control signal to cause LED 250 to turn on when active illumination is required. In other embodiments, the external light source can be another suitable light source, such as an infrared (IR) illuminator. - Image data recorded by
digital image sensor 102 is transferred through an image sensor interface circuit (IM I/F) 218 to digital image processor 104. In the present embodiment, digital image sensor 102 and digital image processor 104 communicate over a pixel bus 220 and a serial peripheral interface (SPI) bus 222. Pixel bus 220 is uni-directional and serves to transfer image data from digital image sensor 102 to digital image processor 104. SPI bus 222 is a bi-directional bus for transferring instructions between the digital image sensor and the digital image processor. In digital imaging system 100, the communication interface between digital image sensor 102 and digital image processor 104 is a purely digital interface. Therefore, pixel bus 220 can implement high speed data transfer, allowing real time display of images captured by digital image sensor 102. - In one embodiment,
pixel bus 220 is implemented as a low-voltage differential signaling (LVDS) data bus. By using an LVDS data bus, very high speed data transfer can be implemented. Furthermore, in one embodiment, SPI bus 222 is implemented as a four-wire serial communication and serial flash bus. In other embodiments, SPI bus 222 can be implemented as a parallel bi-directional control interface. -
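As a rough sense of scale only — the disclosure does not specify the array size, bit depth, or link rate of this interface — a back-of-the-envelope estimate of the raw pixel throughput such a digital link must carry might look like this:

width, height = 640, 480        # assumed sensor resolution
bits_per_pixel = 12             # assumed pixel data width
frames_per_sec = 60             # two intra-frame captures would double this figure
payload_bps = width * height * bits_per_pixel * frames_per_sec
print(f"~{payload_bps / 1e6:.0f} Mbit/s of raw pixel data")   # ~221 Mbit/s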
Digital image processor 104 receives image data from digital image sensor 102 on pixel bus 220. The image data is received at an image processor interface circuit (IP I/F) 226 and stored at a frame buffer 228. Digital image processor 104, operating under the control of system processor 240, performs digital signal processing functions on the image data to provide output video signals in a predetermined video format. More specifically, the image data stored in frame buffer 228 is processed into video data in the desired video format through the operation of an image processor 230. In one embodiment, image processor 230 is implemented in part in accordance with commonly assigned and copending U.S. patent application Ser. No. 10/174,868, entitled "A Multi-Standard Video Image Capture Device Using A Single CMOS Image Sensor," of Michael Frank and David Kuo, filed Jun. 16, 2002 (the '868 application), which application is incorporated herein by reference in its entirety. For example, image processor 230 can be configured to perform vertical interpolation and/or color interpolation ("demosaicing") to generate full color video data, as described in the '868 application. - In accordance with the present invention,
image processor 230, under the command of system processor 240, also operates to perform ambient light cancellation between a pair of ambient lighted and active-and-ambient lighted images, as will be described in more detail below. - In one embodiment,
digital imaging system 100 is implemented using the video imaging system architecture described in commonly assigned and copending U.S. patent application Ser. No. 10/634,302, entitled “Video Imaging System Including A Digital Image Sensor and A Digital Signal Processor,” of Michael Frank et al., filed Aug. 4, 2003, which application is incorporated herein by reference in its entirety. - The output video signals generated by
image processor 230 can be used in any number of ways depending on the application. For example, the signals can be provided to a television set for display. The output video signals can also be fed to a video recording device to be recorded on a video recording medium. When digital imaging system 100 is a video camcorder, the TV signals can be provided to a viewfinder on the camcorder. - In the present description,
digital imaging system 100 generates video signals in either the NTSC video format or the PAL video format. However, this is illustrative only, and in other embodiments, digital imaging system 100 can be configured to support any video format, including digital television, and any number of video formats, as long as image processor 230 is appropriately configured, as described in detail in the aforementioned '868 application. - The detailed structure and operation of
digital imaging system 100 for implementing ambient light rejection will now be described with reference to FIG. 2 and the remaining figures. - Digital Image Sensor
-
Digital imaging system 100 uses a single image sensor to capture video images which are then processed into output video data in the desired video formats. Specifically, digital image sensor 102 includes a sensor array 210 of light detecting elements (also called pixels) and generates digital pixel data as output signals at each pixel location. Digital image sensor 102 also includes image buffer 212 for storing at least one frame of digital pixel data from sensor array 210 and data processor 214 for controlling the capture and readout operations of the image sensor. Data processor 214 also controls the external light source 250 for providing active illumination. Digital image sensor 102 may include other circuitry not shown in FIG. 2 to support the image capture and readout operations of the image sensor. - In the present embodiment,
sensor array 210 of digital image sensor 102 is implemented as a digital pixel sensor (DPS). A digital pixel sensor refers to a CMOS image sensor with pixel level analog-to-digital conversion capabilities. A CMOS image sensor with pixel level analog-to-digital conversion is described in U.S. Pat. No. 5,461,425 of B. Fowler et al. (the '425 patent), which patent is incorporated herein by reference in its entirety. A digital pixel sensor provides a digital output signal at each pixel element representing the light intensity value detected by that pixel element. The combination of a photodetector and an analog-to-digital (A/D) converter in an area image sensor enhances detection accuracy, reduces power consumption, and improves overall system performance. - In the present description, a digital pixel sensor (DPS) array refers to a digital image sensor having an array of photodetectors where each photodetector produces a digital output signal. In one embodiment of the present invention, the DPS array implements the digital pixel sensor architecture illustrated in
FIG. 9 and described in the aforementioned '425 patent. The DPS array of the '425 patent utilizes pixel level analog-to-digital conversion to provide a digital output signal at each pixel. The pixels of a DPS array are sometimes referred to as a "sensor pixel," a "sensor element," or a "digital pixel," which terms are used to indicate that each of the photodetectors of a DPS array includes an analog-to-digital conversion (ADC) circuit and is thus distinguishable from a conventional sensor pixel, which includes only a photodetector and produces an analog output signal. The digital output signals of a DPS array have the advantage over conventional analog signals that they can be read out at a much higher speed than the signals of a conventional image sensor. Of course, other schemes for implementing pixel level A/D conversion in an area image sensor may also be used in the digital image sensor of the present invention. - In the digital pixel sensor architecture shown in
FIG. 9 , a dedicated ADC scheme is used. That is, each of pixel elements 15 in sensor array 12 includes an ADC circuit. The image sensor of the present invention can employ other DPS architectures, including a shared ADC scheme. In the shared ADC scheme, instead of providing a dedicated ADC circuit to each photodetector in a sensor array, an ADC circuit is shared among a group of neighboring photodetectors. For example, in one embodiment, four neighboring photodetectors may share one ADC circuit situated in the center of the four photodetectors. The ADC circuit performs A/D conversion of the output voltage signal from each photodetector by multiplexing between the four photodetectors. The shared ADC architecture retains all the benefits of pixel level analog-to-digital conversion while consuming a much smaller circuit area, thus reducing manufacturing cost and improving yield. Above all, the shared ADC architecture allows a higher fill factor so that a larger part of the sensor area is available for forming the photodetectors. - In one embodiment of the present invention, the ADC circuit of each digital pixel or each group of digital pixels is implemented using the Multi-Channel Bit Serial (MCBS) analog-to-digital conversion technique described in U.S. Pat. No. 5,801,657 of B. Fowler et al. (the '657 patent), which patent is incorporated herein by reference in its entirety. The MCBS ADC technique of the '657 patent can significantly improve the overall system performance while minimizing the size of the ADC circuit. Furthermore, as described in the '657 patent, an MCBS ADC has many advantages applicable to image acquisition and, more importantly, facilitates high-speed readout.
- In another embodiment of the present invention, the ADC circuit of each digital pixel or each group of digital pixels implements a thermometer-code analog-to-digital conversion technique with continuous sampling of the input signal for achieving a digital conversion with a high dynamic range. A massively parallel thermometer-code analog-to-digital conversion scheme is described in copending and commonly assigned U.S. patent application Ser. No. 10/185,584, entitled "Digital Image Capture having an Ultra-high Dynamic Range," of Justin Reyneri et al., filed Jun. 26, 2002, which patent application is incorporated herein by reference in its entirety.
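For orientation only, thermometer-code quantization can be pictured as counting how many of a bank of evenly spaced reference levels the input signal has passed; the sketch below is generic and is not the circuit of the cited application.

def thermometer_code(sample, num_levels=15, full_scale=1.0):
    # A thermometer code sets one bit per reference level the input exceeds;
    # e.g. an input at 0.4 of full scale with 15 levels sets the six lowest bits.
    step = full_scale / (num_levels + 1)
    bits = [1 if sample >= (i + 1) * step else 0 for i in range(num_levels)]
    return bits, sum(bits)      # the bit pattern and its equivalent binary count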
- Returning to
FIG. 2, in the present embodiment, digital image sensor 102 includes image buffer 212 as an on-chip memory for storing at least one frame of pixel data. However, in other embodiments, digital image sensor 102 can also operate with an off-chip memory as the image buffer. The use of on-chip memory is not critical to the practice of the present invention. The incorporation of an on-chip memory in a DPS sensor alleviates the data transmission bottleneck problem associated with the use of an off-chip memory for storage of the pixel data. In particular, the integration of a memory with a DPS sensor makes feasible the use of multiple sampling for improving the quality of the captured images. Multiple sampling is a technique capable of achieving a wide dynamic range in an image sensor without many of the disadvantages associated with other dynamic range enhancement techniques, such as degradation in signal-to-noise ratio and increased implementation complexity. U.S. Pat. No. 6,975,355, entitled "Multiple Sampling via a Time-indexed Method to Achieve Wide Dynamic Ranges," of David Yang et al., issued Dec. 13, 2005, describes a method for facilitating image multiple sampling using a time-indexed approach. The '355 patent is incorporated herein by reference in its entirety. -
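The time-indexed idea can be pictured with the following simplified model, which is an illustration only and not the method of the '355 patent: each pixel is checked at a few exponentially spaced times, the earliest time at which it passes a threshold is recorded, and the latched reading together with its time index represents an effective full-exposure value far beyond what a single exposure could record.

import numpy as np

def time_indexed_readout(rate, threshold=1000.0, sample_times=(1, 2, 4, 8, 16)):
    # `rate` models each pixel's photocurrent; the accumulated signal at time t is rate * t.
    rate = np.asarray(rate, dtype=float)
    full = float(sample_times[-1])
    # Record the earliest sample time at which each pixel passes the threshold.
    time_index = np.full(rate.shape, full)
    for t in sorted(sample_times, reverse=True):
        time_index = np.where(rate * t >= threshold, float(t), time_index)
    reading = rate * time_index                  # value latched at the time index
    # Rescaling by full / time_index gives a full-exposure-equivalent value
    # without the bright pixels ever saturating during integration.
    return reading * (full / time_index)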
FIG. 10 duplicates FIG. 3 of the '355 patent and shows a functional block diagram of an image sensor 300 which may be used to implement digital image sensor 102 in one embodiment. The operation of image sensor 300 using multiple sampling is described in detail in the '355 patent. Image sensor 300 includes a DPS sensor array 302 which has an N by M array of pixel elements. Sensor array 302 employs either the dedicated ADC scheme or the shared ADC scheme and incorporates pixel level analog-to-digital conversion. A sense amplifier and latch circuit 304 is coupled to sensor array 302 to facilitate the readout of digital signals from sensor array 302. The digital signals (also referred to as digital pixel data) are stored in digital pixel data memory 310. To support multiple sampling, image sensor 300 also includes a threshold memory 306 and a time index memory 308 coupled to sensor array 302. Threshold memory 306 stores information of each pixel indicating whether the light intensity value measured by each pixel in sensor array 302 has passed a predetermined threshold level. The exposure time indicating when the light intensity measured by each pixel has passed the threshold level is stored in time index memory 308. As a result of this memory configuration, each pixel element in sensor array 302 can be individually time-stamped by threshold memory 306 and time index memory 308 and stored in digital pixel data memory 310. A DPS image sensor employing multiple sampling is capable of recording 14 to 16 or more bits of dynamic range in the captured image, in contrast with the 10 bits of dynamic range attainable by conventional image sensors. In the present embodiment, digital image sensor 102 is a DPS image sensor and is implemented using the architecture of image sensor 300 of FIG. 10 to support multiple sampling for attaining a high dynamic range in image capture. - In the present embodiment,
digital image sensor 102 implements correlated double sampling for noise reduction. Correlated double sampling (CDS) is an image processing technique employed to reduce kT/C or thermal noise and 1/f noise in an image sensor array. CDS can also be employed to compensate for any fixed pattern noise or variable comparator offset. To implement CDS, the sensor array is reset and the pixel value at each photodetector is measured and stored in specified memory locations in the data memory (image buffer 212). The pixel values measured at sensor array reset are called "CDS values" or "CDS subtract values." Subsequently, for each frame of pixel data captured by the sensor array 210, the stored CDS values are subtracted from the measured pixel intensity values to provide normalized pixel data free of errors caused by noise and offset. - Digital Image Processor
-
Digital image processor 104 is a high performance image processor for processing pixel data from digital image sensor 102 into video images in a desired video format. In the present embodiment, digital image processor 104 implements signal processing functions for supporting an entire video signal processing chain. Specifically, the image processing functions of digital image processor 104 include demosaicing, image scaling, and other high-quality video enhancements, including color correction, edge sharpness, color fidelity, backlight compensation, contrast, and dynamic range extrapolation. The image processing operations are carried out at video rates. - The overall operation of
digital image processor 104 is controlled by system processor 240. In the present embodiment, system processor 240 is implemented as an ARM (Advanced RISC Machine) processor. Firmware for supporting the operation of system processor 240 can be stored in a memory buffer. A portion of frame buffer 228 may be allocated for storing the firmware used by system processor 240. System processor 240 operates to initialize and supervise the functional blocks of image processor 104. - In accordance with the present invention,
digital image processor 104 also facilitates image capture operations for obtaining pairs of ambient lighted and active-and-ambient lighted images and processing the pairs of images to generate output images with ambient light cancellation. The operation of digital imaging system 100 for implementing ambient light rejection in the output video signals will be described in more detail below. - Ambient Light Rejection Using Intra-Frame Image Capture
-
FIG. 3 is a timing diagram illustrating the intra-frame image capture scheme for implementing ambient light rejection according to one embodiment of the present invention. Referring to FIG. 3, a signal line 52 represents the image capture operation of digital imaging system 100 while a signal line 54 represents the active illumination control. In the present illustration, a logical "low" level of signal lines - In accordance with one embodiment of the present invention, an intra-frame image capture scheme is implemented to obtain a pair of ambient lighted and active-and-ambient lighted images, successively captured in close temporal proximity, within a single video frame image of a scene. As shown in
FIG. 3, a first image capture is carried out without active illumination to generate a first image with only ambient lighting. Then a second image capture is carried out with active illumination to generate a second image with active and ambient lighting. Both the first and second captures occur within a single video frame, and the fast image capture is made possible by the high capture speed operation of digital image sensor 102. Digital image processor 104 operates to subtract the first image from the second image to provide an ambient rejected output image. The image subtraction is performed pixel by pixel. According to another aspect of the present invention, an image subtraction method utilizing a look-up table is used to provide a fast and simple way to perform the image frame subtraction, as will be described in more detail below. - In
FIG. 3, the active illuminated image capture is carried out after the ambient lighted image capture. The order of the two image captures is not critical to the practice of the present invention, and in other embodiments the active illuminated image capture can be carried out before the ambient lighted image capture. It is only critical that a pair of ambient lighted and active-and-ambient lighted images is obtained in close temporal proximity to each other. - The intra-frame capture scheme provides significant advantages over the conventional sequential frame capture technique. First, by capturing the pair of ambient lighted and active-and-ambient lighted images within a video frame and in close temporal proximity to each other, the correlation of the two images improves significantly so that the completeness of the ambient light cancellation is greatly improved. Furthermore, by capturing the pair of images within close temporal proximity, image artifacts due to motions in the scene are greatly reduced. Finally, because of the high speed operation of
digital image sensor 102, digital imaging system 100 can provide an increased output frame rate to improve the resolution of the output video signals. In one embodiment, digital imaging system 100 is used for continuous video monitoring at a full video rate (e.g., 60 frames per second) and for scenes with motion at extreme speeds (e.g., over 120 mph or 200 km/h). - The details of the image capture operation will now be described with reference to
FIGS. 4 and 5. FIG. 4 is a timing diagram illustrating a typical image capture operation of a DPS array. FIG. 5 is a timing diagram illustrating the image capture operation of a DPS array implementing intra-frame image capture according to one embodiment of the present invention. - Referring first to
FIG. 4 , in a typical image capture operation, within the time period of a single video frame ( 1/60 sec. or 16.67 ms), a DPS array first performs a sensor array reset operation and then the DPS array performs analog-to-digital conversion (ADC) to read out pixel data values for CDS. Then, image integration starts and the DPS array is exposed to the scene for a given exposure time. Each digital pixel integrates incident light impinging upon the DPS array. A mechanical shutter or an electronic shutter can be used to control the exposure time. Following image integration, the DPS array performs analog-to-digital conversion to read out pixel data values representing the image of the scene. The CDS data values and the image data values are stored in the image buffer and can be processed on-chip or off-chip to provide the normalized pixel data values. - Referring now to
FIG. 5, in accordance with one embodiment of the present invention, the intra-frame image capture scheme of the present invention implements a fast image capture operation where two fast image captures are performed within the time period of a single video frame (1/60 sec.). In the present embodiment, a fast image capture includes a reset operation, an image integration operation and an ADC data readout operation. As shown in FIG. 5, within the time of a single video frame, a first fast image capture and a second fast image capture are carried out. One of the two fast image captures is synchronized with the activation of the external light source. - In the fast image capture scheme used in
FIG. 5, the CDS operation is eliminated. This is because when performing ambient light cancellation, only the difference between the two images is important. Therefore, pixel data errors due to offset or noise will affect both images equally, and most errors will be cancelled out when the ambient light component is removed from the final image. By eliminating the CDS operation, the capture time for each image capture can be shortened so that two fast image captures can fit within the time of a single video frame. - More importantly, the two fast image captures of
FIG. 5 can be carried out with zero or minimal delay between each capture so that the two captures can be of nearly the identical scene. In one embodiment, the ADC operation is carried out using a single-capture-bit-serial (SCBS) conversion scheme or the MCBS conversion scheme described above, and an exposure time of 120 μs can be used for each image capture. Thus, two captures can be completed in a fraction of 1 ms. By using such a high speed of image capture, digital imaging system 100 can have a high tolerance for ambient lighting changes as well as fast motions in the scene. - Multi Intra-Frame Capture
- In the above described embodiment, the intra-frame capture scheme for ambient light cancellation is carried out by taking a single ambient lighted image and a single active-and-ambient lighted image within a single video frame. According to another aspect of the present invention, the intra-frame capture scheme of the present invention is extended to perform multiple intra-frame image capture for ambient light cancellation. The multiple intra-frame image capture scheme has the advantage of increasing the total integration time without incorporating undesirable motion artifacts.
-
FIG. 6 is a timing diagram illustrating the multiple intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to another embodiment of the present invention. Referring toFIG. 6 , within a single video frame (Frame A), multiple sets of image captures are carried out to generate multiple pairs of ambient and active-and-ambient lighted images. Each pair of ambient and active-and-ambient lighted images is subtracted from each other to generate a difference output image containing only the active illuminated component of the scene. Then, the difference output images of all image pairs are summed to provide a final ambient rejected output image. - The multiple intra-frame image capture scheme is particularly advantageous when the illumination provided by the external light source is limited so that a longer exposure time is desired to integrate the active illuminated image sufficiently. However, if the integration time alone is extended, the ambient light rejection ratio will be adversely affected and there will be more motion artifacts in the resulting image because the image integration was extended over a longer period of time. The multiple image capture scheme of the present invention increases the effective integration time without creating motion induced artifacts as each pair of ambient lighted and active-and-ambient lighted images are taken within close temporal proximity to each other.
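A compact sketch of this accumulation follows; the helper names are hypothetical, and capture_pair stands in for one closely spaced ambient / active-and-ambient capture pair.

import numpy as np

def multi_pair_output(capture_pair, num_pairs):
    # capture_pair() returns (ambient_frame, active_and_ambient_frame) for one
    # closely spaced pair of intra-frame captures.
    total = None
    for _ in range(num_pairs):
        ambient, active_ambient = capture_pair()
        diff = np.clip(active_ambient.astype(np.int32) - ambient.astype(np.int32), 0, None)
        total = diff if total is None else total + diff
    return total   # effective integration time grows with num_pairs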
- Anti-Jamming
- In a digital imaging system using active illumination to implement ambient light rejection, it is possible to defeat the imaging system by sensing the periodic active illumination and providing additional “jamming” illumination. The jamming illumination would be synchronized to the active illumination and shifted to illuminate the subject of interest when the active illumination is not active. When image subtraction is carried out, the subject as well as the ambient light will be rejected, rendering the output image meaningless.
- According to one aspect of the present invention, a jamming resistant intra-frame image capture scheme is implemented in the digital imaging system of the present invention to prevent the digital imaging system from being jammed and rendered ineffective.
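A minimal sketch of this scheduling idea, assuming a 60 frames-per-second frame period and a hypothetical duration for the capture pair; the randomized offset is the essential point.

import random

def schedule_capture_pair(frame_period_s=1/60, pair_duration_s=0.5e-3):
    # Pseudo-random offset of the capture pair (and its synchronized light pulse)
    # within the video frame, so a jammer cannot phase-lock to the illumination.
    max_delay = frame_period_s - pair_duration_s
    delta = random.uniform(0.0, max_delay)
    return delta   # wait `delta` seconds after the frame start, then capture the pair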
FIG. 7 is a timing diagram illustrating the jamming resistant intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to one embodiment of the present invention. Referring to FIG. 7, a pseudo-random delay "δ" is included between the start of each frame and the start of the first image capture. In this manner, the pair of image captures occurs at a pseudo-random delay time from the start of each video frame, and the active illumination will also be provided in a pseudo-random manner. By shifting the image capture time in a pseudo-random manner within the video frame, the intra-frame image capture scheme of the present invention is thereby provided with jamming resistant capability. It is no longer possible for a jamming device to use a simple phase locked loop to lock in and synchronize to the active illumination. - Negative Illumination
- In some applications, the digital imaging system of the present invention is provided to capture a scene with a large field of view. In that case, it is sometimes desirable to block out part of the view, so as to provide privacy or to remove parts of the scene that are not of interest and are "distracting" to human or machine scene interpretation. In accordance with another aspect of the present invention, the intra-frame image capture scheme for ambient light rejection is implemented using a second controlled light source to provide a second source of illumination to portions of the scene that are to be blocked out. This second source of illumination to limited portions of the scene is referred to herein as "negative" illumination. When the "negative" illumination is applied in the intra-frame image capture scheme as described below, the portions of the scene illuminated by the negative illumination will appear black in the final ambient rejected output image, and those portions of the scene are thereby blocked out in the final output image.
-
FIG. 8 is a timing diagram illustrating the negative illumination intra-frame image capture scheme for implementing ambient light rejection in the digital imaging system according to one embodiment of the present invention. Referring to FIG. 8, a digital imaging system implements ambient light rejection by taking a first image capture with only ambient lighting and a second image capture with active illumination, with both image captures occurring within a single video frame, in the same manner as described above. However, in accordance with the present embodiment, in addition to the first external light source (such as LED 250 of FIG. 2) providing the active illumination to the entire scene, the digital imaging system is equipped with a second external light source (such as LED 252 of FIG. 2) providing a second source of illumination to selected portions of the scene. The second external light source 252 is also under the control of data processor 214 of digital image sensor 102 of the digital imaging system so as to be synchronized with the image captures of the image sensor. LED 252 is situated so that the second external light source is directed to only the selected portions of the field of view of digital image sensor 102 desired to be blocked out. - The second external
light source 252 is synchronized to the first image capture ofdigital image sensor 102. Referring now toFIG. 8 , the second externallight source 252 is activated in synchronous with the first image capture to provide a source of illumination to the selected portions of the scene during the first image capture. Those selected portions of the scene are thus illuminated by the ambient light as well as the “negative” illumination from the second external light source. The remaining portions of the scene are illuminated by the ambient light only. Then, at the second image capture, the first external light source is activated to provide active illumination to the entire scene. - As a result, the first image capture provides a first image with the entire scene being ambient lighted and portions of the scene also lit by the “negative” illumination. The second image capture provides a second image with the entire scene being ambient lighted and active illuminated. When the first image is subtracted from the second image, the portions of the scene that are illuminated by the “negative” illumination will appear black and those portions of the scene are thus blocked out in the final ambient rejected output image.
- To ensure complete cancellation, the level of “negative” illumination should match the ambient cancellation illumination for the selected parts of the scene to black out the image. The ambient cancellation illumination refers to the active illumination used to reject the ambient light and used to generate the ambient rejected output image. In one embodiment, the level of “negative” illumination is determined by using saturated arithmetic in the subtraction process and making sure that the negative illumination component is greater than the active illumination component used for ambient cancellation. In another embodiment, the level of “negative” illumination is determined by using a feedback loop to ensure that the relevant part of the scene to be blacked out possesses neither positive nor “negative” brightness.
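The blocking effect can be modeled with saturated (clamped-at-zero) subtraction; the array names and bit depth below are assumptions for illustration only.

import numpy as np

def blocked_output(ambient, active, negative, bit_depth=12):
    # First capture: ambient everywhere, plus "negative" illumination on the
    # region to be blocked. Second capture: ambient plus active illumination.
    first = np.clip(ambient + negative, 0, 2**bit_depth - 1)
    second = np.clip(ambient + active, 0, 2**bit_depth - 1)
    # Saturated subtraction: where the negative illumination exceeds the active
    # illumination, the difference clamps to zero and that region appears black.
    return np.clip(second.astype(np.int32) - first.astype(np.int32), 0, None)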
- Applications
- The digital imaging system of the present invention implementing ambient light rejection using active illumination and high speed intra-frame image capture can be advantageously applied to many applications. In one embodiment, the digital imaging system is applied in an automobile for driver face monitoring. For instance, the digital imaging system can be used to capture images of the driver's face to determine if the driver may be falling asleep. The driver face monitoring application can also be used to identify the driver where the driver identification can be used to automatically set up the seat or mirror position preferences for the driver. The driver face monitoring application can also be extended for use in face recognition or monitoring of users of automatic teller machines. The face monitoring application can also be used as biometrics for access control.
- In an automobile, sun and artificial lights are always in motion and can come into the automobile at any angle with varying intensity. This varying light condition makes it difficult to use conventional imaging to capture the driver's face as the image may be too dark or too bright. It is also impractical to use strong IR illumination to overwhelm the ambient light because of risk of eye injury to the driver. However, with the use of the ambient light rejection technique of the present invention, an image of the driver illuminated only with the controlled active illumination, which can be a low power light source, is obtained to allow reliable and consistent capture of the driver's face image for further image processing.
- The digital imaging system of the present invention can also be applied to image objects formed using a retro-reflective medium. A retro-reflective medium refers to reflective medium which provide high levels of reflectance along a direction back toward the source of illuminating radiation. Automobile license plates are typically formed with retro-reflection. The digital imaging system of the present invention can be applied to capture automobile license plates or other objects with retro-reflection. The digital imaging system can provide ambient-rejected images of the objects regardless of the lighting conditions or the motion the objects are subjected to.
- The applications of the digital imaging system of the present invention are numerous and the above-described applications are illustrative only and are not intended to be limiting.
- Frame Differencing Using Look-up Table
- In the digital imaging system described above, two images are obtained and they need to be subtracted from each other to obtain the ambient rejected output image. The operation of subtracting two frames is referred to as frame differencing. Typically, frame differencing is performed using dedicated circuitry such as an adder circuit. Adding an adder circuit to the digital imaging system increases the complexity and cost of the system.
- According to one aspect of the present invention, the digital imaging system of the present invention implements frame differencing using a decoder look-up table (LUT). In this manner, the frame differencing can be carried out without additional circuitry for pixel data subtraction. Furthermore, linearizing of the pixel data can be carried out at the same time as the frame subtraction to further improve the speed of operation of the digital imaging system.
- As described above with reference to
FIG. 2, digital imaging system 100 includes digital image sensor 102 for capturing and storing pixel data indicative of the image of a scene. Digital image sensor 102 includes a memory buffer 212 for storing at least one frame of pixel data where each pixel data is allocated N bits of memory space. For implementing the intra-frame image capture scheme for ambient light rejection, image buffer 212 can include sufficient memory to store at least two frames of pixel data so that the pair of ambient lighted and active-and-ambient lighted images can be stored in image buffer 212 at the same time. - In accordance with the frame differencing scheme of the present invention,
image buffer 212 allocates N bits for storing pixel data for each pixel, and each of the ambient lighted and active-and-ambient lighted images is stored using N/2 bits of memory. The combined N bits of pixel data from the ambient lighted image and the active-and-ambient lighted image are used to index a look-up table. The output pixel data value from the look-up table is the difference between the ambient lighted image and the active-and-ambient lighted image. The output pixel data value may also be linearized so that the linearization and the subtraction steps are combined in one look-up table operation. - One advantage of the frame differencing scheme of the present invention is that the frame differencing scheme can be implemented using the memory space required for only one frame of N-bit image data, resulting in space and cost savings. In this manner, the digital imaging system of the present invention can use the same memory space to provide N-bit pixel data when ambient light rejection is not selected and to provide N/2-bit pixel data when ambient light rejection is selected. In one embodiment,
image buffer 212 stores pixel data in 12 bits and each of the ambient lighted and active-and-ambient lighted images is stored in 6 bits. -
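Read literally, the two 6-bit values simply share one 12-bit word; a sketch of the packing is shown below, with the bit ordering following the worked example given further down (first-capture value in the upper six bits). The helper names are hypothetical.

def pack_index(first6, second6):
    # Two 6-bit pixel values (0..63) share one 12-bit word and directly form
    # the look-up-table index: first (ambient) capture high, second capture low.
    assert 0 <= first6 < 64 and 0 <= second6 < 64
    return (first6 << 6) | second6              # 0..4095

def unpack_index(index12):
    return (index12 >> 6) & 0x3F, index12 & 0x3F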
FIG. 11 is a diagram illustrating the frame differencing method using a look-up table according to one embodiment of the present invention. Referring to FIG. 11, a look-up table (LUT) 400 is generated where the LUT is indexed by an N-bit data input representing the N/2-bit pixel data associated with the two image captures of a pixel element. The LUT provides output data values indicative of the pixel value difference of the two image captures for that pixel element. Furthermore, in the present embodiment, the LUT 400 is generated using linearized pixel data values so that the output data values provided by the LUT represent linearized frame difference pixel data values. - In one embodiment,
LUT 400 is indexed by a 12-bit data input. As described above, the 12-bit data input represents a 12-bit pixel data pair formed by combining the 6-bit pixel data of a pixel element from the first image capture and the 6-bit pixel data of the same pixel element from the second image capture. The first and second image captures refer to the ambient lighted and active-and-ambient lighted images. In the present embodiment, LUT 400 includes 4096 entries where each entry is uniquely associated with a 12-bit pixel data pair. The generation of the output data values of LUT 400 for each 12-bit pixel data pair is described with reference to FIG. 12. -
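A sketch of how such a table could be built and applied follows; the linearization here is a placeholder identity function, chosen to match the worked example that follows, since the disclosure does not specify the actual transfer curve.

import numpy as np

def linearize(code6):
    # Placeholder transfer curve: in this sketch the 6-bit code is already linear,
    # matching the worked example below ("010111" -> 23, "101101" -> 45).
    return code6

def build_difference_lut():
    lut = np.zeros(4096, dtype=np.int16)        # one entry per 12-bit input
    for first in range(64):                      # ambient lighted capture (high 6 bits)
        for second in range(64):                 # active-and-ambient capture (low 6 bits)
            index = (first << 6) | second
            lut[index] = linearize(second) - linearize(first)
    return lut

lut = build_difference_lut()
assert lut[0b010111101101] == 22                 # the example entry described in the text

def frame_difference(packed_frame, lut):
    # packed_frame holds the combined 12-bit pixel pairs; one table look-up per
    # pixel replaces an explicit subtraction circuit.
    return lut[packed_frame]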
FIG. 12 illustrates the generation of the LUT output data value for each N-bit data input value according to one embodiment of the present invention. The LUT output data value for each 12-bit data input is computed as follows. The N/2 bits of pixel data for a pixel element in the first image capture are linearized to provide a first pixel data value. The N/2 bits of pixel data for the same pixel element in the second image capture are linearized to provide a second pixel data value. The two linearized pixel data values are subtracted and the resulting value is used as the LUT output data value for that N-bit data input value. For instance, a 6-bit pixel data value "010111" from the first image capture is linearized to a first pixel data value of 23, and a 6-bit pixel data value "101101" from the second image capture is linearized to a second pixel data value of 45. The two linearized pixel data values are subtracted to yield an output data value of 22. The data value of 22 is stored in an entry 410 of LUT 400 where entry 410 is indexed by the 12-bit LUT data input of "010111101101", as shown in FIG. 11. - By generating a look-up table and using the look-up table to perform the frame differencing operation, the digital imaging system of the present invention can perform ambient light rejection without added computational burden. The frame differencing scheme using a look-up table of the present invention allows the digital imaging system of the present invention to perform ambient light rejection at high speed and without complex circuitry.
- The above detailed descriptions are provided to illustrate specific embodiments of the present invention and are not intended to be limiting. Numerous modifications and variations within the scope of the present invention are possible. The present invention is defined by the appended claims.
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/460,884 US20070263099A1 (en) | 2006-05-09 | 2006-07-28 | Ambient Light Rejection In Digital Video Images |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US74681506P | 2006-05-09 | 2006-05-09 | |
US11/460,884 US20070263099A1 (en) | 2006-05-09 | 2006-07-28 | Ambient Light Rejection In Digital Video Images |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070263099A1 true US20070263099A1 (en) | 2007-11-15 |
Family
ID=38684732
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/460,884 Abandoned US20070263099A1 (en) | 2006-05-09 | 2006-07-28 | Ambient Light Rejection In Digital Video Images |
Country Status (1)
Country | Link |
---|---|
US (1) | US20070263099A1 (en) |
Cited By (92)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080303915A1 (en) * | 2007-06-07 | 2008-12-11 | Denso Corporation | Face image capture apparatus |
US20120081566A1 (en) * | 2010-09-30 | 2012-04-05 | Apple Inc. | Flash synchronization using image sensor interface timing signal |
US20120179021A1 (en) * | 2011-01-11 | 2012-07-12 | General Electric Company | SPECT Image Reconstruction Methods and Systems |
US8228406B2 (en) | 2010-06-04 | 2012-07-24 | Apple Inc. | Adaptive lens shading correction |
US8319861B2 (en) | 2010-06-04 | 2012-11-27 | Apple Inc. | Compensation for black level changes |
US8325248B2 (en) | 2010-06-04 | 2012-12-04 | Apple Inc. | Dual processing of raw image data |
US20130100150A1 (en) * | 2010-03-25 | 2013-04-25 | Nokia Corporation | Apparatus, Display Module and Method for Adaptive Blank Frame Insertion |
US20130208953A1 (en) * | 2010-08-24 | 2013-08-15 | Hanwang Technology Co Ltd | Human facial identification method and system, as well as infrared backlight compensation method and system |
US20130259321A1 (en) * | 2010-12-03 | 2013-10-03 | Fujitsu Limited | Biometric authentication device and biometric authentication method |
WO2014026711A1 (en) | 2012-08-15 | 2014-02-20 | Abb Technology Ltd | Sensor arrangement for machine vision |
US20140118556A1 (en) * | 2012-10-31 | 2014-05-01 | Pixart Imaging Inc. | Detection system |
US8760542B2 (en) * | 2010-08-11 | 2014-06-24 | Inview Technology Corporation | Compensation of compressive imaging measurements based on measurements from power meter |
US8817120B2 (en) | 2012-05-31 | 2014-08-26 | Apple Inc. | Systems and methods for collecting fixed pattern noise statistics of image data |
US8836672B2 (en) | 2011-02-09 | 2014-09-16 | Dornerworks, Ltd. | System and method for improving machine vision in the presence of ambient light |
US8872946B2 (en) | 2012-05-31 | 2014-10-28 | Apple Inc. | Systems and methods for raw image processing |
US8885073B2 (en) * | 2010-08-11 | 2014-11-11 | Inview Technology Corporation | Dedicated power meter to measure background light level in compressive imaging system |
US8917336B2 (en) | 2012-05-31 | 2014-12-23 | Apple Inc. | Image signal processing involving geometric distortion correction |
US8953882B2 (en) | 2012-05-31 | 2015-02-10 | Apple Inc. | Systems and methods for determining noise statistics of image data |
US9014504B2 (en) | 2012-05-31 | 2015-04-21 | Apple Inc. | Systems and methods for highlight recovery in an image signal processor |
US9025867B2 (en) | 2012-05-31 | 2015-05-05 | Apple Inc. | Systems and methods for YCC image processing |
US9031319B2 (en) | 2012-05-31 | 2015-05-12 | Apple Inc. | Systems and methods for luma sharpening |
US20150163422A1 (en) * | 2013-12-05 | 2015-06-11 | Apple Inc. | Image Sensor Having Pixels with Different Integration Periods |
US9077943B2 (en) | 2012-05-31 | 2015-07-07 | Apple Inc. | Local image statistics collection |
EP2810219A4 (en) * | 2012-02-01 | 2015-07-22 | Ecoatm Inc | METHOD AND APPARATUS FOR RECYCLING ELECTRONIC DEVICES |
US9105078B2 (en) | 2012-05-31 | 2015-08-11 | Apple Inc. | Systems and methods for local tone mapping |
US9131196B2 (en) | 2012-05-31 | 2015-09-08 | Apple Inc. | Systems and methods for defective pixel correction with neighboring pixels |
US9142012B2 (en) | 2012-05-31 | 2015-09-22 | Apple Inc. | Systems and methods for chroma noise reduction |
US9277144B2 (en) | 2014-03-12 | 2016-03-01 | Apple Inc. | System and method for estimating an ambient light condition using an image sensor and field-of-view compensation |
US9293500B2 (en) | 2013-03-01 | 2016-03-22 | Apple Inc. | Exposure control for image sensors |
US9319611B2 (en) | 2013-03-14 | 2016-04-19 | Apple Inc. | Image sensor with flexible pixel summing |
US9332239B2 (en) | 2012-05-31 | 2016-05-03 | Apple Inc. | Systems and methods for RGB image processing |
US9473706B2 (en) | 2013-12-09 | 2016-10-18 | Apple Inc. | Image sensor flicker detection |
US9497397B1 (en) | 2014-04-08 | 2016-11-15 | Apple Inc. | Image sensor with auto-focus and color ratio cross-talk comparison |
WO2016193826A1 (en) * | 2015-06-02 | 2016-12-08 | Sony Mobile Communications Inc. | Enhanced video capture in adverse lighting conditions |
US9538106B2 (en) | 2014-04-25 | 2017-01-03 | Apple Inc. | Image sensor having a uniform digital power signature |
US9549099B2 (en) | 2013-03-12 | 2017-01-17 | Apple Inc. | Hybrid image sensor |
US9584743B1 (en) | 2014-03-13 | 2017-02-28 | Apple Inc. | Image sensor with auto-focus and pixel cross-talk compensation |
EP3139347A1 (en) * | 2015-08-20 | 2017-03-08 | Diehl BGT Defence GmbH & Co. Kg | Method for determining an alignment of an object |
US9596423B1 (en) | 2013-11-21 | 2017-03-14 | Apple Inc. | Charge summing in an image sensor |
US9686485B2 (en) | 2014-05-30 | 2017-06-20 | Apple Inc. | Pixel binning in an image sensor |
US9741754B2 (en) | 2013-03-06 | 2017-08-22 | Apple Inc. | Charge transfer circuit with storage nodes in image sensors |
US9773169B1 (en) | 2012-11-06 | 2017-09-26 | Cross Match Technologies, Inc. | System for capturing a biometric image in high ambient light environments |
US9818160B2 (en) | 2008-10-02 | 2017-11-14 | ecoATM, Inc. | Kiosk for recycling electronic devices |
US9881284B2 (en) | 2008-10-02 | 2018-01-30 | ecoATM, Inc. | Mini-kiosk for recycling electronic devices |
US9885672B2 (en) | 2016-06-08 | 2018-02-06 | ecoATM, Inc. | Methods and systems for detecting screen covers on electronic devices |
US9904911B2 (en) | 2008-10-02 | 2018-02-27 | ecoATM, Inc. | Secondary market and vending system for devices |
US9911102B2 (en) | 2014-10-02 | 2018-03-06 | ecoATM, Inc. | Application for device evaluation and other processes associated with device recycling |
US9912883B1 (en) | 2016-05-10 | 2018-03-06 | Apple Inc. | Image sensor with calibrated column analog-to-digital converters |
US10032140B2 (en) * | 2008-10-02 | 2018-07-24 | ecoATM, LLC. | Systems for recycling consumer electronic devices |
EP2763398B1 (en) | 2007-12-21 | 2018-10-31 | Photonis Netherlands B.V. | Use of an image sensor array in laser range gated imaging |
US10127647B2 (en) | 2016-04-15 | 2018-11-13 | Ecoatm, Llc | Methods and systems for detecting cracks in electronic devices |
US10263032B2 (en) | 2013-03-04 | 2019-04-16 | Apple, Inc. | Photodiode with different electric potential regions for image sensors |
US10269110B2 (en) | 2016-06-28 | 2019-04-23 | Ecoatm, Llc | Methods and systems for detecting cracks in illuminated electronic device screens |
US10285626B1 (en) | 2014-02-14 | 2019-05-14 | Apple Inc. | Activity identification using an optical heart rate monitor |
TWI662940B (en) * | 2018-06-01 | 2019-06-21 | 廣達電腦股份有限公司 | Image capturing device |
US10354413B2 (en) | 2013-06-25 | 2019-07-16 | Pixart Imaging Inc. | Detection system and picture filtering method thereof |
US10401411B2 (en) | 2014-09-29 | 2019-09-03 | Ecoatm, Llc | Maintaining sets of cable components used for wired analysis, charging, or other interaction with portable electronic devices |
US10417615B2 (en) | 2014-10-31 | 2019-09-17 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US10440301B2 (en) | 2017-09-08 | 2019-10-08 | Apple Inc. | Image capture device, pixel, and method providing improved phase detection auto-focus performance |
US10438987B2 (en) | 2016-09-23 | 2019-10-08 | Apple Inc. | Stacked backside illuminated SPAD array |
US10445708B2 (en) | 2014-10-03 | 2019-10-15 | Ecoatm, Llc | System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods |
US10475002B2 (en) | 2014-10-02 | 2019-11-12 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US10572946B2 (en) | 2014-10-31 | 2020-02-25 | Ecoatm, Llc | Methods and systems for facilitating processes associated with insurance services and/or other services for electronic devices |
US10622538B2 (en) | 2017-07-18 | 2020-04-14 | Apple Inc. | Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body |
US10656251B1 (en) | 2017-01-25 | 2020-05-19 | Apple Inc. | Signal acquisition in a SPAD detector |
US10742892B1 (en) * | 2019-02-18 | 2020-08-11 | Samsung Electronics Co., Ltd. | Apparatus and method for capturing and blending multiple images for high-quality flash photography using mobile electronic device |
US10801886B2 (en) | 2017-01-25 | 2020-10-13 | Apple Inc. | SPAD detector having modulated sensitivity |
US10825082B2 (en) | 2008-10-02 | 2020-11-03 | Ecoatm, Llc | Apparatus and method for recycling mobile phones |
US10848693B2 (en) | 2018-07-18 | 2020-11-24 | Apple Inc. | Image flare detection using asymmetric pixels |
US10860990B2 (en) | 2014-11-06 | 2020-12-08 | Ecoatm, Llc | Methods and systems for evaluating and recycling electronic devices |
CN112560835A (en) * | 2019-09-26 | 2021-03-26 | 苏州立创致恒电子科技有限公司 | Imaging assembly for shielding ambient light interference and image imaging method |
US10962628B1 (en) | 2017-01-26 | 2021-03-30 | Apple Inc. | Spatial temporal weighting in a SPAD detector |
US11010841B2 (en) | 2008-10-02 | 2021-05-18 | Ecoatm, Llc | Kiosk for recycling electronic devices |
US11019294B2 (en) | 2018-07-18 | 2021-05-25 | Apple Inc. | Seamless readout mode transitions in image sensors |
US11080672B2 (en) | 2014-12-12 | 2021-08-03 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US11089247B2 (en) | 2012-05-31 | 2021-08-10 | Apple Inc. | Systems and method for reducing fixed pattern noise in image data |
US20220191408A1 (en) * | 2020-12-14 | 2022-06-16 | Bae Systems Information And Electronic Systems Integration Inc. | Interleaved simultaneous binning mode |
US11430094B2 (en) | 2020-07-20 | 2022-08-30 | Samsung Electronics Co., Ltd. | Guided multi-exposure image fusion |
EP4057218A1 (en) * | 2021-03-11 | 2022-09-14 | Koninklijke Philips N.V. | A method and image capturing apparatus |
EP4057217A1 (en) * | 2021-03-11 | 2022-09-14 | Koninklijke Philips N.V. | A method and image capturing apparatus for minimising error in ambient light corrected images |
US11462868B2 (en) | 2019-02-12 | 2022-10-04 | Ecoatm, Llc | Connector carrier for electronic device kiosk |
US11482067B2 (en) | 2019-02-12 | 2022-10-25 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
US20220385933A1 (en) * | 2019-11-06 | 2022-12-01 | Koninklijke Philips N.V. | A system for performing image motion compensation |
US11546532B1 (en) | 2021-03-16 | 2023-01-03 | Apple Inc. | Dynamic correlated double sampling for noise rejection in image sensors |
US11563910B2 (en) | 2020-08-04 | 2023-01-24 | Apple Inc. | Image capture devices having phase detection auto-focus pixels |
US11798250B2 (en) | 2019-02-18 | 2023-10-24 | Ecoatm, Llc | Neural network based physical condition evaluation of electronic devices, and associated systems and methods |
US11922467B2 (en) | 2020-08-17 | 2024-03-05 | ecoATM, Inc. | Evaluating an electronic device using optical character recognition |
US11989710B2 (en) | 2018-12-19 | 2024-05-21 | Ecoatm, Llc | Systems and methods for vending and/or purchasing mobile phones and other electronic devices |
US12033454B2 (en) | 2020-08-17 | 2024-07-09 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
US12069384B2 (en) | 2021-09-23 | 2024-08-20 | Apple Inc. | Image capture devices having phase detection auto-focus pixels |
US12192644B2 (en) | 2021-07-29 | 2025-01-07 | Apple Inc. | Pulse-width modulation pixel sensor |
US12271929B2 (en) | 2020-08-17 | 2025-04-08 | Ecoatm Llc | Evaluating an electronic device using a wireless charger |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4303855A (en) * | 1978-12-20 | 1981-12-01 | International Business Machines Corporation | System for separating an optical signal from ambient light |
US6256067B1 (en) * | 1996-08-07 | 2001-07-03 | Agilent Technologies, Inc. | Electronic camera for selectively photographing a subject illuminated by an artificial light source |
US6642955B1 (en) * | 2000-01-10 | 2003-11-04 | Extreme Cctv Inc. | Surveillance camera system with infrared and visible light bandpass control circuit |
US6748259B1 (en) * | 2000-06-15 | 2004-06-08 | Spectros Corporation | Optical imaging of induced signals in vivo under ambient light conditions |
US6832728B2 (en) * | 2001-03-26 | 2004-12-21 | Pips Technology, Inc. | Remote indicia reading system |
US7016518B2 (en) * | 2002-03-15 | 2006-03-21 | Extreme Cctv Inc. | Vehicle license plate imaging and reading system for day and night |
US20030193600A1 (en) * | 2002-03-28 | 2003-10-16 | Minolta Co., Ltd | Image capturing apparatus |
US20040080623A1 (en) * | 2002-08-15 | 2004-04-29 | Dixon Cleveland | Motion clutter suppression for image-subtracting cameras |
US20120081566A1 (en) * | 2010-09-30 | 2012-04-05 | Apple Inc. | Flash synchronization using image sensor interface timing signal |
US20130259321A1 (en) * | 2010-12-03 | 2013-10-03 | Fujitsu Limited | Biometric authentication device and biometric authentication method |
EP2648158A1 (en) * | 2010-12-03 | 2013-10-09 | Fujitsu Limited | Biometric authentication device and biometric authentication method |
US9298965B2 (en) * | 2010-12-03 | 2016-03-29 | Fujitsu Limited | Biometric authentication device and biometric authentication method |
EP2648158A4 (en) * | 2010-12-03 | 2017-03-29 | Fujitsu Limited | Biometric authentication device and biometric authentication method |
US8463363B2 (en) * | 2011-01-11 | 2013-06-11 | General Electric Company | SPECT image reconstruction methods and systems |
US20120179021A1 (en) * | 2011-01-11 | 2012-07-12 | General Electric Company | SPECT Image Reconstruction Methods and Systems |
US8836672B2 (en) | 2011-02-09 | 2014-09-16 | Dornerworks, Ltd. | System and method for improving machine vision in the presence of ambient light |
EP2810219A4 (en) * | 2012-02-01 | 2015-07-22 | Ecoatm Inc | Method and apparatus for recycling electronic devices |
US9342858B2 (en) | 2012-05-31 | 2016-05-17 | Apple Inc. | Systems and methods for statistics collection using clipped pixel tracking |
US8917336B2 (en) | 2012-05-31 | 2014-12-23 | Apple Inc. | Image signal processing involving geometric distortion correction |
US8817120B2 (en) | 2012-05-31 | 2014-08-26 | Apple Inc. | Systems and methods for collecting fixed pattern noise statistics of image data |
US8872946B2 (en) | 2012-05-31 | 2014-10-28 | Apple Inc. | Systems and methods for raw image processing |
US12177586B2 (en) | 2012-05-31 | 2024-12-24 | Apple Inc. | Systems and methods for reducing fixed pattern noise in image data |
US9332239B2 (en) | 2012-05-31 | 2016-05-03 | Apple Inc. | Systems and methods for RGB image processing |
US11689826B2 (en) | 2012-05-31 | 2023-06-27 | Apple Inc. | Systems and method for reducing fixed pattern noise in image data |
US8953882B2 (en) | 2012-05-31 | 2015-02-10 | Apple Inc. | Systems and methods for determining noise statistics of image data |
US9014504B2 (en) | 2012-05-31 | 2015-04-21 | Apple Inc. | Systems and methods for highlight recovery in an image signal processor |
US9025867B2 (en) | 2012-05-31 | 2015-05-05 | Apple Inc. | Systems and methods for YCC image processing |
US9031319B2 (en) | 2012-05-31 | 2015-05-12 | Apple Inc. | Systems and methods for luma sharpening |
US9317930B2 (en) | 2012-05-31 | 2016-04-19 | Apple Inc. | Systems and methods for statistics collection using pixel mask |
US9710896B2 (en) | 2012-05-31 | 2017-07-18 | Apple Inc. | Systems and methods for chroma noise reduction |
US9741099B2 (en) | 2012-05-31 | 2017-08-22 | Apple Inc. | Systems and methods for local tone mapping |
US9743057B2 (en) | 2012-05-31 | 2017-08-22 | Apple Inc. | Systems and methods for lens shading correction |
US11089247B2 (en) | 2012-05-31 | 2021-08-10 | Apple Inc. | Systems and method for reducing fixed pattern noise in image data |
US9077943B2 (en) | 2012-05-31 | 2015-07-07 | Apple Inc. | Local image statistics collection |
US9105078B2 (en) | 2012-05-31 | 2015-08-11 | Apple Inc. | Systems and methods for local tone mapping |
US9142012B2 (en) | 2012-05-31 | 2015-09-22 | Apple Inc. | Systems and methods for chroma noise reduction |
US9131196B2 (en) | 2012-05-31 | 2015-09-08 | Apple Inc. | Systems and methods for defective pixel correction with neighboring pixels |
WO2014026711A1 (en) | 2012-08-15 | 2014-02-20 | Abb Technology Ltd | Sensor arrangement for machine vision |
US10755417B2 (en) | 2012-10-31 | 2020-08-25 | Pixart Imaging Inc. | Detection system |
US10255682B2 (en) | 2012-10-31 | 2019-04-09 | Pixart Imaging Inc. | Image detection system using differences in illumination conditions |
US9684840B2 (en) * | 2012-10-31 | 2017-06-20 | Pixart Imaging Inc. | Detection system |
US20140118556A1 (en) * | 2012-10-31 | 2014-05-01 | Pixart Imaging Inc. | Detection system |
US9773169B1 (en) | 2012-11-06 | 2017-09-26 | Cross Match Technologies, Inc. | System for capturing a biometric image in high ambient light environments |
US9293500B2 (en) | 2013-03-01 | 2016-03-22 | Apple Inc. | Exposure control for image sensors |
US10263032B2 (en) | 2013-03-04 | 2019-04-16 | Apple, Inc. | Photodiode with different electric potential regions for image sensors |
US9741754B2 (en) | 2013-03-06 | 2017-08-22 | Apple Inc. | Charge transfer circuit with storage nodes in image sensors |
US10943935B2 (en) | 2013-03-06 | 2021-03-09 | Apple Inc. | Methods for transferring charge in an image sensor |
US9549099B2 (en) | 2013-03-12 | 2017-01-17 | Apple Inc. | Hybrid image sensor |
US9319611B2 (en) | 2013-03-14 | 2016-04-19 | Apple Inc. | Image sensor with flexible pixel summing |
US10354413B2 (en) | 2013-06-25 | 2019-07-16 | Pixart Imaging Inc. | Detection system and picture filtering method thereof |
US9596423B1 (en) | 2013-11-21 | 2017-03-14 | Apple Inc. | Charge summing in an image sensor |
US9596420B2 (en) * | 2013-12-05 | 2017-03-14 | Apple Inc. | Image sensor having pixels with different integration periods |
US20150163422A1 (en) * | 2013-12-05 | 2015-06-11 | Apple Inc. | Image Sensor Having Pixels with Different Integration Periods |
US9473706B2 (en) | 2013-12-09 | 2016-10-18 | Apple Inc. | Image sensor flicker detection |
US10285626B1 (en) | 2014-02-14 | 2019-05-14 | Apple Inc. | Activity identification using an optical heart rate monitor |
US9277144B2 (en) | 2014-03-12 | 2016-03-01 | Apple Inc. | System and method for estimating an ambient light condition using an image sensor and field-of-view compensation |
US9584743B1 (en) | 2014-03-13 | 2017-02-28 | Apple Inc. | Image sensor with auto-focus and pixel cross-talk compensation |
US9497397B1 (en) | 2014-04-08 | 2016-11-15 | Apple Inc. | Image sensor with auto-focus and color ratio cross-talk comparison |
US9538106B2 (en) | 2014-04-25 | 2017-01-03 | Apple Inc. | Image sensor having a uniform digital power signature |
US10609348B2 (en) | 2014-05-30 | 2020-03-31 | Apple Inc. | Pixel binning in an image sensor |
US9686485B2 (en) | 2014-05-30 | 2017-06-20 | Apple Inc. | Pixel binning in an image sensor |
US10401411B2 (en) | 2014-09-29 | 2019-09-03 | Ecoatm, Llc | Maintaining sets of cable components used for wired analysis, charging, or other interaction with portable electronic devices |
US10475002B2 (en) | 2014-10-02 | 2019-11-12 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US11126973B2 (en) | 2014-10-02 | 2021-09-21 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US10496963B2 (en) | 2014-10-02 | 2019-12-03 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US9911102B2 (en) | 2014-10-02 | 2018-03-06 | ecoATM, Inc. | Application for device evaluation and other processes associated with device recycling |
US11790327B2 (en) | 2014-10-02 | 2023-10-17 | Ecoatm, Llc | Application for device evaluation and other processes associated with device recycling |
US11734654B2 (en) | 2014-10-02 | 2023-08-22 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US10438174B2 (en) | 2014-10-02 | 2019-10-08 | Ecoatm, Llc | Application for device evaluation and other processes associated with device recycling |
US12217221B2 (en) | 2014-10-02 | 2025-02-04 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US11232412B2 (en) | 2014-10-03 | 2022-01-25 | Ecoatm, Llc | System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods |
US10445708B2 (en) | 2014-10-03 | 2019-10-15 | Ecoatm, Llc | System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods |
US11989701B2 (en) | 2014-10-03 | 2024-05-21 | Ecoatm, Llc | System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods |
US12205081B2 (en) | 2014-10-31 | 2025-01-21 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US10572946B2 (en) | 2014-10-31 | 2020-02-25 | Ecoatm, Llc | Methods and systems for facilitating processes associated with insurance services and/or other services for electronic devices |
US11436570B2 (en) | 2014-10-31 | 2022-09-06 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US10417615B2 (en) | 2014-10-31 | 2019-09-17 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US10860990B2 (en) | 2014-11-06 | 2020-12-08 | Ecoatm, Llc | Methods and systems for evaluating and recycling electronic devices |
US11080672B2 (en) | 2014-12-12 | 2021-08-03 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US12008520B2 (en) | 2014-12-12 | 2024-06-11 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US11315093B2 (en) | 2014-12-12 | 2022-04-26 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
WO2016193826A1 (en) * | 2015-06-02 | 2016-12-08 | Sony Mobile Communications Inc. | Enhanced video capture in adverse lighting conditions |
US9565368B2 (en) | 2015-06-02 | 2017-02-07 | Sony Corporation | Enhanced video capture in adverse lighting conditions |
CN107667523A (en) * | 2015-06-02 | 2018-02-06 | 索尼移动通讯有限公司 | Enhancing video capture under bad illumination condition |
EP3139347B1 (en) | 2015-08-20 | 2018-06-13 | Diehl Defence GmbH & Co. KG | Method for determining an alignment of an object |
EP3139347A1 (en) * | 2015-08-20 | 2017-03-08 | Diehl BGT Defence GmbH & Co. Kg | Method for determining an alignment of an object |
US10127647B2 (en) | 2016-04-15 | 2018-11-13 | Ecoatm, Llc | Methods and systems for detecting cracks in electronic devices |
US9912883B1 (en) | 2016-05-10 | 2018-03-06 | Apple Inc. | Image sensor with calibrated column analog-to-digital converters |
US9885672B2 (en) | 2016-06-08 | 2018-02-06 | ecoATM, Inc. | Methods and systems for detecting screen covers on electronic devices |
US10909673B2 (en) | 2016-06-28 | 2021-02-02 | Ecoatm, Llc | Methods and systems for detecting cracks in illuminated electronic device screens |
US11803954B2 (en) | 2016-06-28 | 2023-10-31 | Ecoatm, Llc | Methods and systems for detecting cracks in illuminated electronic device screens |
US10269110B2 (en) | 2016-06-28 | 2019-04-23 | Ecoatm, Llc | Methods and systems for detecting cracks in illuminated electronic device screens |
US10438987B2 (en) | 2016-09-23 | 2019-10-08 | Apple Inc. | Stacked backside illuminated SPAD array |
US10658419B2 (en) | 2016-09-23 | 2020-05-19 | Apple Inc. | Stacked backside illuminated SPAD array |
US10656251B1 (en) | 2017-01-25 | 2020-05-19 | Apple Inc. | Signal acquisition in a SPAD detector |
US10801886B2 (en) | 2017-01-25 | 2020-10-13 | Apple Inc. | SPAD detector having modulated sensitivity |
US10962628B1 (en) | 2017-01-26 | 2021-03-30 | Apple Inc. | Spatial temporal weighting in a SPAD detector |
US10622538B2 (en) | 2017-07-18 | 2020-04-14 | Apple Inc. | Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body |
US10440301B2 (en) | 2017-09-08 | 2019-10-08 | Apple Inc. | Image capture device, pixel, and method providing improved phase detection auto-focus performance |
TWI662940B (en) * | 2018-06-01 | 2019-06-21 | 廣達電腦股份有限公司 | Image capturing device |
US10937163B2 (en) | 2018-06-01 | 2021-03-02 | Quanta Computer Inc. | Image capturing device |
US11019294B2 (en) | 2018-07-18 | 2021-05-25 | Apple Inc. | Seamless readout mode transitions in image sensors |
US11659298B2 (en) | 2018-07-18 | 2023-05-23 | Apple Inc. | Seamless readout mode transitions in image sensors |
US10848693B2 (en) | 2018-07-18 | 2020-11-24 | Apple Inc. | Image flare detection using asymmetric pixels |
US11989710B2 (en) | 2018-12-19 | 2024-05-21 | Ecoatm, Llc | Systems and methods for vending and/or purchasing mobile phones and other electronic devices |
US11843206B2 (en) | 2019-02-12 | 2023-12-12 | Ecoatm, Llc | Connector carrier for electronic device kiosk |
US11462868B2 (en) | 2019-02-12 | 2022-10-04 | Ecoatm, Llc | Connector carrier for electronic device kiosk |
US11482067B2 (en) | 2019-02-12 | 2022-10-25 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
US10742892B1 (en) * | 2019-02-18 | 2020-08-11 | Samsung Electronics Co., Ltd. | Apparatus and method for capturing and blending multiple images for high-quality flash photography using mobile electronic device |
US12223684B2 (en) | 2019-02-18 | 2025-02-11 | Ecoatm, Llc | Neural network based physical condition evaluation of electronic devices, and associated systems and methods |
US11798250B2 (en) | 2019-02-18 | 2023-10-24 | Ecoatm, Llc | Neural network based physical condition evaluation of electronic devices, and associated systems and methods |
CN112560835A (en) * | 2019-09-26 | 2021-03-26 | 苏州立创致恒电子科技有限公司 | Imaging assembly for shielding ambient light interference and image imaging method |
US20220385933A1 (en) * | 2019-11-06 | 2022-12-01 | Koninklijke Philips N.V. | A system for performing image motion compensation |
US11800134B2 (en) * | 2019-11-06 | 2023-10-24 | Koninklijke Philips N.V. | System for performing image motion compensation |
US11430094B2 (en) | 2020-07-20 | 2022-08-30 | Samsung Electronics Co., Ltd. | Guided multi-exposure image fusion |
US11563910B2 (en) | 2020-08-04 | 2023-01-24 | Apple Inc. | Image capture devices having phase detection auto-focus pixels |
US11922467B2 (en) | 2020-08-17 | 2024-03-05 | ecoATM, Inc. | Evaluating an electronic device using optical character recognition |
US12033454B2 (en) | 2020-08-17 | 2024-07-09 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
US12271929B2 (en) | 2020-08-17 | 2025-04-08 | Ecoatm Llc | Evaluating an electronic device using a wireless charger |
US11558574B2 (en) * | 2020-12-14 | 2023-01-17 | Bae Systems Information And Electronic Systems Integration Inc. | Interleaved simultaneous binning mode |
US20220191408A1 (en) * | 2020-12-14 | 2022-06-16 | Bae Systems Information And Electronic Systems Integration Inc. | Interleaved simultaneous binning mode |
WO2022189060A1 (en) * | 2021-03-11 | 2022-09-15 | Koninklijke Philips N.V. | A method and image capturing apparatus for minimising error in ambient light corrected images |
WO2022189176A1 (en) * | 2021-03-11 | 2022-09-15 | Koninklijke Philips N.V. | A method and image capturing apparatus |
JP7612890B2 (en) | 2021-03-11 | 2025-01-14 | Koninklijke Philips N.V. | Method and image capture device for minimizing errors in ambient light corrected images |
EP4057217A1 (en) * | 2021-03-11 | 2022-09-14 | Koninklijke Philips N.V. | A method and image capturing apparatus for minimising error in ambient light corrected images |
EP4057218A1 (en) * | 2021-03-11 | 2022-09-14 | Koninklijke Philips N.V. | A method and image capturing apparatus |
US11546532B1 (en) | 2021-03-16 | 2023-01-03 | Apple Inc. | Dynamic correlated double sampling for noise rejection in image sensors |
US12192644B2 (en) | 2021-07-29 | 2025-01-07 | Apple Inc. | Pulse-width modulation pixel sensor |
US12069384B2 (en) | 2021-09-23 | 2024-08-20 | Apple Inc. | Image capture devices having phase detection auto-focus pixels |
Similar Documents
Publication | Title |
---|---|
US20070263099A1 (en) | Ambient Light Rejection In Digital Video Images | |
US7884868B2 (en) | Image capturing element, image capturing apparatus, image capturing method, image capturing system, and image processing apparatus | |
JP4571179B2 (en) | Imaging device | |
US8619143B2 (en) | Image sensor including color and infrared pixels | |
JP4840578B2 (en) | Flicker detection method and apparatus for imaging apparatus | |
JP3976754B2 (en) | Wide dynamic range imaging device with selective reading | |
US20070237506A1 (en) | Image blurring reduction | |
US7582871B2 (en) | Image pickup apparatus and a switching-over method for the same | |
WO2010044185A1 (en) | Imaging element and imaging device | |
WO2018075690A9 (en) | Wdr imaging with led flicker mitigation | |
WO2007081743A2 (en) | Method and apparatus providing pixel storage gate charge sensing for electronic stabilization in imagers | |
JP2003101887A (en) | Imaging device and noise removal method | |
JP2007194687A (en) | Imaging device | |
JP2009089219A (en) | Solid-state imaging device and solid-state imaging system using the same | |
JP4523629B2 (en) | Imaging device | |
WO2009147939A1 (en) | Imaging device | |
US7349015B2 (en) | Image capture apparatus for correcting noise components contained in image signals output from pixels | |
JP2012010282A (en) | Imaging device, exposure control method, and exposure control program | |
JP2009253447A (en) | Solid state image sensor for both near-infrared light and visible light, and solid-state imaging apparatus | |
JP4326239B2 (en) | Electronic imaging system and method for improving image quality in an electronic imaging system | |
JP2004289241A (en) | Noise eliminator | |
JP4007802B2 (en) | Imaging device | |
JP3031344B2 (en) | Imaging method and apparatus | |
JP2009118430A (en) | Imaging apparatus, method of driving the imaging apparatus, image generating apparatus and image generating method | |
JP3796421B2 (en) | Imaging apparatus and imaging method |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: PIXIM, INCORPORATED, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MOTTA, RICARDO J.;CHEN, TING;TAO, DOUGLAS K.;AND OTHERS;REEL/FRAME:018020/0987;SIGNING DATES FROM 20060727 TO 20060728 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
AS | Assignment | Owner name: COMERICA BANK, A TEXAS BANKING ASSOCIATION, MICHIGAN. Free format text: SECURITY AGREEMENT;ASSIGNOR:PIXIM, INC., A CALIFORNIA CORPORATION;REEL/FRAME:026064/0625. Effective date: 20110325 |
AS | Assignment | Owner name: PIXIM, INC., CALIFORNIA. Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:COMERICA BANK;REEL/FRAME:029322/0804. Effective date: 20120911 |