US8013904B2 - View projection matrix based high performance low latency display pipeline - Google Patents
- Publication number
- US8013904B2 (Application No. US12/331,281)
- Authority
- US
- United States
- Prior art keywords
- image
- projector
- camera
- projection
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3182—Colour adjustment, e.g. white balance, shading or gamut
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3147—Multi-projection systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Definitions
- the present invention is related to a method of implementing display pipelines with reduced display latency.
- the present invention is further related to methods of reducing display latency in a display system designed to compensate for irregular surfaces and lighting in a projection scene.
- When projectors and cameras are combined, hybrid devices and systems that are capable of both projecting and capturing light are born.
- This emerging class of imaging devices and systems is known in the research community as projector-camera systems.
- images captured by one or more cameras are used to estimate attributes about display environments, such as the geometric shapes of projection surfaces.
- the projectors in these projector-camera systems then adapt their projected images so as to compensate for shape irregularities in the projection surfaces to improve the resultant imagery.
- a projector can “see” distortions in a projected image, and then adjust its projected image so as to reduce the observed distortions.
- the camera and projector need to be calibrated to each other's imaging parameters so as to assure that any observed image distortion is due to irregularities in the projection environment (i.e., surface irregularities), and not due to distortions inherent to the projector or camera, or due to their relative orientation to each other.
- a stereo camera pair can be used to estimate depth (i.e., achieve a pseudo perspective view) to establish a quasi-depth perception of feature points visible to the stereo camera pair.
- the calibrated stereo camera pair is then used to calibrate the projector. Basically, the establishment of this quasi-depth perception is used to identify surface depth irregularities of a projection surface and thereby of an image projected onto the projection surface.
- the projector can then be calibrated to compensate for the surface depth irregularities in the projected image.
- the projector is first made to project feature points onto a display environment (i.e., the projection surface), which may have an irregular surface.
- the pre-calibrated, stereo camera pair is used to resolve the perspective depth location of the projected points.
- the projector can then be calibrated to compensate for surface/depth irregularities in the projection surface, as determined by the depth location of the projected points.
- Dual photography makes use of Helmholtz reciprocity to use images captured with real cameras to synthesize pseudo images (i.e., dual images) that simulate images “as seen” (or effectively “captured”) by projectors. That is, the pseudo image simulates a captured image as “viewed” by a projector, and thus represents what a projector-captured image would be if a projector could capture images.
- This approach might permit a projector to be treated as a pseudo camera, and thus might eliminate some of the difficulties associated with the calibration of projectors.
- BRDF bidirectional reflectance distribution function
- dual photography ideally takes advantage of this dual nature (i.e., duality relationship) of a projected image and a captured image to simulate one from the other.
- dual photography (and more precisely Helmholtz reciprocity) requires the capturing of the light transport property between a camera and a projector. More specifically, dual photography requires determination of the light transport property (i.e., light transport coefficient) relating an emitted light ray to a captured light ray.
- dual photography would appear to offer great benefits
- dual photography is severely limited by its physical and impractical requirements of needing extremely large amounts of computer memory (both archival, disk-type memory and active, solid-state memory), needing extensive computational processing power, and requiring much time and user intervention to setup equipment and emit and capture multitudes of light rays for every projection environment in which the projector-camera system is to be used.
- a “primal configuration” (i.e., a configuration of real, physical devices prior to any duality transformations) includes a real digital projector 11 , a real projected image 13 , and a real digital camera 15 .
- Light is emitted from real projector 11 and captured by real camera 15 .
- a coefficient relating each projected light ray (from each projector pixel e within real projector 11 ) to a correspondingly captured light ray (captured at each camera sensor pixel g within real camera 15 ) is called a light transport coefficient. Using the light transport coefficient, it is possible to determine the characteristics of the projected light ray from the captured light ray.
- real projector 11 is preferably a digital projector having a projector pixel array 17 symbolically shown as a dotted box and comprised of s rows and r columns of individual projector pixels e.
- Each projector pixel e may be the source of a separately emitted light ray.
- the size of projector pixel array 17 depends on the resolution of real projector 11 .
- a VGA resolution may consist of an array of 640 by 480 pixels (i.e., 307,200 projector pixels e), an SVGA resolution may have an array of 800 by 600 pixels (i.e., 480,000 projector pixels e), an XGA resolution may have an array of 1024 by 768 pixels (i.e., 786,432 projector pixels e), an SXGA resolution may have an array of 1280 by 1024 pixels (i.e., 1,310,720 projector pixels e), and so on, with greater resolution projectors requiring a greater number of individual projector pixels e.
- real camera 15 is a digital camera having a camera sensor pixel array 19 symbolically shown as a dotted box and comprised of u rows and v columns of individual camera pixels g. Each camera pixel g may receive, i.e. capture, part of an emitted light ray.
- the size of camera sensor pixel array 19 again depends on the resolution of real camera 15 . However, it is common for real camera 15 to have a resolution of 4 MegaPixels (i.e., 4,194,304 camera pixels g), or greater.
- each camera pixel g within camera sensor pixel array 19 may capture part of an individually emitted light ray from a distinct projector pixel e, and since each discrete projector pixel e may emit a separate light ray, a multitude of light transport coefficients are needed to relate each discrete projector pixel e to each, and every, camera pixel g.
- a light ray emitted from a single projector pixel e may cover the entirety of camera sensor pixel array 19 , and each camera pixel g will therefore capture a different amount of the emitted light ray.
- each discrete camera pixel g will have a different light transport coefficient indicating how much of the individually emitted light ray it received.
- camera sensor pixel array 19 has 4,194,304 individual camera pixels g (i.e., has a 4 MegaPixel resolution)
- each individual projector pixel e will require a separate set of 4,194,304 individual light transport coefficients to relate it to camera sensor pixel array 19 . Therefore, millions of separately determined sets of light transport coefficients (one set per projector pixel e) will be needed to relate the entirety of projector pixel array 17 to camera sensor pixel array 19 and establish a duality relationship between real projector 11 and real camera 15 .
- each discrete projector pixel e requires a separate set of 4,194,304 individually determined light transport coefficients to relate it to real camera 15 , and since real projector 11 may have millions of discrete projector pixels e, it is beneficial to view each set of light transport coefficients as a separate array of light transport coefficients and to collect these separate arrays into a single light transport matrix (T).
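To make the scale concrete, here is a back-of-the-envelope sketch (in Python, using the XGA projector and 4-megapixel camera resolutions cited above, and hypothetically assuming one 4-byte floating-point value per light transport coefficient) of how large a dense light transport matrix T would be:

```python
# Rough size of a dense light transport matrix T: one column per projector
# pixel e, one row per camera pixel g, with an assumed 4-byte coefficient.
proj_pixels = 1024 * 768        # 786,432 projector pixels e (XGA)
cam_pixels = 4 * 1024 * 1024    # 4,194,304 camera pixels g (4 megapixels)
bytes_per_coeff = 4

total_bytes = proj_pixels * cam_pixels * bytes_per_coeff
tib = total_bytes / 2**40
print(f"dense T would need about {tib:.0f} TiB")  # about 12 TiB
```

This is the "physical and impractical" memory requirement the description refers to, and why later sections store only the non-zero coefficients together with an index per projector pixel.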
- Each array of light transport coefficients constitutes a separate column within light transport matrix T.
- each column in T constitutes a set of light transport coefficients corresponding to a separate projector pixel e.
- a light transport matrix T will be used to define the duality relationship between real projector 11 and real camera 15 .
- matrix element T ge identifies an individual light transport coefficient (within light transport matrix T) relating an individual, real projector pixel e to an individual, real camera pixel g.
- FIG. 1B The duality transformation, i.e. dual configuration, of the system of FIG. 1A is shown in FIG. 1B .
- real projector 11 of FIG. 1A is transformed into a virtual camera 11 ′′
- real camera 15 of FIG. 1A is transformed into a virtual projector 15 ′′.
- virtual camera 11 ′′ and virtual projector 15 ′′ represent the dual counterparts of real projector 11 and real camera 15 , respectively, and are not real devices themselves. That is, virtual camera 11 ′′ is a mathematical representation of how a hypothetical camera (i.e., virtual camera 11 ′′) would behave to capture a hypothetically projected dual image 13 ′′, which is similar to real image 13 projected by real projector 11 of FIG. 1A .
- virtual projector 15 ′′ is a mathematical representation of how a hypothetical projector (i.e., virtual projector 15 ′′) would behave to project hypothetical dual image 13 ′′ that substantially matches real image 13 , as captured by real camera 15 (of FIG. 1A ).
- the positions of the real projector 11 and real camera 15 of FIG. 1A are interchanged in FIG. 1B as virtual camera 11 ′′ and virtual projector 15 ′′.
- virtual camera 11 ′′ has a virtual camera sensor pixel array 17 ′′ consisting of s rows and r columns to match the resolution of projector pixel array 17 of real projector 11 .
- virtual projector 15 ′′ has a virtual projector pixel array 19 ′′ consisting of u rows and v columns to match the resolution of camera sensor pixel array 19 of real camera 15 .
- the light transport matrix T permits one to create images that appear to be captured by a projector, with a camera acting as a second projector.
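The duality relationship can be sketched numerically. The following toy example (NumPy, with made-up dimensions far smaller than any real device) treats images as flattened vectors, so the real camera captures c = T p, while the transpose of T maps an image "projected" by the virtual projector to the image "captured" by the virtual camera:

```python
import numpy as np

# Toy dimensions for illustration only.
PROJ_PIXELS = 4   # flattened projector pixel array
CAM_PIXELS = 6    # flattened camera sensor pixel array

rng = np.random.default_rng(0)
# T[g, e]: light transport coefficient relating projector pixel e
# to camera pixel g.
T = rng.random((CAM_PIXELS, PROJ_PIXELS))

# Forward direction: projecting image p yields the captured image c = T @ p.
p = rng.random(PROJ_PIXELS)
c = T @ p

# Dual direction (Helmholtz reciprocity): the real camera acts as a virtual
# projector emitting c_virtual; the real projector, acting as a virtual
# camera, "captures" T.T @ c_virtual.
c_virtual = rng.random(CAM_PIXELS)
p_dual = T.T @ c_virtual

assert c.shape == (CAM_PIXELS,) and p_dual.shape == (PROJ_PIXELS,)
```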
- the high complexity involved in generating and manipulating light transport matrix T has heretofore greatly limited its application, particularly in the field of calibrating projector-camera systems.
- projector-camera systems might be used to achieve more complex images than is typical. For example, can such systems combine multiple images from multiple projectors to create a single composite image? Alternatively, can one generate "3-D" images, or other visual effects that previously required more complex equipment and more complex projection setups? Also, can one make better use of the camera in a projector-camera system so that the camera becomes an active part of the image creation process? Furthermore, what are the implications of using a low resolution, inexpensive camera in such projector-camera systems?
- the present invention addresses the problem of how to determine what a projector needs to project in order to create a desired image by using the inverse of the light transport matrix, and its application will further simplify the calibration of projector-camera systems.
- a method of generating light transport coefficients relating a digital projector to a digital camera includes: simultaneously activating a first group of projection pixels within the projector to project a first test pattern on a projection scene, any projection pixels not in said first test pattern being maintained dark; capturing a first image of the first test pattern on the projection scene; simultaneously activating a second group of projection pixels within the projector to project a second test pattern on the projection scene, any remaining projection pixels not in the second test pattern being maintained dark, wherein the first and second groups of projection pixels have only one projection pixel in common defining a target projection pixel; capturing a second image of said second test pattern on said projection scene; comparing image pixels of the first image to corresponding image pixels of the second image and retaining the darker of two compared image pixels, the retained image pixels constituting a composite image; and identifying all non-dark image pixels in the composite image, the non-dark image pixels defining the light transport coefficients associated with the target projection pixel.
- the method of generating light transport coefficients may further include identifying the light transport coefficients for a selected number of the target projection pixels; generating an index associating each of the selected number of target projection pixels to their correspondingly associated non-zero light transport coefficients; and storing only the non-zero light transport coefficients.
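The capture-and-compare steps above can be sketched as follows. This is a minimal NumPy illustration with hypothetical 8-by-8 "camera images" and an arbitrary dark threshold; in practice the two captures would be full-resolution photographs of a column test pattern and a row test pattern sharing exactly one lit projector pixel:

```python
import numpy as np

def composite_footprint(first_image, second_image, dark_threshold=10):
    """Retain the darker of each pair of compared image pixels; the non-dark
    pixels that survive belong to the one projection pixel the two test
    patterns have in common (the target projection pixel)."""
    composite = np.minimum(first_image, second_image)
    nondark = np.argwhere(composite > dark_threshold)
    return composite, nondark

# Toy captures: a vertical pattern lighting camera column 3 and a horizontal
# pattern lighting camera row 5; their patterns overlap at one camera pixel.
col_img = np.zeros((8, 8)); col_img[:, 3] = 200.0
row_img = np.zeros((8, 8)); row_img[5, :] = 200.0

composite, nondark = composite_footprint(col_img, row_img)
assert nondark.tolist() == [[5, 3]]   # footprint of the target pixel
```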
- the light transport matrix may be written in either a full matrix format or in a simplified two-array format consisting of a first array of light transport coefficient entries and a second array maintaining a record indicating which light transport coefficients in the first array correspond to which column in the light transport matrix, so the modification process for compensating for light noise and light scattering effects can be applied in either a full matrix format or in a two-array format. In either case, the process includes imposing a display constraint, where it otherwise would not be applicable.
- the first step is to impose a simulated display constraint on a light transport matrix T of a projector-camera system in an arbitrary scene, wherein the display constraint specifies that any two distinct light rays emitted from the projector will hit the camera's image sensor at distinct parts.
- This can be done by the following method: for each row in light transport matrix T, compare matrix entries along a common row of the light transport matrix, and retain the highest valued matrix entry in the common row; populate a light transport array with the highest valued matrix entry from each row of said light transport matrix T; maintain a record indicating from which column in the light transport matrix each entry in the light transport array originated; using the record and the light transport array, extract light footprint information for each projector pixel in the projector, as needed.
- the arbitrary scene includes light scattering objects between a light path from said projector to said camera, and said method is effective for compensating light scattering effects.
- each column in the light transport matrix T corresponds to an image produced by activation of a single projector pixel in the projector, as captured by the whole of the camera's image sensor, and each light footprint information constitutes light transport values for its corresponding projector pixel.
- the light transport array and record constitute a modified light transport matrix for use in place of said light transport matrix T, and the modified light transport matrix is suitable for use in dual photography.
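A minimal sketch of this row-wise modification (NumPy; the function names are my own, not from the patent). Each camera-pixel row of T keeps only its largest coefficient, and the record of originating columns lets one recover each projector pixel's light footprint from the two-array form:

```python
import numpy as np

def impose_display_constraint(T):
    """For each row of T, retain the highest valued entry and record which
    column (projector pixel) it came from.  The pair (values, cols) is the
    two-array form of the modified light transport matrix T*."""
    values = T.max(axis=1)    # highest valued matrix entry per row
    cols = T.argmax(axis=1)   # record of the originating column
    return values, cols

def footprint(values, cols, e):
    """Rows of T* attributed to projector pixel e (its light footprint)."""
    rows = np.flatnonzero(cols == e)
    return rows, values[rows]

# Toy T: 3 camera pixels, 2 projector pixels, with overlapping footprints.
T = np.array([[0.9, 0.2],
              [0.1, 0.8],
              [0.4, 0.6]])
values, cols = impose_display_constraint(T)
assert cols.tolist() == [0, 1, 1]
rows, vals = footprint(values, cols, 1)
assert rows.tolist() == [1, 2]        # camera pixels claimed by pixel 1
```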
- the above-described method may be used in generating an estimated inverse matrix of the light transport matrix by the following step: calculate normalized light footprint information in groups corresponding to a respective projector pixel; create an intermediate array and populate the intermediate array with the calculated normalized values of its corresponding light footprint information; maintaining an intermediate record for the intermediate array for associating each entry in the intermediate array with its corresponding normalized light footprint; interpret the intermediate array and intermediate record as notation for an intermediate matrix, and apply a transpose matrix operation on said intermediate array.
- this process of calculating normalized light footprint information in groups corresponding to a respective projector pixel consists of generating a sum of the squares of the group of light transport array entries constituting one light footprint, and dividing each entry in the group of light transport array entries by the sum.
- an estimated inverse matrix of the modified light transport matrix T* can be generated by: identifying in turn each column in the modified light transport matrix T* as target column, calculating normalized values for not-nullified entry values in the target column with reference to said target column; creating an intermediate matrix of equal size as said modified light transport matrix T*; populating each column in the intermediate matrix with the calculated normalized values of its corresponding target column in the modified light transport matrix, each normalized value in each populated column in the intermediate matrix maintaining a one-to-one correspondence with the not-nullified entry values in its corresponding column in the modified light transport matrix T*; and applying a transpose matrix operation on said intermediate matrix.
- value entries in the intermediate matrix not populated with a normalized value from a not-nullified entry value in a corresponding target column in the modified light transport matrix are populated with a zero value.
- the process of calculating normalized values for not-nullified entry values in the target column with reference to the target column consists of generating a sum of the squares of only the not-nullified entry values in the target column and disregarding all nullified values in the target column, and dividing each not-nullified entry value by the sum.
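The normalization-and-transpose procedure just described can be sketched as follows (NumPy; a toy T* whose columns have disjoint non-zero supports, as the display constraint guarantees, so the estimated inverse composed with T* recovers the identity):

```python
import numpy as np

def estimated_inverse(T_star):
    """Estimated inverse of the modified light transport matrix T*: divide
    each not-nullified (non-zero) entry of a target column by the sum of
    squares of that column's not-nullified entries, leave all other entries
    zero, then apply a transpose to the intermediate matrix."""
    T_hat = np.zeros_like(T_star, dtype=float)
    for r in range(T_star.shape[1]):      # each target column T*r in turn
        col = T_star[:, r]
        nz = col != 0                     # not-nullified entry values
        ss = np.sum(col[nz] ** 2)         # sum of squares, nullified ignored
        if ss > 0:
            T_hat[nz, r] = col[nz] / ss   # normalized values
    return T_hat.T                        # transpose of intermediate matrix

# Toy T* with disjoint column supports (display constraint already imposed).
T_star = np.array([[2.0, 0.0],
                   [0.0, 3.0],
                   [0.0, 4.0]])
I = estimated_inverse(T_star) @ T_star
assert np.allclose(I, np.eye(2))
```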
- a target column in the modified light transport matrix T* is denoted as T*r, and the corresponding column in the intermediate matrix Ť is denoted as Ťr
- T*r — a target column in the modified light transport matrix T*
- Ťr — the corresponding column in Ť
- FIGS. 1A and 1B show a prior art setup for implementation of dual photography.
- FIGS. 2A and 2B show a setup for dual photography in accord with the present invention.
- FIG. 3A is an example of a light ray projected from a single projector pixel, bouncing off a scene, and forming a small light footprint on a camera sensor pixel array.
- FIG. 3B is an example of how a projected image may be represented as an array of information.
- FIG. 3C is an example of how a captured light footprint on a camera sensor pixel array may be represented as an array of information.
- FIG. 4A is an illustrative example of a projected footprint on a light sensor array within a digital camera resulting from activation of a single projector pixel in a projector.
- FIG. 4B is an illustrative example of a column of light transfer coefficients within a matrix T reflecting the example of FIG. 3A .
- FIGS. 5A and 5B show two examples of two columns of projector pixels simultaneously projected onto a scene having a checkerboard pattern.
- FIGS. 6A and 6B show two examples of two rows of projector pixels simultaneously projected onto a scene having a checkerboard pattern.
- FIG. 7A shows a generated light ray footprint resulting from a single projector pixel, as created by combining the images of FIGS. 5A and 6A .
- FIG. 7B shows a generated light ray footprint resultant from a single projector pixel, as created by combining the images of FIGS. 5B and 6B .
- FIG. 8 is a first example of an index associating projector pixels to non-zero valued light transport coefficients, as determined by a light ray footprint as generated in FIG. 7A or 7B.
- FIG. 9 is a second example of an index associating projector pixels to non-zero valued light transport coefficients, as determined by a light ray footprint as generated in FIG. 7A or 7B.
- FIG. 10A shows a real captured image taken by a real camera.
- FIG. 10B shows a dual captured image, as seen by a real projector, as generated using the method of the present invention.
- FIG. 11 is an example of a result obtained by use of homography to calibrate a projector.
- FIG. 12 shows a projected image distorted by two wine glasses placed between a projector and a scene.
- FIG. 13 is an example of two overlapping adjacent light footprints produced by two distinct projector pixels.
- FIG. 14 is an example of a process in accord with the present invention for imposing the display constraint on an arbitrary scene.
- FIG. 15 is a further step in the imposing of the display constraint as introduced in FIG. 14 .
- FIG. 16 is an example of the present invention being applied to color compensation in poster images.
- FIG. 17 shows the projection scene of FIG. 12 , but with the present invention's method of compensating for light distortion applied.
- FIG. 18 is an exemplary projection setup for using dual photography to create an immersive display system.
- FIG. 19 shows an immersive projector P 2 being used to simulate a front projector P 1 .
- FIG. 20 is an example of an image generated using the virtual projector implementation of FIG. 19 .
- FIG. 21A shows the right side of an image projected by a real front projector.
- FIG. 21B shows the corresponding left side of the image shown in FIG. 21A , but in FIG. 21B , the left side of the image is projected by an immersive projector.
- FIG. 21C shows the right side image of FIG. 21A joined to the left side image of FIG. 21B .
- FIGS. 22A and 22B show two additional examples of a left side image generated by an immersion projector joined to the right side image generated by a front projector.
- FIGS. 23A to 23C show an alternate application of the present invention to recreate in a real room a virtual image created in a virtual model room.
- FIG. 24 is an example of the application of the technique of FIGS. 23A-23C to project an image bigger than the projection space in a real room without image distortion.
- FIG. 25 is an exemplary projection system in accord with the present invention in a minimal form.
- FIG. 26 shows a prototype based on the design of FIG. 25 .
- FIG. 27 is an alternate view of the setup of FIG. 26 .
- FIG. 28A shows, under ambient lighting, a room with the projection system of FIGS. 26 and 27 installed.
- FIG. 28B shows the room of FIG. 28A under immersive projection lighting.
- FIG. 29A shows the room of FIG. 28A with an uncalibrated projected image.
- FIG. 29B shows the room of FIG. 28A with a calibrated projected image.
- FIG. 30A shows the room of FIG. 28A with an uncalibrated blank projection.
- FIG. 30B shows the room of FIG. 28A with a calibrated blank projection.
- FIG. 31 is a first step in the application of the present invention to a dome mirror projector.
- FIG. 32 is a second step in the application of the present invention to a dome mirror projector.
- FIG. 33 is a third step in the application of the present invention to a dome mirror projector.
- FIG. 34 demonstrates the application of the present invention to a desired image, to produce a precisely distorted image for projection on a dome mirror projector.
- FIG. 35 demonstrates the result of projecting the distorted image of FIG. 34 .
- FIG. 36 is an alternative design for ceiling-mounted operation.
- FIG. 37 is an alternate configuration of the present invention.
- FIG. 38 is still another configuration of the present invention.
- FIG. 39 shows a setup for mosaicing two, or more, projector-camera systems to create a composite image.
- FIG. 40 shows a composite of images from multiple projector-camera systems.
- FIG. 41 shows an image as projected by a first of two projector-camera systems in a multi-projector-camera system.
- FIG. 42 shows an image as projected by a second of two projector-camera systems in a multi-projector-camera system.
- FIG. 43 is an example of the present invention applied to the projection of a checkerboard pattern onto a complex surface.
- FIG. 44A shows a design that uses a single curved mirror 125 and multiple projector-camera pairs 145 .
- FIG. 44B shows a design that uses a single mirror pyramid 151 and multiple projector-camera pairs 145 .
- FIG. 45 shows how multiple large FOV projectors 153 a and 153 b can be used to achieve an even larger overall projection FOV.
- FIG. 46 shows an image generation configuration in accord with the present invention.
- FIG. 47 illustrates an example of a pre-processing display pipeline.
- FIG. 48 illustrates an example of a post-processing display pipeline.
- FIG. 49 illustrates a display pipeline having both pre-processing and post-processing modules.
- FIG. 50 is a simplified configuration of the display pipeline of FIG. 49 in accord with the present invention.
- FIG. 51 is a further simplified configuration of the display pipeline of FIG. 49 in accord with the present invention.
- projector-camera systems could be treated like multi-camera systems, and standard camera calibration techniques (described above) might be used to calibrate projector-camera systems.
- if a projector could be treated as a pseudo-camera, it could be calibrated along with a real camera in a manner similar to the camera calibration stage of the multi-camera system described above, and the “bootstrapping” projector calibration stage previously used for calibrating projector-camera systems might be eliminated.
- an imaging setup in accord with the present invention may include a real projector 21 and a real camera 25 .
- Real projector 21 is preferably a digital projector and has an imaging element including an imaging projection array (i.e., projector pixel array 27 ), consisting of p rows and q columns of individual imaging projection elements (i.e., projector pixels j).
- Projector pixel array 27 is internal to real projector 21 , and is shown for discussion purposes as crossed lines within a dotted square in FIG. 2A .
- Real projector 21 is preferably of the liquid crystal display (LCD) type, digital light processing (DLP) type, liquid crystal on silicon (LCOS) type, or other digital projection technology type.
- real camera 25 is a digital camera having an image sensor including a camera sensor pixel array 29 (i.e., light receptor array or image sensor array), consisting of m rows by n columns of individual camera pixels i (i.e., image sensor elements or light receptor pixels).
- camera sensor pixel array 29 is shown on real camera 25 , but it is to be understood that camera sensor pixel array 29 is internal to real camera 25 .
- This physical setup using real projector 21 and real camera 25 is referred to herein as the ‘primal’ setup.
- Light rays emitted from individual projector pixels j within real projector 21 form real image 23 by bouncing off a projection surface (i.e., display environment or scene), which may have an irregular or flat shape, and some of the light rays eventually reach the image sensor within real camera 25 .
- each light ray is dispersed, reflected, and refracted in the scene and hits the camera's image sensor at a number of different locations throughout camera sensor pixel array 29 .
- the individually projected light ray forms an m-by-n image on camera sensor pixel array 29 , with each individual camera pixel i receiving a certain amount of light intensity contribution from the projected light ray.
- each light ray emitted from each individual projector pixel j generates a different set, or array, of individual light transport coefficients, one for each camera pixel i within camera sensor pixel array 29 . Consequently, each set (i.e., array) will consist of (m × n) [i.e., m multiplied by n] individual light transport coefficients, one for each camera pixel i.
- each set of light transport coefficients is arranged as a column of coefficients to form a composite light transport matrix, T, then the composite light transport matrix T will have a different column of light transport coefficients for each individual projector pixel j. Furthermore, since there is a one-to-one correspondence between each light transport coefficient entry (i.e., matrix element) within each column and each camera pixel i, each column represents the entire image captured at camera 25 resulting from a single projector pixel j being turned ON. Accordingly, the entire (i.e., composite) light transport matrix T will consist of (p × q) [i.e., p multiplied by q] columns (one column [i.e., captured image] for each individually turned ON projector pixel j) and (m × n) rows (one row for each individual camera pixel i).
- Rprjct: the real projection vector.
- Rcptr: the real captured image vector.
- since each projector pixel j results in a light ray that is scattered across the entire camera sensor pixel array 29 , each individual camera pixel i will have a differently valued light transport coefficient indicative of an intensity value of a predefined light characteristic it received from each individual projector pixel j.
- this light characteristic is preferably a measure of light intensity. Therefore, each projector pixel j will result in a column of (m × n) individual light transport coefficients, each coefficient indicating an amount of light intensity received by each camera pixel i.
- Since real projector 21 has (p × q) projector pixels j, light transport matrix T will have (p × q) columns [one for each projector pixel j] and (m × n) rows [one for each camera pixel i] of individual light transport coefficients. Thus, light transport matrix T has traditionally been necessarily huge, consisting of (p × q × m × n) individual light transport coefficients.
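The captured-image relation described above can be sketched numerically. The following is a minimal NumPy sketch with toy resolutions; the random matrix merely stands in for physically measured transport coefficients, and all sizes and names are illustrative, not taken from the patent:

```python
import numpy as np

# Illustrative, tiny resolutions (real systems are far larger).
p, q = 4, 5    # projector pixel array: p rows x q columns
m, n = 8, 10   # camera sensor array:  m rows x n columns

rng = np.random.default_rng(0)

# T: (m*n) x (p*q). Entry T[i, j] is the light transport coefficient
# from projector pixel j to camera pixel i; random values stand in
# for physically measured coefficients.
T = rng.random((m * n, p * q))

# Rprjct: a projected image flattened into a (p*q)-element vector.
Rprjct = rng.random(p * q)

# The captured image is the linear superposition of every projector
# pixel's contribution: Rcptr = T @ Rprjct.
Rcptr = T @ Rprjct
print(Rcptr.shape)  # (80,)
```

Each column of `T` is exactly the (flattened) camera image that one projector pixel would produce on its own, which is what the capture procedures below exploit.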
- a “dual” setup is one where real projector 21 is replaced by a virtual camera 21 ′′ having a virtual camera sensor pixel array 27 ′′ of equal size as real projector pixel array 27 .
- virtual camera 21 ′′ has a virtual camera sensor pixel array 27 ′′ comprised of p-rows by q-columns of virtual camera pixels j′′.
- real camera 25 is replaced by a virtual projector 25 ′′ having a virtual projector pixel array 29 ′′ of equal size as real camera sensor pixel array 29 . Therefore, virtual projector 25 ′′ has a virtual projector pixel array 29 ′′ comprised of m-rows by n-columns of virtual projector pixels i′′.
- a virtual image 23 ′′ (as projected by virtual projector 25 ′′) would be represented by an (m × n) virtual projection vector, Vprjct′′.
- a virtually captured image captured by virtual camera 21 ′′ would be represented by a (p × q) virtual captured image vector, Vcptr′′. Therefore in this dual setup, virtual captured image vector Vcptr′′ relates to virtual projection vector Vprjct′′ by a “dual” light transport matrix T′′. By the principle of Helmholtz reciprocity, the light transport is equal in both directions.
- T′′: the dual light transport matrix for the dual setup (i.e., the duality transformation setup).
- Vcptr′′: the virtual captured image vector.
- Vprjct′′: the virtual projection vector.
- the transpose matrix operation of a general [x × y] matrix A is denoted by the addition of a superscript letter “T” (such as A^T, for example) and is defined as the [y × x] matrix whose first column is the first row of matrix A, whose second column is the second row of matrix A, whose third column is the third row of matrix A, and so on.
- this matrix operation simply flips the original matrix A about its first element, such that its first element (i.e., at position (1,1)) remains unchanged and the bottom of the first column becomes the end of the first row.
- the dual light transport matrix T′′ for the dual setup is readily computable by flipping the real light transport matrix T to obtain its transpose, T^T, as described.
- dual light transport matrix T′′ and transposed light transport matrix T^T may be used interchangeably.
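A minimal sketch of the duality transformation, under the same toy-size assumptions as before (the random matrix is a stand-in for a measured T):

```python
import numpy as np

rng = np.random.default_rng(1)
pq, mn = 6, 12            # projector pixel count, camera pixel count
T = rng.random((mn, pq))  # real transport: Rcptr = T @ Rprjct

# In the dual setup the roles swap: the virtual projector has m*n
# pixels, the virtual camera has p*q pixels, and by Helmholtz
# reciprocity the dual transport matrix is simply the transpose of T.
Vprjct = rng.random(mn)   # virtual projection vector Vprjct''
Vcptr = T.T @ Vprjct      # virtual captured image vector Vcptr''

print(Vcptr.shape)  # (6,)
```

No new measurement is needed to obtain the dual setup: `T.T` is a pure bookkeeping operation on the already-acquired matrix.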
- real light transport matrix T holds the individual light transport coefficients corresponding between each individual projector pixel j of real projector 21 and all the individual camera pixels i of real camera 25 . Therefore, a determination of each individual light transport coefficient corresponding between an individual projector pixel j and all camera pixels i should avoid light ray contributions from other projector pixels j in projector pixel array 27 .
- an initial, real projection image_j is created by setting a j th projector pixel to a value of 1 (i.e., it is turned ON) and setting to a zero value (i.e., turning OFF) all other elements in projector pixel array 27 .
- the j th projector pixel is the projector pixel under-test for which the light transport coefficients are to be determined.
- These test conditions are annotated as a real projection vector Rprjct_j that has a value of one at an entry location associated with the j th projector pixel, and has all other entry locations corresponding to all other projector pixels set to a value of zero.
- One may then capture real projection image_j to determine its corresponding (m ⁇ n) real captured image vector Rcptr_j, which defines the j th column of matrix T.
- This method of acquiring a column of light transport coefficients for matrix T for a given j th projector pixel suggests that a systematic method for capturing the entire matrix T is to sequentially turn ON each projector pixel j of real projector 21 (one projector pixel at a time), and to capture its corresponding real image, Rcptr_j, with real camera 25 .
- all the captured image vectors Rcptr_ 1 to Rcptr_(p × q) are grouped to form matrix T.
- Each captured image vector Rcptr_j constitutes a column of light transport coefficient entries in matrix T. This results in a matrix T having (p × q) columns and (m × n) rows of individual light transport coefficients.
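The brute-force acquisition procedure above can be sketched as a simulation; here a hidden matrix plays the role of the physical scene, and `project_and_capture` stands in for the projector-camera hardware (both names are illustrative):

```python
import numpy as np

def capture_T_brute_force(project_and_capture, num_proj_pixels, num_cam_pixels):
    """Acquire T one column at a time: turn ON a single projector pixel j,
    capture the resulting image vector Rcptr_j, and store it as column j."""
    T = np.zeros((num_cam_pixels, num_proj_pixels))
    for j in range(num_proj_pixels):
        Rprjct_j = np.zeros(num_proj_pixels)
        Rprjct_j[j] = 1.0                        # only pixel j is ON
        T[:, j] = project_and_capture(Rprjct_j)  # captured image Rcptr_j
    return T

# Simulated projector-camera pair: a hidden matrix plays the scene.
rng = np.random.default_rng(2)
hidden_T = rng.random((12, 6))
T = capture_T_brute_force(lambda v: hidden_T @ v, 6, 12)
assert np.allclose(T, hidden_T)  # 6 = p*q captures recover all of T
```

This makes the cost concrete: the loop runs once per projector pixel, i.e., (p × q) projection-and-capture steps, which is exactly what the method described next reduces to at most (p + q).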
- a feature of the present invention proposes a method of reducing the number of procedural steps in the determination of real light transport matrix T. That is, instead of requiring (p × q) image projection-and-capture steps (and the storing of p × q captured images), the presently proposed method captures at most “p plus q” [i.e., (p+q)] images, and under specific circumstances, described below, the number can be further reduced dramatically.
- the present method is based on the following assumption: for most projector-camera display applications, any two distinct light rays b and c emitted from real projector 21 will typically hit camera sensor pixel array 29 at distinct parts. That is, any light ray emitted from a projector pixel j will not be dispersed across the entirety of camera sensor pixel array 29 , but will be primarily concentrated within a small group of camera pixels i forming a light footprint within a distinct region of camera sensor pixel array 29 . Furthermore, there will be no, or negligible, overlap in adjacent light footprints (i.e., negligible overlap in the camera pixels i hit by distinct projected light rays from distinct projector pixels j). This ideal light ray distribution condition may be assumed when the resolution of real camera 25 is much higher (i.e., at least two times higher) than the resolution of real projector 21 .
- the ideal light ray distribution condition may not be achievable when the resolution of real camera 25 is comparable to, or smaller than, the resolution of real projector 21 , or when a light diffusing material is located in the light path of a projected light ray from real projector 21 to camera sensor pixel array 29 .
- the projected light rays will be diffused and the likelihood of significant light overlap between the different light rays at camera sensor pixel array 29 is greatly increased.
- a display setup designed to ensure high resolution projections is likely to be devoid of such light diffusing material, and it is virtually guaranteed that each projected light footprint will be substantially distinct from the next. That is, in venues, or settings, where high resolution projections are desired, it is likely that the venue will be clear of light diffusing articles along the light path of a projected image.
- the pixel resolution of real camera 25 is at least four times greater than that of real projector 21 and that the projection scene is devoid of any light diffusing material, such that the requirements for the ideal light ray distribution condition are met.
- alternative embodiments for more general situations where the resolution of real camera 25 is not restricted to be greater than real projector 21 , or where the light path between real projector 21 and real camera 25 is not limited to be devoid of light diffusing material are provided below.
- A first pictorial representation of the present assumptions and vector generation method is shown in FIGS. 3A to 3C .
- an illustrative example of the present invention shows real projector pixel array 27 having a pixel resolution smaller than that of real camera sensor pixel array 29 .
- each square within projector pixel array 27 represents a different projector pixel j.
- the created light ray footprint Ft 1 on camera sensor pixel array 29 covers several camera pixels i, with the camera pixels at the center of light ray footprint Ft 1 (indicated as white squares) receiving the most light intensity, and the camera pixels i along the periphery of light ray footprint Ft 1 (indicated as light gray squares) receiving less light intensity, and those camera pixels i not within light ray footprint Ft 1 (indicated as dark gray squares) receiving no light.
- projector pixel array 27 is first separated into six rows, row_ 1 to row_ 6 , and each row is then turned 90 degrees to form six column segments col_seg_r 1 to col_seg_r 6 .
- element col_ 1 _row_ 19 (i.e., the 19 th element down from the top of column col_ 1 ) corresponds to a projector pixel j at array location (5,3) of projector pixel array 27 .
- Element col_ 1 _row_ 19 is shown as a white square, while all other elements are shown as darkened squares, to indicate that element col_ 1 _row_ 19 is the only projector pixel turned ON.
- A similar process is applied to camera sensor pixel array 29 in FIG. 3C to construct real captured image vector Rcptr.
- camera sensor pixel array 29 would be broken up into m rows (i.e., 12 rows) that are each turned 90 degrees to form 12 column segments, not shown.
- the 12 column segments are joined end-to-end to form one composite column cmra_col_ 1 (i.e., vector Rcptr) of (m × n) elements (i.e., (12 × 14) or 168 elements).
- the light ray emitted from projector pixel j at array location (5,3) of projector pixel array 27 is assumed to form light ray footprint Ft 1 on camera sensor pixel array 29 .
- the span of light ray footprint Ft 1 is indicated by a circle, and encompasses twelve camera pixels i on four different rows of camera sensor pixel array 29 .
- the camera pixels at the center of light footprint circle Ft 1 receive the most light intensity, and are identified as white squares, and the camera pixels along the perimeter of light footprint circle Ft 1 receive less light intensity and are identified as light gray squares. Those camera pixels not within light ray footprint Ft 1 are shown as dark gray squares. As is illustrated in FIG. 3C , these white and light gray squares constitute nonzero elements within the captured image, and are shown as white and light gray squares (i.e., nonzero) valued NZ elements interspersed between many zero valued elements (i.e., dark gray squares) within vector Rcptr (i.e., column cmra_col_ 1 ).
- A second example, providing a close-up view of how the light footprint of a single light ray from a projector pixel j may cover several camera pixels i, is shown in FIGS. 4A and 4B .
- In FIG. 4A , a partial view of another exemplary camera sensor pixel array 29 shows individual camera pixels i numbered horizontally from 1 to n on the first row, continuing with (n+1) to (2n) on the second row, and (2n+1) to (3n) on the third row, and so on. Following this sequence, it is to be understood that camera pixels i along the bottom-most row would be numbered from ((m−1)n+1) to (mn).
- a second light ray footprint Ft 2 of a single light ray from another exemplary, single, projector pixel j impacting camera sensor pixel array 29 is denoted as a circle.
- those camera pixels i not within light ray footprint Ft 2 [i.e., those camera pixels i not hit by the single light ray emitted from the j th projector pixel] are shown as deeply darkened; those camera pixels i partly covered by footprint Ft 2 are shown as lightly darkened; and those camera pixels i completely within footprint Ft 2 are shown as having no darkening.
- each camera pixel i that is at least partially covered by light ray footprint Ft 2 will register a light intensity value proportional to the amount of light it receives.
- This light intensity value may be assigned as the light transfer coefficient for that individual camera pixel i.
- the light transport coefficient of each camera pixel i may be made proportional to the light intensity value registered by the individual camera pixel i. Nonetheless, those camera pixels i that are not directly hit by the projection light ray from the j th projector pixel should have a light intensity value of zero (or close to zero, or below a predefined threshold light intensity value), and their corresponding light transport coefficient should likewise have a value of zero (or close to zero).
- In FIG. 4B , an example of a captured image vector Rcptr_j [or j th column of matrix T], as would correspond to the footprint Ft 2 example of FIG. 4A , is shown.
- This j th column of matrix T is illustratively shown as a numbered vertical sequence of light transport coefficients, each corresponding to the numbered camera pixel i of camera sensor pixel array 29 of FIG. 4A .
- the numerical sequence of capture vector Rcptr_j preferably follows the horizontally numbered sequence of individual camera pixels i in camera sensor pixel array 29 shown in FIG. 4A .
- Camera pixels within light ray footprint Ft 2 register non-zero (NZ) light transport coefficient values; all other camera pixels have “ZERO” values for light transport coefficients.
- NZ represents any non-zero light transport coefficient value, and that this value would be related to the amount of light intensity received by the corresponding camera pixel i. Since light ray footprint Ft 2 spans several rows of camera sensor pixel array 29 , each row is sequentially listed in captured image vector Rcptr_j, with several long sequences of zero valued light transport coefficients interspersed between a few non-zero, NZ, valued light transport coefficients.
- first set of projector pixels S 1 within imaging projector pixel array 27 maps to a corresponding set of columns (one per projector pixel) in light transport matrix T [i.e., S1 ⊂ {1, . . . , (p × q)}]. Furthermore, it is assumed that the first set of projector pixels S 1 includes target projector pixel j, i.e., the target projector pixel under test.
- Let Rcptr_S 1 be a first real image captured by real camera 25 of a projected image created by the simultaneous activation of the first set of projector pixels S 1 .
- Let Rcptr_S 2 be a second real image captured by real camera 25 of a projected image created by the simultaneous activation of the second set of projector pixels S 2 .
- the light transport coefficients of the j th column of light transport matrix T (which corresponds to the target projector pixel under test, i.e., projector pixel j) may be directly obtained from real captured images Rcptr_S 1 and Rcptr_S 2 by identifying the one light ray footprint they share in common (i.e., similar to light ray footprints Ft 1 or Ft 2 in FIGS. 3A , 3 C or 4 A).
- This common light ray footprint would correspond to a light ray emitted from target projector pixel j, which is the only lit projector pixel j shared in common among first set S 1 and second set S 2 .
- the next step is to determine how to identify the one light ray footprint commonly shared by both real captured images Rcptr_S 1 and Rcptr_S 2 .
- a method of identifying this common light ray footprint is to conduct a pixel-by-pixel comparison of both captured images Rcptr_S 1 and Rcptr_S 2 , and to identify the dimmer of the two compared pixels. For example, in first captured image Rcptr_S 1 only sensor pixels within individual light ray footprints, each corresponding to the simultaneous lighting of the first set of projector pixels S 1 , will have non-zero (NZ) light intensity values, and all other pixels in captured image Rcptr_S 1 will have zero (i.e., dark) values.
- the intersection of the lit regions (i.e., light ray footprints) of Rcptr_S 1 and Rcptr_S 2 is identified, and this identified light ray footprint will correspond to the target projector pixel under-test, j.
- a method of accomplishing this is to conduct a pixel-by-pixel comparison of both captured images Rcptr_S 1 and Rcptr_S 2 , and to retain only the darker (i.e., dimmer) of the two compared pixels.
- This process may be expressed as: Tj ≈ MIN(Rcptr_S1, Rcptr_S2), where Tj is the j th column of matrix T, and “MIN” indicates that the lower valued camera pixel (i.e., the darker camera pixel having a lower captured light intensity value) in Rcptr_S 1 and Rcptr_S 2 is retained, and the higher valued (i.e., brighter) camera pixel is discarded. In this way, the only high intensity values that are retained correspond to a light ray footprint common to both S 1 and S 2 .
- the operation MIN(Rcptr_S 1 , Rcptr_S 2 ) provides an image where only pixels in set L [i.e., ∈ L] are lit, which is a good approximation of Tj, i.e., the j th column in matrix T.
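The MIN operation above can be sketched on a tiny simulated example. Here the footprint positions and intensities are made up purely for illustration; the point is that only the footprint shared by both captures survives the pixel-wise minimum:

```python
import numpy as np

# Simulated captures over a strip of 12 camera pixels. Set S1 lights
# projector pixels whose footprints fall around camera pixels 3 and 9;
# set S2 lights pixels with footprints around camera pixels 3 and 6.
# The target projector pixel j is the only member of both sets, so only
# its footprint (around camera pixel 3) is bright in BOTH images.
Rcptr_S1 = np.array([0, 0, .4, .9, .4, 0, 0, 0, .3, .8, .3, 0])
Rcptr_S2 = np.array([0, 0, .4, .9, .4, .2, .7, .2, 0, 0, 0, 0])

# Tj ~ MIN(Rcptr_S1, Rcptr_S2): keep the dimmer pixel of each pair.
Tj = np.minimum(Rcptr_S1, Rcptr_S2)
print(np.flatnonzero(Tj))  # [2 3 4] -- only the shared footprint survives
```

Because the footprints of distinct projector pixels are assumed not to overlap, every camera pixel outside the shared footprint is zero in at least one of the two captures, so the minimum zeroes it out.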
- the light transport coefficients for any individual projector pixel j may be obtained by comparing both collections and identifying the region L where a captured image of a lit column intersect a captured image of a lit row, the intersection corresponding to a light ray projected by activation of projector pixel j, alone.
- a method of determining transport matrix T is to collect a first set of images Rcptr_Sy_ 1 to Rcptr_Sy_q, corresponding to q captured images of q lit columns of projector pixels, and construct a second set of images Rcptr_Sx_ 1 to Rcptr_Sx_p corresponding to p captured images of p lit rows of projector pixels.
- the intersection of the captured image of a vertical light line and the captured image of a horizontal light line would include all the camera pixels i that correspond to a target projector pixel under-test, j (i.e., all camera pixels i that lie within a light ray footprint created by a light ray emitted from projector pixel under-test j).
- a scheme that satisfies this property is to use pixel coordinates: let Rprjct_Sx_j be a first projected image such that only pixels with an x-coordinate equal to j are turned ON, and let Rprjct_Sy_k be a second projected image such that only pixels with a y-coordinate equal to k are turned ON.
- a scene, or display environment is illustratively shown as a flat surface 41 with a checkerboard pattern.
- the checkerboard pattern is shown purely to facilitate the present description by providing a contrast for projected light lines, and the display environment need not have any pattern and may be of irregular shape.
- the relative locations of each vertical light line and horizontal light line are known, since it is known which projector pixels were turned ON in their creation, and their known relative displacement may be used to calibrate real projector 21 , as is more fully explained below.
- a bright vertical line, or vertical light beam, 47 — k (i.e., column of light rays emitted simultaneously from a column of projection pixels), is projected onto surface 41 by real projector 21 .
- bright vertical line 47 — k is generated by turning ON all projector pixels within projector pixel array 27 that have a y-coordinate equal to k.
- Real camera 25 then captures this real image, Rcptr_Sy_k, as one example of a lit column of projector pixels.
- real projector 21 projects a second bright vertical line 47 — t of light rays onto surface 41 .
- bright vertical line 47 — t is generated by turning ON all projector pixels having a y-coordinate equal to t.
- Real camera 25 then captures this image, Rcptr_Sy_t, as another example of a lit column of projector pixels. It is to be understood that real projector 21 could project a separate bright vertical line of light rays for each of the q columns of projector pixel array 27 , and real camera 25 could capture a separate image of each projected bright vertical line.
- real projector 21 preferably projects a bright horizontal line, i.e. horizontal light beam, 49 — j made up of light rays emitted simultaneously from a row of projection pixels onto projection surface 41 .
- Bright horizontal line 49 — j may be generated by turning ON all projector pixels having an x-coordinate equal to j.
- Real camera 25 then captures this real image, Rcptr_Sx_j, as one example of a lit row of projector pixels.
- real projector 21 projects a second bright horizontal line 49 — r (made up of simultaneously lit, individual light rays) onto surface 41 .
- bright horizontal line 49 — r may be generated by turning ON all projector pixels having an x-coordinate equal to r.
- Real camera 25 then captures this real image, Rcptr_Sx_r, as another example of a lit row of projector pixels. It is to be understood that real projector 21 could project a separate horizontal line of light rays for each of the p rows in projector pixel array 27 , and real camera 25 could capture a separate image of each projected bright horizontal line of light rays.
- Within the intersection region of the vertical and horizontal light beams, both compared image pixels are brightly lit pixels showing an impact by a light ray. Comparison of these two image pixels within this intersection region will result in either of the two bright beam pixels being selected for image 41 ′. As a result, image 41 ′ will show a brightly lit region 53 corresponding to a projected light ray emitted from coordinates (j,k) of projector pixel array 27 .
- the light transport coefficients for the projector pixel having coordinates (j,k) can be extracted from generated image 53 without having to physically capture an image of a light ray individually projected from the projector pixel at (j,k).
- FIG. 7B A second example is shown in FIG. 7B , where the combination of real captured images corresponding to FIGS. 5B and 6B (which would respectively correspond to real captured images Rcptr_Sy_t and Rcptr_Sx_r following the above-described naming convention), results in a second brightly lit region 55 corresponding to a projected light ray emitted from coordinates (r,t) of projector pixel array 27 .
- a similar process may be followed to identify the light transport coefficients of every projector pixel j in projector pixel array 27 without having to individually turn ON and project each projector pixel j, one-at-a-time.
- This method of generating an image of a hypothetically, singularly activated projector pixel to obtain the projector pixel's light transport coefficients requires at most only (p+q) captured images, one for each row and column of projector pixels in projector pixel array 27 of real projector 21 .
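The full (p + q) capture scheme described above can be sketched end-to-end in simulation. This sketch assumes the ideal light ray distribution condition (disjoint footprints) so the MIN identity holds exactly; the ground-truth matrix and footprint layout are invented for illustration:

```python
import numpy as np

p, q = 3, 4                  # projector rows x columns
mn = 2 * p * q               # camera pixels; 2 per (disjoint) footprint
rng = np.random.default_rng(3)

# Ground-truth columns of T with non-overlapping footprints, as the
# ideal light ray distribution condition assumes.
true_T = np.zeros((mn, p * q))
for j in range(p * q):
    true_T[2 * j : 2 * j + 2, j] = rng.random(2) + 0.1

# Capture only p row images and q column images: lighting a whole row
# (or column) of projector pixels superposes their footprints.
row_imgs = [true_T[:, x * q : (x + 1) * q].sum(axis=1) for x in range(p)]
col_imgs = [true_T[:, y::q].sum(axis=1) for y in range(q)]

def column_of_T(x, y):
    """Synthesize the T column of projector pixel (x, y) as the
    pixel-wise MIN of its row image and its column image."""
    return np.minimum(row_imgs[x], col_imgs[y])

# Any of the p*q columns is recoverable from just p + q captures.
assert np.allclose(column_of_T(1, 2), true_T[:, 1 * q + 2])
```

Only `p + q = 7` simulated captures are stored, yet every one of the `p * q = 12` columns of T can be synthesized on demand.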
- the (p+q) captured images may be discarded, and all that needs to be saved is an index and corresponding footprint information.
- An example of this approach is shown in FIG. 8 , where a partial view of projector pixel array 27 (illustrated as a crosshatch pattern) is compared to a partial view of camera sensor pixel array 29 (also illustrated as a crosshatch pattern).
- the cross-hatch pattern illustrating the partial camera sensor pixel array 29 is made denser than the cross-hatch pattern representing projector pixel array 27 in order to better illustrate that, in the present example, pixel density (i.e., resolution) of real camera 25 is preferably greater than the resolution of real projector 21 , and thus a light ray emitted from a single projector pixel j may create a light footprint (such as F 1 ) spanning several camera pixels i.
- an index of real projector pixel array 27 is represented as a partial array with circles 1 , 2 , 3 , . . . (q+1) . . . (2q+1) . . . etc. representing individual projector pixels j.
- real camera sensor pixel array 29 is shown superimposed by a corresponding partial array of circular light footprints F 1 , F 2 , F 3 , . . . F(q+1), . . . etc. representing the footprint information corresponding to individually activated projector pixels j (assuming light scattering is ignored).
- footprints F 1 , F 2 , F 3 , . . . F(q+1), . . . etc. respectively correspond to projector pixels 1 , 2 , 3 , . . . (q+1), etc.
- a first set of information corresponds to an index of projector pixels j and a second set of information corresponds to footprint information associating groups of camera pixels i with each projector pixel j.
- zero-valued coefficients need not be stored, which greatly reduces the memory requirements.
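The index-plus-footprint storage idea can be sketched as a simple sparse representation; the function name, dictionary layout, and toy matrix below are illustrative assumptions, not the patent's data structure:

```python
import numpy as np

def to_sparse_footprints(T, threshold=0.0):
    """Keep, per projector pixel j, only the indices of the camera pixels
    in its light footprint and their coefficient values; zero-valued
    coefficients (pixels outside the footprint) are simply not stored."""
    footprints = {}
    for j in range(T.shape[1]):
        idx = np.flatnonzero(T[:, j] > threshold)
        footprints[j] = (idx, T[idx, j])
    return footprints

# A mostly-zero toy T: projector pixel 0 lights camera pixels 2-3,
# projector pixel 1 lights camera pixel 5.
T = np.zeros((8, 2))
T[2:4, 0] = [0.9, 0.4]
T[5, 1] = 0.7
fp = to_sparse_footprints(T)
print(fp[0][0])  # [2 3] -- only footprint entries are retained
```

Memory then scales with the number of footprint pixels rather than with (p × q × m × n); the `threshold` parameter is a hypothetical knob for discarding near-zero sensor noise.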
- A second example of organizing this information is shown in FIG. 9 , where an index 61 of projector pixels j is shown to point to, or correspond to, group 63 of grayscale (i.e., non-zero valued, or “NZ Grayscale”) camera pixel i information (i.e., corresponding to a resultant light ray footprint).
- a light transport matrix T can be very large, and its use (or the use of its transpose, the dual light transport matrix T^T) requires large amounts of active memory (for example, DRAM) and excessive computational processing power/time. Therefore, general use of the dual image has heretofore not been practical.
- T^T is the matrix transpose of light transport matrix T (i.e., matrix T turned on its diagonal), and the values of row T^T_j (where j is any value from 1 to (p × q)) therefore correspond to the j th column of matrix T (i.e., T COL _j). Since each column of T has (m × n) elements (i.e., equivalent to the pixel resolution of real camera 25 ), this would appear to be a very large number of elements. However, recall that in the present implementation only a limited number of elements in each column of matrix T are non-zero (i.e., only those corresponding to camera sensor pixels i upon which shone the intersection of a vertical and a horizontal light beam).
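Because each column of T holds only a few nonzero footprint entries, a dual image can be computed without ever materializing the full matrix. A minimal sketch, assuming a sparse per-pixel footprint store (the dictionary layout and names are illustrative):

```python
import numpy as np

def dual_image(footprints, Vprjct, num_proj_pixels):
    """Compute Vcptr'' = T^T @ Vprjct'' from a sparse footprint store:
    row j of T^T is column j of T, so each dual pixel j is just the dot
    product of pixel j's few nonzero coefficients with the matching
    entries of the (virtually) projected image."""
    Vcptr = np.zeros(num_proj_pixels)
    for j, (idx, coef) in footprints.items():
        Vcptr[j] = coef @ Vprjct[idx]
    return Vcptr

# Sparse stand-in for T: projector pixel 0 -> camera pixels {1, 2},
# projector pixel 1 -> camera pixel {4}.
footprints = {0: (np.array([1, 2]), np.array([0.5, 0.25])),
              1: (np.array([4]), np.array([0.8]))}
Vprjct = np.arange(6, dtype=float)  # virtual projection over 6 pixels
result = dual_image(footprints, Vprjct, 2)
# pixel 0: 0.5*1 + 0.25*2 = 1.0;  pixel 1: 0.8*4 = 3.2
```

The work per dual pixel is proportional to its footprint size rather than to (m × n), which is what makes dual-image generation tractable here.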
- An example of a dual image generated using this method is shown in FIGS. 10A and 10B .
- FIG. 10A shows a primal image, as projected by a real projector.
- FIG. 10B shows the resultant dual image computed by an implementation of the present method.
- the dual image of FIG. 10B represents the image virtually captured by real projector 21 (i.e., virtual camera 21 ′′), or stated differently, the image as “seen” by real projector 21 .
- the above discussion shows how to compute dual images efficiently from a reduced set of images, which saves image capture time as well as computation time.
- the real captured images and dual captured images can be used to calibrate both real camera 25 and real projector 21 , respectively.
- the real camera can be calibrated by using the known dimensions of the object to compensate for distortions in the captured images arising from the different angle views.
- the virtual images, as seen by the real projector can then be generated from the same captured images using dual photography techniques, as described above, and the real projector may be calibrated in a manner analogous to the real camera.
- a possible setback associated with this straightforward method is the difficulty in generating and manipulating the light transport matrix T, and operating on the large image vectors resulting from the large number of camera and projector image pixels.
- this labor-intensive and expensive process is mitigated substantially by using the dual photography method described above. For purposes of calibrating real projector 21 in a projector-camera system, such as that shown in FIG. 2A , Applicants have developed a novel method that avoids the need for generating a full T matrix and for creating a full dual image, while still taking advantage of some of the benefits of using a dual image (i.e., an image as “seen” by real projector 21 ) to facilitate calibration of real projector 21 .
- the generation of a full T matrix can be avoided altogether by noting that, to calibrate the projector, one does not need to construct an entire dual image, but only needs to determine the location of a strategic set of points, particularly if the projection surface is not very irregular.
- the strategic set of points may be the corners of the squares within the checkerboard pattern on flat surface 41 of FIG. 5A , as seen by the projector. The greater the irregularity of the projection scene surface, the greater the number of points in the strategically selected set of points. Conversely, the flatter the projection scene, the fewer the number of points in the strategic set of points.
- Applicants have adapted a homography-based method to achieve this goal of a reduced number of points in the strategic set, and thus avoid the generation and manipulation of a full T matrix (or the simplified T matrix described above), and further avoid the full dual-image generation process by incorporating some features of dual-image generation into the calibration process itself.
- This alternate embodiment of the present invention directly computes the coordinates of the checker corner features (or other known feature) across the projector-view images without requiring the construction of the dual images and the detection of the corners from the constructed dual images.
- a checkerboard is used purely for illustrative purposes; any scene may be captured.
- the projector-camera system need not be calibrated beforehand. Rather, all elements of the projector-camera system are calibrated at the same time.
- projector images follow the so-called perspective projection model, which relates two (or more) views of a single scene as seen by two (or more) separated sources. That is, different viewing sources will “see” a different view (or image) of the same scene since the different sources are located at different angles to the scene.
- since there is only one real scene, irrespective of the number of views of that scene, one can generate a mathematical relationship between the different views that associates any point on any one view with a corresponding point on the real scene (and thereby with a corresponding point in each of the other views).
- the perspective projection model (which relates the two views to the common, real scene) would permit one to extract from the captured real image some information relating to the virtual image, without generating a full dual image.
- the relationship between two image projections of a planar object from different views is a simple linear projective transformation or homography.
- This transformation relates the coordinates of any point on the planar object (i.e., a homogeneous coordinate) to the coordinates of a corresponding point on a specific view of the planar object.
- the projector-view image of the planar checkerboard is a homography of the corresponding camera image.
- 10 white points are preferably projected on a scene, such as the checkerboard pattern.
- An image of the checkerboard with the projected white points is then captured using a real camera, such as real camera 25 , and the coordinates of the 10 points in the camera image are computed.
- since the projector projects the points in a known relation to each other, the coordinates of the points in the projected image are known. This results in 10 pairs of corresponding coordinates: one set as captured by the real camera and a second set as projected by the real projector.
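Given such pairs of corresponding coordinates, the camera-to-projector homography can be estimated with the standard direct linear transform (DLT). The numpy sketch below is illustrative only: the point coordinates and the `estimate_homography` helper are hypothetical, and the patent does not prescribe a particular estimation algorithm.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst (Nx2 arrays, N >= 4)
    via the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H (up to scale) is the null vector of A: the last right singular vector.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def apply_homography(H, pts):
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Ten hypothetical camera-image coordinates of the projected white points...
camera_pts = np.array([[10, 12], [200, 15], [195, 180], [8, 175], [100, 90],
                       [50, 40], [150, 60], [60, 140], [140, 150], [100, 20]], float)
# ...and the corresponding projector coordinates (synthesized here from an
# assumed ground-truth homography, since the projected pattern is known).
true_H = np.array([[1.2, 0.1, 5.0], [0.05, 0.9, -3.0], [1e-4, 2e-4, 1.0]])
projector_pts = apply_homography(true_H, camera_pts)

H = estimate_homography(camera_pts, projector_pts)
# H now maps any detected camera-image feature (e.g., a checker corner)
# directly into projector coordinates.
```

With H in hand, detected checkerboard corners in the camera image can be transformed into projector-view coordinates without constructing a dual image, as the following bullets describe.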
- the coordinates of the checkerboard corners detected in the camera images can be directly transformed to compute the corresponding corner coordinates in the projector-view images.
- the projector parameters can then be calibrated using a camera calibration method, such as the one described above.
- an example of this approach is shown in FIG. 11.
- the feature capture results are as follows.
- the circles, or dots, not shown in outline (for example dots 81 ) were used for estimating homography, while the outlined circles, or dots, (for example dots 83 ) are the corner point features.
- the outlined dots 83 lie on the actual corners, indicating that the projector coordinates for each detected corner have been correctly captured.
- this approach of using patterns to extract dot information includes a series of pattern projection and image capture steps, one for each projected pattern (i.e., each vertical or horizontal line).
- any distortion of the projected pattern due to irregularities on the projection surface may be ignored for the moment.
- the spatial relation (or transformation) between the projector and camera can be determined since the spatial relation between the vertical and horizontal lines is known. That is, the spatial relation between the vertical and horizontal lines, as projected, is known since one knows which projector pixels were turned on during their generation. Furthermore, since one knows the true orientation of the vertical and horizontal lines, one can compensate for surface irregularities on the projection scene and for view angles.
- This approach borrows from the above-described, simplified method for generating the transport matrix, T, but reduces the number of needed image projection-and-capture steps from (p+q) to a fraction of (p+q) determined by the number of desired dots.
- as few as 4 dots may be used to calibrate a projector-camera system whose scene environment is a flat projection surface. However, to account for some irregularities in a projection surface, for possible errors in the image capture steps, and for errors in identifying projected light lines or patterns due to ambient light noise, it has been found that projecting seven horizontal light lines and seven vertical light lines, generating as many as fourteen to forty-nine dots, is sufficient for overcoming most errors.
- the above-described method for generating transport matrix T can also be applied toward solving the following problem: given a projector and camera pair, if one would like the camera “to see” a desired view on a given scene (i.e., projection environment), what should the projector illuminate onto the scene in order to produce that desired view? This task is termed “View Projection” hereinafter.
- a goal of the present invention is to devise a measurement process that is completely automatic and requires no user intervention or parameter tweaking beyond casual hardware setup.
- the projector-camera system that achieves View Projection capabilities should completely calibrate itself with a single touch of a button.
- a projection system that is capable of displaying correctly on complex surfaces under challenging settings has many real world applications. Making it fully automatic further allows it to be deployed in settings where professional display setup is not always available.
- Another goal of the present invention is to reduce the requirement on exotic imaging equipment, so that high quality calibration can be achieved with off-the-shelf components within an average consumer's reach. This goal may be achieved by combining some of the features and benefits of the previous embodiments in an automated and simplified process.
- real projector 21 has a p-row-by-q-column projector pixel array 27 while the real camera 25 has an m-row-by-n-column camera sensor pixel array 29 .
- Light rays emitting from real projector 21 bounce off a scene (i.e., projection surface) and some of them eventually reach real camera sensor pixel array 29 .
- each ray of light is dispersed, reflected, and refracted in the scene and hits camera sensor pixel array 29 at a number of different locations.
- the light ray emitted from projector pixel j reaches real camera 25 and forms an m-by-n image across the entirety of camera sensor pixel array 29 , where each camera pixel i receives a certain amount of light.
- the image projected is represented as a (p×q)-element real image vector Rprjct
- the image captured is represented as an (m×n)-element captured image vector Rcptr
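The two vectors above are related by the light transport equation Rcptr = T·Rprjct, where column j of T is the flattened camera image produced by lighting projector pixel j alone. A minimal numpy sketch, with toy dimensions chosen arbitrarily for illustration:

```python
import numpy as np

p, q = 2, 3   # toy projector resolution (p*q = 6 pixels)
m, n = 2, 2   # toy camera resolution (m*n = 4 pixels)

rng = np.random.default_rng(0)
# Column j of T is the flattened camera image captured when only projector
# pixel j is lit; random values stand in for measured light transport.
T = rng.random((m * n, p * q))

Rprjct = rng.random(p * q)   # projected image, flattened into a vector
Rcptr = T @ Rprjct           # captured image: Rcptr = T * Rprjct
```

By linearity, the captured image is the sum of the per-pixel transport columns weighted by the projected pixel intensities, which is exactly what the matrix-vector product computes.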
- This enables “Dual Photography”, where one can synthesize images that appear as though they were captured by real projector 21 , with the scene appearing as if illuminated by light emanating from real camera 25 .
- View Projection, the object of the present embodiment, addresses a different problem.
- in View Projection, one is interested in finding a projector image that, when used to illuminate a scene, allows the camera to capture a predefined, desired image.
- in light transport terms, one is provided with T and with a desired Rcptr, and one wants to recover, i.e., generate, Rprjct.
- the inverse of light transport matrix T has been used in the past for other purposes.
- the inverse of the light transport matrix T is used to analyze the way light bounces in arbitrary scenes.
- a scene is decomposed into a sum of n-bounce images, where each image records the contribution of light that bounces n times before reaching a camera.
- each n-bounce image is computed to infer how light propagates through the scene.
- the inverse, T⁻¹, of transport matrix T is, however, more difficult to compute than its transpose, Tᵀ. Indeed, the sheer size of T makes computing T⁻¹ an extremely challenging task requiring tremendous computational resources. Worse, it is not always possible to find the inverse of an arbitrary matrix; some matrices have no inverse.
- the multiplicative inverse of a matrix is typically defined in terms of the identity matrix I.
- note that, in general, T⁻¹ ≠ Tᵀ.
- one constraint used above in the generation of an estimation of Tᵀ is likewise useful in the construction of T⁻¹.
- the Display Constraint may be violated if there is significant light scatter in the scene, such as the example given above where the scene consists of a glass of milk, and the light rays are diffused by the milk resulting in significant overlap.
- another example of a scene that violates the Display Constraint is shown in FIG. 12, which includes two wine glasses between a projector and a projection scene, resulting in significant light scattering.
- a projected image consisting of lettering is projected onto the scene and shown to experience much optical distortion due to the wine glasses.
- each column of the transport matrix T is the projection image resulting from one pixel from the projector.
- all of the column entries have zero values except those corresponding to the camera pixels hit by light emitting from the corresponding projector pixel.
- an example of this is shown in FIG. 14, where four columns C1 to C4 of a light transport matrix T are shown adjacent to each other.
- circles Pixel_ 1 c and Pixel_ 1 d identify the brightest parts (i.e., the center parts) of a first pixel's light footprint that is recorded as an image within column C 1 .
- the full light footprint includes other less bright pixel groups Pixel_ 1 b and Pixel_ 1 e .
- only the brightest parts of each light footprint within each column C1-C4 are identified by circles, since the circles identify those pixel coefficient entries that will be retained in construction of a modified (i.e., estimated) light transport matrix T.
- as shown above, the brightest part of any one light footprint does not overlap with the brightest part of any other light footprint.
- perimeter sections of a light footprint (i.e., Pixel_1b and Pixel_1e)
- in FIG. 14, if one compares adjacent circled groups of pixels along adjacent columns, one will note that none of the circled pixels in adjacent columns overlap.
- partial light footprint Pixel_ 1 c in column C 1 is offset in the horizontal direction from partial light footprint Pixel_ 2 c in column C 2 , which in turn is offset from partial light footprint Pixel_ 3 c in column C 3 , which is likewise offset from partial light footprint Pixel_ 4 c in column C 4 , and so on.
- none of the other bright partial light footprint circled sections (i.e., Pixel_1d, Pixel_2b, Pixel_3b, and Pixel_4b) within each column line up with each other in the horizontal direction.
- the orthogonal nature of T may be assured by the way in which T is constructed in the above-described process for generating a modified T.
- the amount of memory necessary for storing T is likewise reduced by storing only nonzero values within each column.
- this process is simplified because the nonzero values to be stored can be quickly identified by storing only the brightest value within each row of T, which automatically creates resized light footprints.
- for example, the identified bright sections may then be combined into a vector representation 40 of modified matrix T (hereinafter, light transport matrix T and modified light transport matrix T are used interchangeably, unless otherwise stated). As shown, vector representation 40 removes any overlap between adjacent columns of T by limiting itself to only the brightest pixel values within each row of T, and effectively imposes the Display Constraint on an arbitrary scene.
- T is orthogonal, by design. It is to be understood that to fully define modified light transport matrix T, one only needs a second matrix column, or second array, to hold index information indicating which groups of partial light footprint sections belong to the same resized light footprint. In the present case, for example, groups Pixel_ 1 c and Pixel_ 1 d are part of a single resized light footprint corresponding to a first pixel. Similarly, light footprint sections Pixel_ 2 b and Pixel_ 2 c together form a second resized light footprint for a second pixel, and so on.
- AA⁻¹ = I
- the identity matrix I is a matrix having entry values set to “one” (i.e., 1) along the diagonal from its top left corner (starting at matrix location (1,1)) to its bottom right corner, and entry values set to “zero” (i.e., 0) everywhere else.
- Ťᵀ is the inverse of the part of T that covers the overlapping field-of-views of the projector and the camera; it recovers only the projector pixels in Rprjct that fall in the overlapping area and blacks out the other pixels.
- Ťᵀ may be reduced to a representation consisting of a first column of nonzero entries and a second column of corresponding index values. Recall that the Display Constraint dictates that the nonzero entries in distinct columns of T do not line up on the same row.
- the maximum valued entry in each row (i.e., the row-maximum value) may be identified and designated as the one non-zero entry for that row that corresponds to a light footprint. This eliminates any low valued entries due to light noise. By identifying these row-maximum entry values, one obtains the non-zero values of each column. Therefore, for each projector pixel, one identifies the corresponding set of camera pixels, along with the distribution of light intensity among these camera pixels. Typically, this set of corresponding camera pixels (associated to any given projector pixel) is a very small subset of the entire camera image. Since only the entries from these corresponding camera pixels need to be considered during matrix operations, performing view projection image transformation using this sparse matrix representation is very efficient.
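The row-maximum selection and column normalization described above can be sketched as follows. This is a simplified dense-array illustration with an assumed toy matrix; the patent's representation stores only the nonzero entries plus their index values rather than a full array.

```python
import numpy as np

def view_projection_matrix(T):
    """Approximate T's inverse by (1) keeping only the maximum entry in each
    row of T, which zeroes overlap between light footprints (a pseudo Display
    Constraint), and (2) dividing each surviving column by its squared norm
    before transposing, so the product with T is close to the identity."""
    T_clean = np.zeros_like(T)
    rows = np.arange(T.shape[0])
    brightest = T.argmax(axis=1)           # brightest column in each row
    T_clean[rows, brightest] = T[rows, brightest]
    norms = (T_clean ** 2).sum(axis=0)     # squared norm of each column
    norms[norms == 0] = 1.0                # projector pixels the camera never sees
    return (T_clean / norms).T             # shape: (p*q) x (m*n)

# Toy transport matrix: rows are camera pixels, columns are projector pixels.
# The small 0.05 entry models stray scattered light, which the row-maximum
# selection removes.
T = np.array([[0.9, 0.0, 0.0],
              [0.4, 0.0, 0.0],
              [0.0, 0.8, 0.05],
              [0.0, 0.0, 0.7]])
Vt = view_projection_matrix(T)
print(np.round(Vt @ T, 3))  # close to the 3x3 identity
```

Because each retained footprint occupies rows no other footprint uses, the product Vt @ T has ones on its diagonal, and off-diagonal terms appear only where clipped scatter remained in T.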
- each pair of projected images has only one intersection point in common.
- One construction that satisfies this property uses pixel coordinates as indicators: let Xj be an image created by turning on only those pixels whose x-coordinate is equal to j, and Yk be an image created by turning on only those pixels whose y-coordinate is equal to k. Then MIN(Xj, Yk) gives an image with only the pixel at coordinates (j, k) turned ON. Using this method, capturing p+q images allows one to synthesize all p×q columns of T.
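A short numpy sketch of this coordinate-indicator construction, with an arbitrarily chosen toy projector resolution:

```python
import numpy as np

p, q = 4, 5  # assumed projector resolution (p rows, q columns)

def X_image(j):
    """Image in which only pixels with x-coordinate (column) j are ON."""
    img = np.zeros((p, q))
    img[:, j] = 1.0
    return img

def Y_image(k):
    """Image in which only pixels with y-coordinate (row) k are ON."""
    img = np.zeros((p, q))
    img[k, :] = 1.0
    return img

# MIN(Xj, Yk) turns ON exactly the single pixel (j, k), so capturing only
# the p + q stripe images suffices to synthesize all p*q columns of T.
j, k = 2, 3
single_pixel = np.minimum(X_image(j), Y_image(k))
```

The pixel-wise minimum is nonzero only where the vertical and horizontal stripes intersect, which is why two orthogonal sweeps recover every per-pixel column of T.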
- a method for establishing projector-camera pixel correspondence is to project the X and Y coordinates in time sequential binary codes, in theory allowing all correspondence to be captured in log(N) time, where N is the maximum coordinate value.
- this method places stringent requirements on the camera sensor resolution, especially when the least significant bits are being projected.
- One may also propose to project multiple stripes simultaneously to cut down the number of images needed; this, however, requires some way to identify each of the stripes captured in an image.
- the view projection matrix Ťᵀ is capable of compensating for geometric and photometric distortions introduced by projection optics, display surfaces, and their relative positioning.
- the efficacy of the view projection capabilities of Ťᵀ was tested by means of a number of experiments.
- an example is shown in FIG. 16.
- the poster image on the left is shown under white light.
- the poster image on the right is shown under view projection illumination.
- the faces of the cubes are made to appear to have the same color.
- a sheet of white paper placed over the poster image on the right reveals the actual projected image used to produce this desired view. It was found that viewers quickly lose track of the original appearance of the poster after seeing the animation. While it is difficult to show in a paper, this scene as observed by the camera is quite close to that observed by the human eye.
- the poster was animated by cycling the colors in the poster and making the colors appear uniform. Movies were also shown over the poster while photometric compensation took place in real time to eliminate all traces of the underlying poster image.
- transport matrix T depends on the Display Constraint, which stipulates minimal overlap between adjacent light footprints resulting from adjacently lit projector pixels.
- An example of a situation where the Display Constraint is not upheld is discussed above in reference to FIG. 12 , where two wine glasses are placed between a projector and a projection surface, or scene.
- the view projection matrix Ťᵀ can enforce (i.e., impose) the Display Constraint on an arbitrary scene by selecting only the brightest pixel within each row of T in the creation of an approximation of the inverse of T.
- the method of generating view projection matrix Ťᵀ, which forces a pseudo Display Constraint on a projection environment, can likewise be applied to the generation of the light transport matrix T.
- in the simplified light transport matrix T described above in reference to FIGS. 13-15, one may select the brightest pixel in each row to identify the components of a light footprint, whereby light diffusion error and light noise error can be greatly reduced or eliminated.
- a conventional front projector P 1 (similar to real projector 21 of FIG. 2A ) is used in conjunction with an immersive projector P 2 .
- the portion of a display surface covered by the field-of-view (FOV) 91 of front projector P1 is a subset of the FOV of immersive projector P2, as indicated by field-of-view lines 93.
- since FOV 91 of front projector P1 is entirely within the scope of FOV 93 of immersive projector P2, it would be desirable to have immersive projector P2 simulate the projected image produced by front projector P1.
- FOV 91 of front projector P 1 does not necessarily need to overlap any part of FOV 93 of immersive projector P 2 .
- it is desirable that two light transport matrices, separately associating a camera C with front projector P1 and with immersive projector P2, be created. As would be understood, the two transport matrices may be generated separately since the FOVs of P1 and P2 do not necessarily overlap.
- camera C is placed such that the FOV 95 of camera C is a superset of FOV 91 of front projector P 1 and a subset of FOV 93 of immersive projector P 2 .
- as indicated by FOV lines 95, the field-of-view of camera C completely encompasses FOV 91 of front projector P1, but is entirely encompassed by FOV 93 of immersive projector P2.
- a first light transport matrix T1, relating camera C to front projector P1, and a second light transport matrix T2, relating camera C to immersive projector P2, are determined.
- the view projection matrices naturally convey the projector-camera correspondence into projector-projector correspondence.
- such a projection is shown in FIG. 19, where immersive projector P2 is used to simulate a front projector, such as projector P1 of FIG. 18.
- a virtual projector P1″, as simulated by immersive projector P2, is denoted by dotted lines. Therefore, image p1, as projected by front projector P1 of FIG. 18, can be recreated by projecting the transformed image (T2⁻¹)(T1 p1) on immersive projector P2 of FIG. 19.
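The projector-to-projector relationship can be sketched numerically. The toy transport matrices below are assumptions: scaled permutation matrices whose columns are orthogonal, so the Display Constraint holds and T2's inverse can be approximated by its column-normalized transpose, as described elsewhere in this document.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix = 6  # toy setting: camera and both projectors have 6 pixels each

# T1 relates camera C to front projector P1; T2 relates camera C to
# immersive projector P2. Each is a scaled permutation, i.e. each projector
# pixel lights exactly one camera pixel (Display Constraint satisfied).
T1 = np.eye(n_pix)[:, rng.permutation(n_pix)] * rng.uniform(0.5, 1.0, n_pix)
T2 = np.eye(n_pix)[:, rng.permutation(n_pix)] * rng.uniform(0.5, 1.0, n_pix)

# View projection matrix for P2: columns normalized by squared norm, transposed.
T2_inv = (T2 / (T2 ** 2).sum(axis=0)).T

p1 = rng.random(n_pix)   # image sent to front projector P1
c = T1 @ p1              # view of P1's projection as captured by camera C
p2 = T2_inv @ c          # image immersive projector P2 must project

# Projecting p2 on P2 reproduces, at the camera, the view that P1 produced.
```

The camera acts as the common reference: T1 carries p1 into camera space, and the approximate inverse of T2 carries that camera view back into P2's projector space.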
- viewers 100 a , 100 b , and 100 c do not have to concern themselves with occluding any front projector, i.e. P 1 or P 1 ′′.
- T2⁻¹ can be replaced by the approximation matrix Ťᵀ2, as explained above.
- the view projection matrix Ťᵀ, which approximates the inverse matrix T⁻¹, can be freely substituted for T⁻¹ in the following discussions, unless otherwise stated.
- an example of an image generated using this virtual projector implementation is shown in FIG. 20.
- a front projected image 101 is simulated using a large field-of-view display system.
- Projector 103 located along the bottom of FIG. 20 is part of the large field-of-view display system, and is used to generate image 101 shown in the center of FIG. 20 .
- FIGS. 21A to 21C illustrate the quality of the simulation by showing a real front-projected image ( FIG. 21A ) and a simulated front-projected image ( FIG. 21B ) seamlessly coupled together side-by-side ( FIG. 21C ).
- FIG. 21A shows the right side of a front-projected image projected by a real front projector, such as P 1 of FIG. 18 .
- FIG. 21B shows the corresponding left side of the front-projected image of FIG. 21A , but in FIG. 21B the left side of the front-projected image is projected by an immersive projector, such as P 2 of FIG. 19 , to simulate a virtual front projector, such as P 1 ′′ of FIG. 19 .
- the quality of the simulated left-side front-projected image created by the immersive projector is better illustrated in FIG. 21C, where the right-side front-projected image of FIG. 21A is shown joined to the left-side front-projected image of FIG. 21B, side by side, resulting in a seamless registration of the right-side and left-side images created by a real front projector and a simulated, virtual front projector, respectively.
- two additional examples showing side-by-side comparisons of real front-projected images created by a real front projector and simulated front-projected images created by an immersive projector are shown in FIGS. 22A and 22B.
- the left half of the shown image is created by an immersive projector to simulate a display from a virtual front projector
- the right side half of the shown image is created by a real front projector.
- immersive projector P 2 of FIG. 23C will be used to create various ambient lighting effects (i.e., virtual environments). If the camera is positioned such that its FOV covers a significant portion of the display room, one can use view projection to create an immersive environment where the walls are lit according to a virtual model. To achieve this, camera C is therefore positioned such that its FOV covers a significant portion of a display room 111 , as shown in FIG. 23A . In FIG. 23A , camera C and immersive projector P 2 are positioned such that the FOV of camera C encompasses most, if not all of (and preferably more than) the FOV of immersive projector P 2 .
- P2 is shown as an immersive projector, but projector P2 may be any type of projector, such as a front projector.
- a light transport matrix T 3 relating camera C to projector P 2 is captured, i.e. determined, using any of the methods described above.
- This constructed virtual model room (i.e., virtual room) 111″, shown in FIG. 23B, may be a computer simulation, for example.
- Once virtual room 111″ is created, various simulated lighting effects (or projected images or floating images) may be added to virtual room 111″.
- FIG. 23B shows virtual room 111 ′′ being lit by candle light from a large candle 113 .
- the computer model further models the position and resolution of camera C (of FIG. 23A ), shown as dotted box C in FIG. 23B .
- the computer model then “captures” (i.e., creates) a synthetic view c3″ of virtual room 111″ from the viewpoint of camera C to simulate a real image of virtual room 111″ as if it had been captured by real camera C of FIG. 23A.
- the simulated lighting effects of FIG. 23B can then be recreated in real room 111 of FIG. 23C using P2 by projecting the transformed image (T3⁻¹)(c3″).
- an example of an application of this technique is shown in FIG. 24.
- it is desired to project an image 117 that is bigger than the walls of a real room 111 .
- various techniques may be used to calibrate a real projector to compensate for the angles of the walls and ceiling to the projection wall of real room 111 , but the present invention solves this problem using a different approach.
- virtual room 111 ′′ (of FIG. 23B ) has dimensions similar to real room 111 (of FIGS. 23C and 24 ), and image 117 is superimposed in an undistorted fashion onto virtual room 111 ′′.
- a view c 3 ′′ (i.e., a synthetic captured image) of image 117 without distortion on virtual room 111 ′′ from the viewpoint of camera C is then created.
- Immersive projector P2 is then made to project the transformed image (T3⁻¹)(c3″) to recreate the undistorted, oversized image 117 on a wall of real room 111.
- the result is an undistorted projection that did not require calibrating projector P 2 to compensate for curvatures (or other irregularities) on a projection surface.
- virtual projectors and environments can be combined to create an immersive movie viewer. Since the virtual environment is also an active visual field, one can animate the larger field of view display to create a more engaging experience.
- a large FOV creates a sense of immersion and provides a more engaging experience for a viewer.
- the present approach describes an immersive projection system with a very large FOV.
- the system is also designed with a built-in large FOV camera/light sensor that is able to capture light from the areas covered by projection's FOV.
- the sensor allows the system to adapt the projected light so as to optimize image quality and more generally allow the system to interact with its environment.
- while the present system is primarily motivated by the desire to display surround video content, it is important to note that this new projection system can also be used to view conventional video content.
- an exemplary projection system in accord with the present invention in its minimal form consists of the following components: a projector 121 ; a camera 123 , which can be a digital still camera or a digital video camera; curved mirror 125 , which can be spherical or otherwise; and mounting mechanisms for the above components.
- Light from projector 121 is reflected off curved mirror 125 before reaching a display surface 127 , which can be any surface, including building walls, floors, ceilings, and dedicated projection screens. Display surface 127 can also be arbitrarily shaped. Reflecting the projected light off the curved mirror enlarges the projector FOV. Light rays from the environment, which may or may not have originated from the projector, also reflect off the curved mirror 125 before reaching the camera. This similarly enlarges the camera FOV.
- FIG. 26 shows a prototype based on the design of FIG. 25 , and all elements in FIG. 26 similar to those of FIG. 25 have similar reference characters and are described above.
- the present construction highlights one of the key applications of smart projector-camera systems, which is to build immersive multi-wall virtual environments.
- the present example uses a simple panoramic projection setup consisting of a conventional front projector 121 , a high-resolution digital still camera 123 , and a hemispherical curved mirror 125 .
- curved mirror 125 (which may be termed a “dome projector”) is a low-cost hemispherical plastic security mirror dome of the type used in community convenience stores.
- mirror dome 125 is quite far from a true hemispherical surface (or any simple parametric form, for that matter).
- FIG. 27 is an alternate view of the setup of FIG. 26 , and shows the view of mirror 125 as seen (very roughly) from the viewpoint of camera 123 .
- camera 123 is able to “see” the floor, at least three vertical walls, and the ceiling by means of the reflection in mirror 125 .
- in FIG. 28A, a room with the present projection system installed is shown under ambient lighting.
- FIG. 29A shows an uncalibrated dome projector displaying an image of a checkerboard.
- FIG. 29B shows the same setup, but with geometric compensation using the view projection matrix. As the image was shot from the location of the reference camera, straight lines in the view remain straight across multiple walls.
- FIG. 30A shows an uncalibrated dome projector displaying a uniform intensity image. As can be seen, the resulting image is significantly darker towards the top left and right corners of the front wall.
- in FIG. 30B, the same uniform intensity image is projected by a calibrated dome projector, which is shown to produce a more uniform intensity.
- the dome projector setup of FIGS. 25-30 can be used in place of immersive projector P2 of FIGS. 18 and 19 to achieve simulation of a front projector, as described above, or in place of immersive projector P2 in FIGS. 23A-23C to create virtual environments, as described above.
- an example of immersive projection lighting created using the present projector-camera system is shown in FIG. 28B.
- the present projection-camera system is able to project images onto the two walls as well as the ceiling.
- An example of how this effect is achieved using the view projection matrix Ťᵀ is illustrated in FIGS. 31-33.
- the view projection matrix Ťᵀ is first generated using any of the methods described above. As explained above, when a projection surface consists primarily of a flat surface (or conjoined flat surfaces), forty-nine (49) or fewer reference points, generated using seven vertical light lines and seven intersecting horizontal light lines, may be used to approximate a full view projection matrix Ťᵀ and still achieve a good level of projector-to-camera calibration. In this case, the missing matrix entry values may be extrapolated from the 49 reference points since the projection surface is assumed to be flat.
- since in the present case the projection surface consists of a curved mirror, however, it is preferred that a full view projection matrix Ťᵀ at the resolution of the projector be generated. Since projector 121 in the present example has a resolution of p×q projector pixels, calibration between projector 121 and camera 123 should be achieved by generating p×q light transport reference points.
- projector 121 of FIG. 26 individually projects a series of q vertical lines VL_ 1 to VL_q onto the projection surface (i.e. mirror 125 of FIG. 26 in the present case), which are individually, and automatically, captured by camera 123 of FIG. 26 .
- projector 121 then individually projects p horizontal lines HL_ 1 to HL_p that are in turn individually, and automatically, captured by camera 123 .
- the captured vertical and horizontal lines are each individually combined to identify their uniquely coincident reference point (i.e., light footprint). This process is continued until all unique intersecting points are identified (shown as white circles in FIG. 33 ), and their light transport information extracted. It is to be understood that although the vertical and horizontal lines emitted from projector 121 are perfectly vertical and horizontal, the resultant projected lines on dome mirror 125 will follow the curvature of dome mirror 125 .
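The intersection step described above — combining one captured vertical-line image with one captured horizontal-line image to isolate their uniquely coincident light footprint — can be sketched with an elementwise minimum, since only pixels lit in both captures survive. The function name and threshold are illustrative assumptions:

```python
import numpy as np

def footprint_from_lines(cap_vertical, cap_horizontal, threshold=128):
    """Isolate the light footprint common to one captured vertical-line
    image and one captured horizontal-line image. Only pixels bright in
    BOTH captures (the intersection) survive the elementwise minimum."""
    common = np.minimum(cap_vertical, cap_horizontal)
    ys, xs = np.nonzero(common > threshold)
    return common, list(zip(ys.tolist(), xs.tolist()))
```

Repeating this over all q vertical and p horizontal captures yields the p×q reference points, with the surviving intensities providing the light transport values for the corresponding column of T.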
- the Display Constraint be enforced in the construction of view projection matrix Ť^T.
- Desired Image C is written as a vector 200 consisting of m×n image pixel entry values, C1 to Cm×n.
- Vector 200 is multiplied with the created view projection matrix Ť^T, which consists of (p×q) rows and (m×n) columns, to produce a (p×q) transformed image P, written as vector 201 and consisting of (p×q) image pixel entry values to be respectively applied to their corresponding one of the (p×q) projector pixels of projector 121 .
- the resultant transformed image P is shown to consist of p rows and q columns.
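The multiplication described above can be sketched as a single matrix-vector product on flattened images, followed by a reshape to the projector's p×q layout. The function name and array layout are assumptions for illustration:

```python
import numpy as np

def transform_image(view_proj, desired, p, q):
    """Multiply the (p*q) x (m*n) view projection matrix by the desired
    camera image (flattened to an (m*n)-vector) to obtain the (p*q)
    projector image vector, then reshape it to p rows and q columns."""
    c = desired.reshape(-1)      # vector 200: m*n entries
    proj = view_proj @ c         # vector 201: p*q entries
    return proj.reshape(p, q)    # projector LCD image P
```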
- transformed image P from FIG. 34 is sent to projector 121 as Projector LCD Image P, and is projected onto mirror 125 ( FIG. 26 ).
- the resultant image 203 in room 111 is an undistorted representation of Desired Image C of FIG. 35 .
- camera 123 and projector 121 were not calibrated prior to creating transformed image P. Rather, because transformed image P was constructed using view projection matrix Ť^T, which includes calibration-compensating information for the camera-projector pair, its distortion inherently compensates for the lack of calibration between camera 123 and projector 121 .
- the view projection matrix Ť^T would be applied to the video image. That is, since a video image is comprised of a plurality of still images arranged in sequence, one would apply the view projection matrix Ť^T transformation to each of the sequenced still images to produce a transformed video projection.
- the FOV of projector 121 and the FOV of camera 123 are in general different, and may or may not overlap.
- images captured by camera 123 can be used as feedback for improving the quality of a projected image from projector 121 in a manner similar to those described above.
- feedback from camera 123 to projector 121 can be used to compensate for variations in a display surface's reflectance properties and shape (as seen by camera 123 ) so that a projected image appears as though it were projected on a flat white surface.
- the FOV of camera 123 may also include areas not covered by the FOV of projector 121 .
- the FOV of projector 121 covers the front and side walls of test room 127 shown in FIGS. 28A and 28B
- the camera may capture areas outside the projector's FOV, possibly including areas where viewers are located. This allows the system to adapt and interact with viewers either by detecting and tracking the viewers or the viewers' pointing devices. It may be possible for camera 123 to track small lights mounted, for example, on remote controls and facilitate user interaction.
- FIG. 36 an alternate configuration based on the construct of FIG. 25 but geared toward ceiling-mounted operation is shown. All elements similar to those of FIG. 25 have similar reference characters and are described above.
- FIGS. 37 and 38 two additional alternate configurations are shown. All elements in FIGS. 37 and 38 similar to those of FIG. 25 have similar reference characters and are described above.
- FIG. 37 a planar mirror 141 is used to fold the optical path so that projector 121 and camera 123 can be placed under the curved mirror 125 , thereby achieving a smaller footprint.
- FIG. 38 shows a booth design for enclosing projector 121 , camera 123 , curved mirror 125 , and flat mirror 141 within a booth 143 for display booth operation. Using this construct, one can simultaneously produce two projection images: a first front (or rear) projection image on a first Display Surface A and a second rear projection image on a second Display Surface B.
- the projector and cameras do not have common optical centers.
- the Display Constraint comes from the observation that in a typical projector-camera setup for information display purposes, any two distinct light rays emitting from distinct projector pixels will typically hit the camera sensor pixel array at distinct parts, i.e., there is usually little overlap in the camera pixels hit by light from each of the distinct light rays. This implies that the columns of T are orthogonal to each other, which enables the normalization process in equation (1) to lead to the inverse of T.
- in situations where the Display Constraint is not observed naturally, one can modify the construct of T to artificially impose the Display Constraint by declaring the brightest pixel in each row of T to be part of a light footprint resulting from a distinct light ray, and declaring all other pixels in the same row of T to be zero-valued entries. This operation forces T to become orthogonal, and thereby permits the application of equation (1).
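A minimal sketch of this artificial imposition of the Display Constraint, assuming T is held as a dense array with camera pixels as rows and projector pixels as columns (the function name is an assumption):

```python
import numpy as np

def impose_display_constraint(T):
    """Force the Display Constraint on light transport matrix T: in each
    row, keep only the brightest entry (the dominant projector pixel for
    that camera pixel) and zero the rest. No row then has two nonzero
    entries, so the columns of the result are mutually orthogonal."""
    T2 = np.zeros_like(T)
    rows = np.arange(T.shape[0])
    cols = T.argmax(axis=1)
    T2[rows, cols] = T[rows, cols]
    return T2
```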
- FIG. 1 An example of mosaicing using the present invention is presented using a two-projector display system. It is to be understood that the present approach to constructing a mosaic display may be extended to display systems having three or more projectors, since extension to systems consisting of more projectors is straightforward. That is, the process described below for combining a first projection image from a first projector with a second projection image from a second projector can be applied to combining the second projection image of the second projector with a third projection image of a third projector to create a mosaic image combining the first, second, and third projection images. Similarly, the same process can be applied to combine the third projection image with a fourth projection image from a fourth projector to create a mosaic image that combines the first, second, third, and fourth projection images.
- c is the desired image as observed by the camera.
- a camera pixel is lit by either one projector pixel from one of the projectors or two projector pixels simultaneously from respective projectors.
- the camera pixel gives a linear equation on the corresponding projector pixel in p 1 or p 2 .
- the camera pixel falls in an overlapping part of the FOVs of the two projectors, it gives a linear equation on the two corresponding projector pixels in p 1 and p 2 , respectively. Since each projector pixel covers a number of camera pixels, it is constrained by a number of linear equations. Thus such equations from all the camera pixels form an overconstrained linear system on the projector images p 1 and p 2 .
- (Ť1)^T and (Ť2)^T are the respective view projection matrices for the two projectors, and p 2 is set equal to zero at the initial iteration step. Since the view projection matrices naturally convey the correspondence and mapping between the images of the projectors and the camera, it does not take many iterations for p 1 and p 2 in formulas 1 and 2 (immediately above) to converge to their respective complementary images. In practice, the iteration process takes only a few iteration cycles (typically five or fewer) for p 1 and p 2 to converge to respective complementing images that, when combined, form a mosaic image. That is, p 1 will converge to a first image and p 2 will converge to a second image, and when the images of p 1 and p 2 are projected and superimposed, they will form a combined mosaic image.
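The alternating update described above can be sketched as follows, assuming the light transport matrices T1 and T2 and their view projection matrices V1 = (Ť1)^T and V2 = (Ť2)^T are available as dense arrays; the function name and the fixed iteration count are illustrative:

```python
import numpy as np

def mosaic_solve(V1, T1, V2, T2, c, iters=5):
    """Alternating solution for the two projector images: each step
    assigns one projector the residual of the desired camera image c
    not yet accounted for by the other projector. p2 starts at zero,
    and a handful of iterations typically suffices for convergence."""
    p2 = np.zeros(T2.shape[1])
    for _ in range(iters):
        p1 = V1 @ (c - T2 @ p2)   # formula (1)
        p2 = V2 @ (c - T1 @ p1)   # formula (2)
    return p1, p2
```

When the converged p1 and p2 are projected and superimposed, their combined camera-side contribution T1·p1 + T2·p2 approximates the desired image c.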
- an example of a system that implements this process includes a first projector-camera system pair 221 and a second projector-camera system pair 223 .
- First projection-camera system pair 221 includes a first projector 21 a and a first camera 25 a related by a first view projection matrix Ť1 constructed using any method described above.
- second projection-camera system pair 223 includes a second projector 21 b and a second camera 25 b related by a second view projection matrix Ť2 also constructed using any of the above-described methods.
- Ť1 and Ť2 are each generated independently such that second projector-camera system pair 223 is off while Ť1 is generated and first projector-camera system pair 221 is off while Ť2 is generated.
- First projector-camera system pair 221 has a first field-of-view FOV_ 1 defining a first projection region Reg_ 1
- second projector-camera system pair 223 has a second field-of-view FOV_ 2 defining a second projection region Reg_ 2
- Reg_ 1 and Reg_ 2 overlap each other within an area identified by crosshatch marks.
- This overlap region is further labeled Reg_ 1 +2. It is to be understood that the size of overlap region Reg_ 1 +2 is made large for purposes of explanation and that a small overlap region is more typical, although the amount of overlap is not critical to the present application.
- full photometric information is also obtained due to each captured pixel in a light footprint having a value-entry for any light intensity variation of the three color (RGB) sub-components of each camera pixel (see for example, the white and shades of gray blocks that make up light footprints Ft 1 in FIG. 3 a or Ft 2 in FIG. 4A ).
- if matrix T were to be constructed in a binary ON/OFF manner (that is, pixels within a light footprint are classified as being fully ON, i.e., as having a light intensity value of 255 in a typical luminosity scale of 0 to 255, and pixels outside the light footprint are classified as being fully OFF, i.e., as having a light intensity value of 0), then this binary ON/OFF manner of constructing matrix T would not have much photometric information (since it would in effect be a black-and-white image). However, it would still have full geometric information, so that the above two formulas (1) for p 1 and (2) for p 2 would still be able to determine the geometric mosaicking of multiple projection images. In such a case, however, an additional light blending step (described below) would be helpful to blend the light intensities of the multiple projection images when mosaicking them.
- Another situation where such an additional step for blending the light intensity of multiple projected images may be useful is where light transport matrix T is created from a limited number of identified light footprints. As is explained above, this would apply to situations where a projection surface (or scene) is flat, and the light transport matrix T is estimated using a limited number of intersecting patterns to generate a limited number of identified light footprints. Since the projection surface is flat, the missing geometric information within light transport matrix T can be inferred using homography techniques. The estimated light transport matrix T thus provides full geometric information, but it does not generally provide photometric information for locations between the limited number of identified light footprints.
- if the projection surface is known to be white (such as a white projection canvas or a white wall), a light blending step may be skipped by simply inserting white-color information as the photometric information for all identified and inferred light footprints.
- if one assumes that the projection surface is of a uniform color, but not necessarily white, then one can define an estimated photometric reading by assigning it the photometric information value of one (or of an average of two or more) identified light footprint(s). One can then populate the photometric information of all inferred light footprints with the thus-defined estimated photometric reading.
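A small sketch of this population step, assuming per-footprint RGB photometric readings are held as array rows; the function name and data layout are illustrative:

```python
import numpy as np

def fill_inferred_photometric(identified_values, n_inferred):
    """Under the uniform surface-color assumption, assign every inferred
    (geometry-only) footprint the average photometric reading of the
    identified footprints. identified_values is an array with one RGB
    row per identified footprint."""
    est = np.mean(identified_values, axis=0)   # estimated photometric reading
    return np.tile(est, (n_inferred, 1))       # one copy per inferred footprint
```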
- Reg_ 1 is the projection region provided by projector-camera system pair 221
- Reg_ 2 is the projection region provided by projector-camera system pair 223 .
- Reg_ 1 is denoted by vertical hatch lines
- Reg_ 2 is denoted by horizontal hatch lines.
- Overlap region Reg_ 1 +2 is therefore denoted by the intersection of vertical and horizontal hatch lines.
- a desired projection region 225 denoted by a darkened outline spanning across parts of regions Reg_ 1 , Reg_ 2 , and Reg_ 1 +2.
- Desired projection region 225 defines the region upon which a desired combined (i.e., mosaic) image is to be displayed. Desired projection region 225 has image contributions from both projector-camera systems 221 and 223 .
- Reg_A identifies that part of image region 225 provided solely by projector-camera system 221
- Reg_C identifies that part of image region 225 provided solely by projector-camera system 223
- Reg_B identifies that part of image region 225 provided by a combination of projector-camera systems 221 and 223 .
- Reg_B might also be provided solely by either one of projector-camera systems 221 or 223 , but it has been found that visual artifacts at an image border (where an image provided by one projector-camera system ends and a second image projected by a second projector-camera system begins) can be mitigated, or eliminated, by blending the transition between projector-camera systems. Therefore, in the presently preferred embodiment, both projector-camera systems 221 and 223 contribute to the image created within Reg_B. The question at hand is, how much (i.e., what parts) of an image within Reg_B each of projector-camera systems 221 and 223 provides.
- FIG. 41 shows Reg_ 1 and that part of desired image 225 within the FOV of projector-camera system 221 , i.e. Reg_B.
- the vertical hatch lines indicate that part of Reg_ 1 that is made dark due to it not contributing to desired image 225 .
- Arrows 1 A, 1 B, 1 C, and 1 D indicate how normalized light intensity is varied as one moves away from a border of region Reg_ 1 toward Reg_B, and approaches a border of desired image 225 that is provided by projector-camera system 223 .
- Area Reg_A is outside the combined section Reg_B, and is provided solely by projector-camera system 221 .
- arrow 1 A As one traverses arrow 1 A from the left border of Reg_ 1 and approaches Reg_B, arrow 1 A is shown dark to indicate that all image components are provided by projector-camera system 221 . Following arrow 1 A, as one enters Reg_B, arrow 1 A is initially dark and is then lightened (i.e., shown as thinning stripes) to indicate that the image intensity is initially strongly provided by projector-camera system 221 , but the intensity drops off as one traverses from left to right along arrow 1 A toward the right border of Reg_B and Reg_ 1 .
- Arrow 1 D indicates that initially at the right end of Reg_B, no intensity is provided by projector-camera system 221 , but the light intensity from projector-camera system 221 is increased as one traverses from right to left within Reg_B toward the end-point of arrow 1 A.
- arrow 1 B indicates that the light intensity falls as one traverses down arrow 1 B away from a border of region 225 .
- arrow 1 C indicates that light intensity falls as one traverses up arrow 1 C away from a border of region 225 .
- light intensity variations in an image can be expressed as a factor of a defined maximum light intensity value, i.e. the normalized value.
- This normalized value multiplied by a factor of 1 would provide the defined maximum light intensity value, and the same normalized value multiplied by a factor of 0.5 would provide half the defined maximum light intensity value.
- FIG. 42 A similar construct is shown in FIG. 42 from the point of view of projection-camera system 223 .
- the horizontal hatch lines indicate that part of Reg_ 2 that is made dark due to it not contributing to desired image 225 .
- Reg_B indicates the blending area where an image is provided by a combination of both projector-camera systems 221 and 223
- region Reg_C indicates that part of desired image region 225 provided solely by projector-camera system 223 .
- Arrows 2 A, 2 B, 2 C, and 2 D are indicative of how the normalized light intensity is varied as one moves from Reg_ 2 toward the borders of desired region 225 , and within Reg_B toward that part of desired image 225 provided by projection-camera system 221 .
- Area Reg_C defines that part of desired image 225 that is outside combined region Reg_B, and is provided solely by projector-camera system 223 .
- arrow 2 A indicates that the image intensity is initially strongly provided by projector-camera system 223 , but the intensity drops off as one traverses from right to left along arrow 2 A toward the left border of region Reg_B.
- Arrow 2 D indicates that initially at the left end of region Reg_B, no intensity is provided by projector-camera system 223 , but the light intensity from projector-camera system 223 is increased as one traverses from left to right within Reg_B.
- arrow 2 B indicates that the light intensity falls as one traverses down arrow 2 B away from a border of desired region 225 within Reg_B.
- arrow 2 C indicates that light intensity falls as one traverses up arrow 2 C away from a border of region 225 .
- These parameters affect the normalized light intensity of the projected pixel. The closer a projected pixel from projector-camera system 221 is to any one border of Reg_ 1 , the higher the parameter contribution for that border, which makes for a brighter normalized intensity. The same is true for a projected pixel from projector-camera system 223 , with determination of the projected pixel's proximity to the four borders of Reg_ 2 and Reg_B. Additionally, the light intensity of projected pixels close to a border is adjusted as one approaches the border so as to avoid abrupt light intensity changes at the borders.
- region Reg_A As one travels along arrow A 1 in region Reg_A from left to right one will be moving from the left border of desired image region 225 , which is provided wholly by projector-camera system 221 , to the left border of region Reg_ 2 , and region Reg_B which is provided by both projector-camera systems 221 and 223 . As one moves along arrow A 1 from left to right, one is moving further away from the left border of Reg_ 1 , but the image is provided solely by projector-camera system 221 , and so the normalized light intensity of pixels produced by projector-camera system 221 is at its normal, highest value.
- projector-camera system 221 When one reaches the end of arrow A 1 and reaches the beginning of arrow A 2 , projector-camera system 221 is still at its highest normalized value since light blending is just about to begin. At the start of arrow A 2 , projector-camera system 221 has its highest normalized light intensity and projector-camera system 223 has its lowest normalized light intensity. As one moves along arrow A 2 from left to right (i.e., from Reg_A, provided exclusively by projector-camera system 221 , toward Reg_C, provided exclusively by projector-camera system 223 ), the normalized light intensity of projector-camera system 221 is lowered from its highest to its lowest normalized light intensity value.
- the normalized light intensity of projector-camera system 223 is raised from its lowest normalized light intensity value to its highest. It is to be understood that this light transition is not necessarily linear. For example, the greater changes in normalized light intensity of projector-camera system 221 preferably occur as one approaches the right border of Reg_B along arrow A 2 .
- projector-camera system 223 is at its highest normalized light intensity, and projector-camera system 221 is at its lowest normalized light intensity.
- projector-camera system 223 provides all the projected light and no light contribution is provided by projector-camera system 221 .
- projector-camera system 223 may be maintained at its highest normalized light intensity.
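The complementary intensity ramp traversed along arrow A 2 can be sketched as a pair of weights that always sum to one across the overlap region. A linear ramp is used here for simplicity; as noted above, the transition need not be linear, and the function name and parameterization are assumptions:

```python
import numpy as np

def blend_weights(x, x_left, x_right):
    """Normalized intensity weights for a horizontal traversal of the
    overlap region Reg_B spanning [x_left, x_right]: projector-camera
    system 221's weight ramps from 1 down to 0 left-to-right, while
    system 223's ramps from 0 up to 1, so the pair always sums to 1."""
    t = np.clip((x - x_left) / (x_right - x_left), 0.0, 1.0)
    w1 = 1.0 - t   # system 221: highest at the left border of Reg_B
    w2 = t         # system 223: highest at the right border of Reg_B
    return w1, w2
```

A smoother, non-linear falloff (e.g., a smoothstep curve on `t`) could be substituted to concentrate the intensity change near the borders, as the description prefers.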
- FIGS. 44A and 44B Construction of large FOV projector systems using the above method of combining multiple projectors is shown in FIGS. 44A and 44B .
- a single curved mirror 125 is used in combination with multiple projector-camera pairs 145 .
- a single mirror pyramid 151 is used with multiple projector-camera pairs 145 to achieve a large FOV.
- the optical centers of all projectors can be collocated within the mirror pyramid, creating a single virtual large FOV projector.
- the camera optical centers can also be collocated to create a single virtual large FOV camera.
- FIG. 45 shows that multiple large FOV projectors 153 a and 153 b (such as those shown in FIGS. 44A , 44 B, or other large FOV projector systems) can be used to achieve an even larger overall projection FOV.
- One or more conventional projectors 155 can also be used in combination.
- the FOV of projector 153 a as indicated by dotted line 157 overlaps the FOV of projector 153 b , as indicated by dotted line 159 , by an overlap amount 161 .
- the images from projectors 153 a and 153 b may be combined using the method described above for combining multiple projectors.
- the above described use of matrices in the transformation of images facilitates their implementation in image processing hardware and software.
- an image generation configuration in accord with the present invention may take many forms. In general, however, it typically includes an image source 400 , an image transformer 401 implementing any of the above described above light transport-related transformations, and an image reproduction mechanism 403 . This basic configuration may additionally include a feedback mechanism 405 to produce an interactive environment.
- the image source 400 may include an image reproduction apparatus 411 (such as a video player), image capture device 413 , video game console 415 , image reception tuner box 417 , and/or computer system 419 .
- image reproduction apparatus 411 which can reproduce images previously recorded on VHS tape, digital disk, or other analog/digital recording medium, such as a hard drive, nonvolatile silicon memory, etc.
- Image capture device 413 (i.e., still-image or video camera)
- Video game console 415 typically can play previously created images, but also generates new images in response to interaction with a game player.
- Image reception tuner box 417 typically outputs received image transmissions (by air, wire, fiber optic line, etc.) that originate from a distant image transmitter source, not shown.
- Computer system 419 may include all the capabilities of devices 411 - 417 , and can further generate its own new images for display on its screen and/or for optional output for display by image reproduction mechanism 403 .
- Image transformer 401 implements any or all of the above described light transport transformation techniques for compensating for irregularities within an image reproduction scene. That is, any of the above described methods for generating or using the light transport matrix T, the inverse light transport matrix T^-1 (or Ť^T), the View Projection matrix Ť^T, and/or any derivation therefrom are accomplished by image transformer 401 . It is to be understood that image transformer 401 is shown separate from image source 400 and image reproduction mechanism 403 purely for ease of explanation. As is explained above, image transformer 401 may be an integral part of image source 400 and/or image reproduction mechanism 403 .
- Image reproduction mechanism 403 may include a bank of projectors 421 consisting of one or more individual projectors 421 a through 421 z .
- Image reproduction mechanism may further include one or more image monitors 427 , and one or more image capture devices 425 a / 425 b .
- Image capture devices, as exemplarily embodied by still-image camera 425 a and video camera 425 b , are shown for completeness, but may be omitted after projector(s) 421 a - 421 z have been calibrated.
- the present implementation of the light transport matrix T requires the use of at least one projector-camera system, and thus cameras 425 a and 425 b are shown to be optionally coupled to any of projectors 421 a - 421 z .
- cameras 425 a and 425 b may be removed from the system until a new distortion is introduced into the projection scene requiring that projectors 421 a - 421 z be re-calibrated.
- cameras 425 a and/or 425 b may be a permanent part of image reproduction mechanism 403 , and used to introduce additional utility.
- an interaction system may be produced wherein cameras 425 a and/or 425 b monitor an audience for user movement and convey such movement as feedback information to image source 400 for modifying a provided image in a predefined manner.
- cameras 425 a / 425 b may follow a user's arm movement, and feed back this movement information to cause a specified projection object (such as a cursor, or other key object) to move in a manner following the tracked arm movement.
- specific user gestures or hand movements may convey image modification information to image source 400 .
- covering a particular part of a projection image may signal that the user wishes to resize the projection image, adjust color or contrast, modify brightness levels, etc.
- This modification information is captured by cameras 425 a / 425 b and sent to image source 400 for implementation.
- the captured information may be conveyed directly to projectors 421 a - 421 z , and the projectors themselves may implement the desired image modifications without requiring any assistance from, or interaction with, image source 400 .
- images produced by a projector, such as any of projectors 421 a - 421 z , may be simultaneously reproduced on image monitor 427 (i.e., computer monitor, television monitor, or any other video monitor) for personal viewing.
- an individual viewer may be playing a video game on image monitor 427 , while at the same time images shown on image monitor 427 are reproduced on one or more of projectors 421 a - 421 z .
- Image monitor 427 may additionally include a video camera 431 to track a user's movement. If a user is playing a video game, then interaction with video game console 415 may be conveyed by means of a joystick (not shown), or other input device.
- an optic sensor 433 may track joystick movement by means of frequency transmission, gyroscope information, and/or other known movement tracking mechanisms.
- the feedback information may be conveyed directly to projectors 421 a - 421 z if the feedback information conveys an image modification that may be implemented by projectors 421 a - 421 z , or alternatively is sent by means of feedback sensor 405 to image source 400 for interpretation by the appropriate device 411 - 419 that makes up image source 400 .
- feedback sensor 405 is shown purely for ease of explanation and it may constitute a signal transmission means coupled to image source 400 , or may be an integral part of image source 400 .
- a source image (either previously recorded or newly generated in real time by image source 400 ) is subjected to a matrix transformation by means of image transformer 401 , which may be part of image source 400 , may be a part of image reproduction mechanism 403 , or may be an independent, stand-alone, add-on box.
- a source image may need to be modified (i.e., color, hue, contrast, brightness, etc.) prior to being transformed. This can be accomplished by a display pipeline consisting of a separate control module for each type of change.
- the display pipeline may apply a source image to a color control module, which itself is responsive to a first control signal, followed by a control module responsive to a second control input, before being applied to an image transforming module 401 (i.e., image transformer 401 ) to produce the display image sent to image reproduction mechanism 403 .
- an image may be modified by image reproduction mechanism 403 after being transformed by image transformer 401 , as shown in FIG. 48 .
- a transformed image from transformation module 401 may be applied to a display pipeline consisting, for example, of a convolution module 447 responsive to a third control input followed by a contrast control module 449 responsive to a fourth control input.
- Each of control inputs control_ 1 to control_ 4 may be generated by an image adjustment input at image source 400 , or at image reproduction mechanism 403 , or by a feedback signal created by interaction with a user (i.e., a video game player, as described above).
- each of these signal modifications is traditionally accomplished by a separate image modification module, as illustrated in FIGS. 47 and 48 , and the more modifications that are implemented, the longer the display pipeline and the more time that is required to run an image through all the needed modification modules. If a signal is pre-recorded, these image modifications are generally not a problem since a source image signal may be delayed until it passes through all its image modification modules prior to being synchronized with its audio signal and output by a projector 421 a - 421 z.
- image reproduction mechanism 403 provide at least one feedback mechanism for identifying real-time, user-submitted image modification requests, and it is further desired that these image modifications be implemented in real time.
- An example of such real-time modifications is an interactive video game, where a real-time image response to a user input is highly desirable. In such a circumstance, it is no longer acceptable to delay a projection image until the source image passes through an image pipeline made up of an appropriate number of modification modules, since this would introduce an unacceptable response delay (i.e., display latency) into a user's game image.
- the present invention addresses the issue of how to improve the response time to transformed images that require additional real-time, user-requested image modifications.
- the image/video processing pipeline in a display system performs operations on input image/video so that they can be output on the display hardware, such as LCD projectors 421 a - 421 z or flat panel TV 425 .
- these operations include resizing, sharpening, geometric warping, and other image enhancement/adaptation operations.
- HDTV high definition television
- these image processing operations become more sophisticated and often the display pipeline requires more time to process the image/video. This introduces a time delay (i.e., display latency) to the display.
- display latency For traditional media like movies that are typically passively viewed, this is not a significant problem and many home theater receivers are capable of delaying an audio signal to cope with display latency.
- displays are being used for interactive applications like video games and animated touch screens. In these applications where immediate visual feedback is critical, display latency becomes a major problem.
- the presently preferred architecture is for a new kind of display pipeline where a long sequence of image/video processing operations (or modification modules) can be ‘pre-concatenated’ and be represented as a single matrix. This allows the entire chain of operations (i.e., one or more display pipelines) to be performed in a single step, minimizing the amount of display latency introduced.
- The image c is first multiplied by scaling operation S before the product is multiplied by T⁻¹ (or Ťᵀ).
- A pictorial representation of this reduction method is shown in FIGS. 49-51.
- In FIG. 49, a captured image c is shown applied to a set of pre-processing operations 461-465 prior to being sent to transformation operation T⁻¹, 470.
- Pre-processing operations 461-465 are also individually labeled ABC . . . , and are collectively represented by a single pre-processing matrix Q, 467.
- The output from transformation operation T⁻¹, 470, is applied to a set of post-processing operations 481-485 to produce a desired projection image p.
- Post-processing operations 481-485 are individually labeled UVW . . . , and are collectively represented by a single post-processing matrix V, 487.
- The structure of FIG. 49 is re-represented in FIG. 50 in simplified form using pre-processing matrix Q and post-processing matrix V. Processing matrices Q, T⁻¹, and V are then combined into a single processing matrix X, 490, which combines the multiple operations of processing operations 461-465, 470, and 481-485 into a single matrix operation X.
- the resultant simplified circuit/process is shown in FIG. 51 .
Description
Rcptr=T Rprjct
where T is a composite light transport matrix that relates
Vcptr″=Tᵀ Vprjct″
[i.e. S1⊂{1, . . . ,(p×q)}]
Furthermore, it is assumed that the first set of projector pixels S1 includes target projector pixel j, i.e. the target projector pixel under test.
[i.e. S1∩S2={j}]
Let Rcptr_S2 be a second real image captured by
Tj≈MIN(Rcptr_S1, Rcptr_S2)
where Tj is the jth column of matrix T, and “MIN” indicates that the lower valued camera pixel (i.e., the darker camera pixel having a lower captured light intensity value) in Rcptr_S1 and Rcptr_S2 is retained, and the higher valued (i.e., brighter) camera pixel is discarded. In this way, the only high intensity values that are retained correspond to a light ray footprint common to both S1 and S2.
[i.e. L⊂{1, . . . , (m×n)}]
It should again be noted that the target projector pixel, j, is the intersection of projector pixel sets S1 and S2, (i.e., j is the only projector pixel common to both sets S1 and S2), such that
S1∩S2={j}
- Therefore, among the captured camera pixels (in both Rcptr_S1 and Rcptr_S2) that do not correspond to the target projector pixel, j, (i.e., those camera pixels not in set L, i.e. ∉ L), at least one of the compared camera pixels in either Rcptr_S1 or Rcptr_S2 will not have received light. Since camera pixels receiving light will be brighter than camera pixels not receiving light, the operation MIN(Rcptr_S1, Rcptr_S2) provides an image where only pixels in set L [i.e. ∈ L] are lit, which is a good approximation of Tj, i.e. the jth column in matrix T.
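The MIN operation can be sketched as follows. This is an illustrative toy, not the patent's capture data: the projector/camera sizes, the disjoint light footprints, and the sets S1 and S2 (with S1 ∩ S2 = {j}) are all invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
num_proj, foot = 6, 3                              # 6 projector pixels, 3-pixel footprints
T = np.zeros((num_proj * foot, num_proj))          # toy light transport matrix
for k in range(num_proj):
    # each projector pixel lights a distinct footprint on the camera sensor
    T[foot * k:foot * (k + 1), k] = rng.uniform(0.3, 1.0, foot)

j = 4                                              # target projector pixel under test
S1 = {0, 1, 4}                                     # first set of activated projector pixels
S2 = {2, 3, 4}                                     # second set; S1 ∩ S2 = {j}

def capture(S):
    """Real image captured when all projector pixels in S are lit simultaneously."""
    p = np.zeros(num_proj)
    p[list(S)] = 1.0
    return T @ p

Rcptr_S1, Rcptr_S2 = capture(S1), capture(S2)
Tj_est = np.minimum(Rcptr_S1, Rcptr_S2)            # MIN retains the darker pixel
assert np.allclose(Tj_est, T[:, j])                # recovers the jth column of T
```

Only camera pixels lit in both captures survive the element-wise minimum, and by construction those are exactly the footprint of projector pixel j.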
where each projection image Rprjct_Sy_1 to Rprjct_Sy_q is paired with any of projection images Rprjct_Sx_1 to Rprjct_Sx_p such that each pair of projection images shares only one light footprint in common. That is,
∀j∈{1, . . . , (p×q)} ∃ Rprjct_Sy_a, Rprjct_Sx_b | Rprjct_Sy_a ∩ Rprjct_Sx_b = {j}
- The above formula is interpreted to mean that for all projector pixels j in {1 . . . (p×q)} there exists a pair of projection images, each having a differently constructed pattern, such that the constructed patterns intersect at a single point (i.e., pattern section, or a single light footprint region) corresponding to a common projector pixel, j. A basic example of such pairs of constructed patterns would be projected pairs of vertical light lines and horizontal light lines. In this case, the intersection of the captured image of a vertical light line and the captured image of a horizontal light line would include all the camera pixels i that correspond to a target projector pixel under test, j (i.e., all camera pixels i that lie within the light ray footprint created by a light ray emitted from projector pixel under test j).
Vcptr″=Tᵀ Vprjct″
Since the virtual camera
Vcptr″(j)=(Tᵀ)j Vprjct″
where (Tᵀ)j refers to the jth row in Tᵀ.
SVcptr″(G)={a|∀z∈{1, . . . , (p×q)}T
Since in general ∥SVcptr″(G)∥<<(p×q), it takes significantly less time to compute:
than to compute:
Vcptr″(j)=T T j Vprjct″
Up=λHUc
where λ is a scalar and H is a 3×3 homography transformation matrix (as is known in the art) whose bottom-right entry is set to 1. Each pair of corresponding coordinates provides 3 linear equations, where one of the equations determines the scalar and the other two are used to determine H, the homography transformation matrix. Since there are 8 unknown entries in the 3×3 matrix H, given the correspondence between N coordinate points (where N≧4) on the checkerboard, the homography between the projector-view image and the camera image can be recovered by solving the 2N linear equations. The greater the number N, the lower the error in relating coordinate points between the projector view and the camera image.
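The recovery of H from the 2N linear equations can be sketched as below. This is a standard least-squares formulation consistent with the description; the function name and the checkerboard-style test points are illustrative, not taken from the patent.

```python
import numpy as np

def fit_homography(src, dst):
    """Recover the 3x3 homography H (bottom-right entry fixed to 1) from N >= 4
    point correspondences by solving the 2N linear equations in 8 unknowns."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h1 x + h2 y + h3) / (h7 x + h8 y + 1) gives two linear equations
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

# checkerboard-style correspondences generated from a known homography
H_true = np.array([[1.1,  0.02,  5.0],
                   [0.01, 0.95, -3.0],
                   [1e-4, 2e-4,  1.0]])
src = [(x, y) for x in (0, 10, 20) for y in (0, 10, 20)]   # N = 9 >= 4 points
dst = []
for x, y in src:
    w = H_true @ np.array([x, y, 1.0])
    dst.append((w[0] / w[2], w[1] / w[2]))                 # perspective divide

H_est = fit_homography(src, dst)
assert np.allclose(H_est, H_true, atol=1e-6)
```

With noise-free correspondences the system is consistent and the least-squares solve recovers H exactly; with more (noisy) points, the overdetermined solve averages out the error, matching the remark that larger N lowers the error.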
Rcptr=T Rprjct
where T is the light transport matrix. As is illustrated in
Vcptr″=Tᵀ Vprjct″
can be used to model a “dual” setup where
Rprjct=T −1 Rcptr
if one can determine the inverse, T−1, of the light transport matrix T.
AI=IA=A
If matrix A were a matrix of order m by n, then the pre-multiplicative identity matrix I would be of order m by m, while the post-multiplicative identity matrix I would be of order n by n.
Ťr=Tr/(∥Tr∥)², r=1, 2, 3, . . . , pq
where Ťr is the rth column of Ť. Since matrix operation ∥Tr∥ defines the square root of the sum of the squares of all values in column r of matrix T, the square of ∥Tr∥ is simply the sum of the squares of all the values in column r. That is,
By dividing each value entry in column r by the sum of the squares of all the value entries in column r, the operation {Tr/(∥Tr∥)²} has the effect of normalizing the value entries in column r of matrix T. If one now takes the transpose of Ť, i.e. flips it on its side such that the first column becomes the top row and the last column becomes the bottom row, the result will be rows of elements that are the normalized values of the corresponding columns of elements in T. Therefore, for every column in T, one has the following result:
(Ťr)ᵀ(Tr)=1
and
(Ťr)ᵀ(Tω)=0, for r≠ω
In other words, multiplying a row of Ťᵀ with the corresponding column of T always results in 1, while multiplying it with any other column of T results in 0.
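This normalization-and-transpose property can be checked numerically under a toy stand-in for the Display Constraint: each projector pixel lights a distinct set of camera pixels, so the columns of T are orthogonal. The sizes and values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
m, pq = 12, 4                       # 12 camera pixels, 4 projector pixels (toy sizes)
T = np.zeros((m, pq))
for r in range(pq):
    # disjoint 3-pixel light footprint for each projector pixel (Display Constraint)
    T[3 * r:3 * r + 3, r] = rng.uniform(0.2, 1.0, 3)

# each column r of T divided by ||Tr||^2 gives the matrix denoted T with a hacek
T_check = T / np.square(np.linalg.norm(T, axis=0))

# rows of T_check^T are normalized columns of T, so T_check^T @ T is the identity:
# 1 where row and column indices match, 0 elsewhere
assert np.allclose(T_check.T @ T, np.eye(pq))
```

The identity product is exactly the relation (Ťr)ᵀ(Tr)=1 and (Ťr)ᵀ(Tω)=0 for r≠ω, which is why Ťᵀ can stand in for T⁻¹ under the Display Constraint.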
Tj≈MIN(CS1, CS2)
where Tj is the jth column of T, and CS1, CS2 are the sums of the respective columns of T, i.e. CS=Σj∈S Tj. Given that the contribution of each individual projector pixel j is mapped to distinct parts of the camera sensor, there is a common set of pixels l ⊂ {1, . . . , (m×n)} in the captured images CS1, CS2 that corresponds to projector pixel j. Now, as S1∩S2={j}, for the rest of the captured image pixels ∉ l, at least one of the images would not have received light from one of the projector pixel sets, S1 or S2. Since pixels receiving light will be brighter than pixels not receiving light, MIN(CS1, CS2) renders an image where only pixels ∈ l are lit, which is a good approximation of Tj.
Y1, . . . , Yp, X1, . . . ,Xq,
where
∀i∈{1, . . . , (p×q)}, ∃Xj, Yk | Xj∩Yk={i},
one can synthesize Tj where j=1, . . . , p×q from images CX1, . . . , CXq and CY1, . . . , CYp.
c1=T1p1
and
c2=T2p2
In order to simulate projected image p1 from front projector P1 using immersive projector P2, one needs c1 (i.e., the captured, projected image from front projector P1) to be the same as c2 (i.e., the captured, projected image from immersive projector P2), i.e. one needs
c2=c1
which leads to the relation:
T2p2=T1p1
Solving for p2 (i.e., the image projected by immersive projector P2), one obtains the following relation:
p2=(T2⁻¹)(T1p1)
- This means that to create image p1, one can project the image directly using front projector P1, or the same effect can be achieved by projecting a transformed image [defined as (T2⁻¹)(T1p1)] on immersive projector P2. Note that the view projection matrices naturally convey the projector-camera correspondence into projector-projector correspondence.
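A numerical sketch of this substitution follows, with small invented near-identity matrices standing in for the transport matrices T1 and T2 (the real matrices would come from the capture procedure described above).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8                                                 # toy flattened-image size
T1 = np.eye(n) + 0.05 * rng.standard_normal((n, n))   # transport of front projector P1
T2 = np.eye(n) + 0.05 * rng.standard_normal((n, n))   # transport of immersive projector P2
p1 = rng.uniform(0.0, 1.0, n)                         # image sent to P1

# p2 = (T2^-1)(T1 p1), computed with a linear solve instead of forming T2^-1
p2 = np.linalg.solve(T2, T1 @ p1)

c1, c2 = T1 @ p1, T2 @ p2                             # captured images c1 = T1 p1, c2 = T2 p2
assert np.allclose(c1, c2)                            # both projectors yield the same capture
```

Projecting the transformed image p2 on P2 thus reproduces, at the camera, exactly what P1 would have produced with p1.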
c3=T3p3
which results in
p3=(T3⁻¹)×(c3)
Consequently, one can build a virtual model of display surfaces of
Ťr=Tr/(∥Tr∥)², r=1, 2, 3, . . . , pq (1)
where Ťr is the rth column of Ť and pq is the number of columns in T. It was then shown that under the Display Constraint, one can define the view projection matrix Ťᵀ as:
Ťᵀ=T⁻¹
which leads to the following relation:
p=Ťᵀc
c1=T1p1
and
c2=T2p2
c=c 1 +c 2
or
c=(T1p1)+(T2p2)=[T1 T2][p1 p2]ᵀ (2)
- 1. Given the current estimate of p2, compute p1=(Ť1)ᵀ(c−T2p2)
- 2. Given the current estimate of p1, compute p2=(Ť2)ᵀ(c−T1p1)
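A minimal numerical check of these two update steps is sketched below. The toy transport matrices (with disjoint per-projector footprints standing in for the Display Constraint) and all sizes are invented; under these assumptions (Ťk)ᵀTk is the identity, so any exact solution of equation (2) is a fixed point of the updates.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 12, 3                                               # camera pixels, projector pixels
T1, T2 = np.zeros((m, n)), np.zeros((m, n))
for r in range(n):
    T1[4 * r:4 * r + 2, r] = rng.uniform(0.3, 1.0, 2)      # footprints of projector 1
    T2[4 * r + 2:4 * r + 4, r] = rng.uniform(0.3, 1.0, 2)  # footprints of projector 2

def view_projection(T):
    """View projection matrix: each column r of T divided by ||Tr||^2, per (1)."""
    return T / np.square(np.linalg.norm(T, axis=0))

V1, V2 = view_projection(T1), view_projection(T2)
p1, p2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)        # true projector images
c = T1 @ p1 + T2 @ p2                                      # composite capture, eq. (2)

p1_new = V1.T @ (c - T2 @ p2)                              # update step 1
p2_new = V2.T @ (c - T1 @ p1)                              # update step 2
assert np.allclose(p1_new, p1) and np.allclose(p2_new, p2) # solution is a fixed point
```

Starting from rough estimates, alternating these two steps refines p1 and p2 toward a pair whose combined projections reproduce the target composite image c.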
c=Tp
where each column of the light transport matrix T is the projection image of one pixel from the projector.
p=T⁻¹c
Ťr=Tr/(∥Tr∥)², r=1, 2, 3, . . . , pq
where Ťr is the rth column of Ť. Since
(Ťr)ᵀTr=1 and (Ťr)ᵀTz=0 for z≠r
it was shown by the display constraint that
Ťᵀ≈T⁻¹
which leads to the following relation:
p≈Ťᵀc
p=T⁻¹(Sc)
p=(T⁻¹S)c
where the matrices T⁻¹ and S are multiplied together before their product is multiplied with c. One can therefore write
M=(T⁻¹S)=(ŤᵀS)
which produces the following result, indicating that once matrix M is generated, future images c can be quickly transformed and scaled:
p=Mc
p=S(T⁻¹c)
indicating that source image c is modified by transformation matrix T⁻¹ (or Ťᵀ) before the result is applied to scaling module S. One can now define
N=ST⁻¹
and again re-write the operation as
p=Nc
Thus, a 2-step process is reduced to a 1-step process.
p=(UVW . . . )T⁻¹(ABC . . . )c
the entire operational chain (i.e., display pipeline) can be rewritten as
p=Xc
where
X=(UVW . . . )T⁻¹(ABC . . . )
Claims (9)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/331,281 US8013904B2 (en) | 2008-12-09 | 2008-12-09 | View projection matrix based high performance low latency display pipeline |
US13/196,401 US8508615B2 (en) | 2008-12-09 | 2011-08-02 | View projection matrix based high performance low latency display pipeline |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/331,281 US8013904B2 (en) | 2008-12-09 | 2008-12-09 | View projection matrix based high performance low latency display pipeline |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/196,401 Continuation US8508615B2 (en) | 2008-12-09 | 2011-08-02 | View projection matrix based high performance low latency display pipeline |
Publications (2)
Publication Number | Publication Date |
---|---|
US20100141780A1 US20100141780A1 (en) | 2010-06-10 |
US8013904B2 (en) | 2011-09-06
Family
ID=42230618
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/331,281 Expired - Fee Related US8013904B2 (en) | 2008-12-09 | 2008-12-09 | View projection matrix based high performance low latency display pipeline |
US13/196,401 Active 2029-05-06 US8508615B2 (en) | 2008-12-09 | 2011-08-02 | View projection matrix based high performance low latency display pipeline |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/196,401 Active 2029-05-06 US8508615B2 (en) | 2008-12-09 | 2011-08-02 | View projection matrix based high performance low latency display pipeline |
Country Status (1)
Country | Link |
---|---|
US (2) | US8013904B2 (en) |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7901093B2 (en) * | 2006-01-24 | 2011-03-08 | Seiko Epson Corporation | Modeling light transport in complex display systems |
US8042954B2 (en) * | 2007-01-24 | 2011-10-25 | Seiko Epson Corporation | Mosaicing of view projections |
US10155156B2 (en) | 2008-06-03 | 2018-12-18 | Tweedletech, Llc | Multi-dimensional game comprising interactive physical and virtual components |
US8602857B2 (en) | 2008-06-03 | 2013-12-10 | Tweedletech, Llc | Intelligent board game system with visual marker based game object tracking and identification |
US9649551B2 (en) | 2008-06-03 | 2017-05-16 | Tweedletech, Llc | Furniture and building structures comprising sensors for determining the position of one or more objects |
EP2328662A4 (en) | 2008-06-03 | 2013-05-29 | Tweedletech Llc | An intelligent game system for putting intelligence into board and tabletop games including miniatures |
US8974295B2 (en) * | 2008-06-03 | 2015-03-10 | Tweedletech, Llc | Intelligent game system including intelligent foldable three-dimensional terrain |
US8013904B2 (en) * | 2008-12-09 | 2011-09-06 | Seiko Epson Corporation | View projection matrix based high performance low latency display pipeline |
US7901095B2 (en) * | 2009-03-27 | 2011-03-08 | Seiko Epson Corporation | Resolution scalable view projection |
JP5570316B2 (en) * | 2010-06-17 | 2014-08-13 | キヤノン株式会社 | Imaging apparatus, control method thereof, and program |
JP5993856B2 (en) | 2010-09-09 | 2016-09-14 | トウィードルテック リミテッド ライアビリティ カンパニー | Board game with dynamic feature tracking |
US9560314B2 (en) * | 2011-06-14 | 2017-01-31 | Microsoft Technology Licensing, Llc | Interactive and shared surfaces |
JP5898475B2 (en) * | 2011-11-28 | 2016-04-06 | クラリオン株式会社 | In-vehicle camera system, calibration method thereof, and calibration program thereof |
US8830373B2 (en) * | 2012-01-09 | 2014-09-09 | Pathway Innovations And Technologies, Inc. | Imaging device having multiple optics |
JP5924020B2 (en) * | 2012-02-16 | 2016-05-25 | セイコーエプソン株式会社 | Projector and projector control method |
DE102012206851A1 (en) * | 2012-04-25 | 2013-10-31 | Robert Bosch Gmbh | Method and device for determining a gesture executed in the light cone of a projected image |
KR102048361B1 (en) * | 2013-02-28 | 2019-11-25 | 엘지전자 주식회사 | Distance detecting device and Image processing apparatus including the same |
WO2014145722A2 (en) * | 2013-03-15 | 2014-09-18 | Digimarc Corporation | Cooperative photography |
WO2014184274A1 (en) * | 2013-05-15 | 2014-11-20 | Koninklijke Philips N.V. | Imaging a patient's interior |
WO2015134961A1 (en) | 2014-03-07 | 2015-09-11 | Brown University | Method and system for unsynchronized structured lighting |
US10169909B2 (en) * | 2014-08-07 | 2019-01-01 | Pixar | Generating a volumetric projection for an object |
US9937420B2 (en) * | 2015-09-29 | 2018-04-10 | Sony Interactive Entertainment Inc. | Method and apparatus for the projection of images, video, and/or holograms generated by a computer simulation |
US9792674B2 (en) * | 2016-03-10 | 2017-10-17 | Netflix, Inc. | Perspective correction for curved display screens |
TWI653563B (en) * | 2016-05-24 | 2019-03-11 | 仁寶電腦工業股份有限公司 | Projection touch image selection method |
US10366674B1 (en) * | 2016-12-27 | 2019-07-30 | Facebook Technologies, Llc | Display calibration in electronic displays |
JP7159057B2 (en) * | 2017-02-10 | 2022-10-24 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Free-viewpoint video generation method and free-viewpoint video generation system |
CN107689028A (en) * | 2017-08-22 | 2018-02-13 | 深圳市爱培科技术股份有限公司 | Adaptive interface display methods, system and storage device based on ADAS |
CN111684793A (en) * | 2018-02-08 | 2020-09-18 | 索尼公司 | Image processing device, image processing method, program, and projection system |
CN110784693A (en) * | 2018-07-31 | 2020-02-11 | 中强光电股份有限公司 | Projector calibration method and projection system using this method |
JP7347205B2 (en) * | 2019-12-26 | 2023-09-20 | セイコーエプソン株式会社 | Projection system control method, projection system and control program |
CN111311686B (en) * | 2020-01-15 | 2023-05-02 | 浙江大学 | A Defocus Correction Method for Projectors Based on Edge Sensing |
JP2022036737A (en) * | 2020-08-24 | 2022-03-08 | キヤノン株式会社 | Projection apparatus, control method, and program |
US11790556B2 (en) * | 2021-02-24 | 2023-10-17 | Nvidia Corporation | Determining optical center in an image |
CN113538308A (en) * | 2021-06-29 | 2021-10-22 | 上海联影医疗科技股份有限公司 | Image data processing method, image data processing device, computer equipment and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4553220A (en) | 1983-05-19 | 1985-11-12 | Gti Corporation | Matrix multiplier with normalized output |
US20010003456A1 (en) | 1999-12-09 | 2001-06-14 | Shuichi Kagawa | Image display device |
US20020097256A1 (en) | 2000-12-06 | 2002-07-25 | Miller Daniel J. | Methods and systems for processing media content |
US20020126302A1 (en) | 2001-01-26 | 2002-09-12 | Canon Kabushiki Kaisha | Image processing apparatus and method, and image processing system |
US6807296B2 (en) | 2001-03-14 | 2004-10-19 | Imagica Corp. | Color conversion method and apparatus for chromakey processing |
US20050134599A1 (en) * | 2003-07-02 | 2005-06-23 | Shree Nayar | Methods and systems for compensating an image projected onto a surface having spatially varying photometric properties |
US7265781B2 (en) | 2001-08-22 | 2007-09-04 | Fujifilm Corporation | Method and apparatus for determining a color correction matrix by minimizing a color difference maximum or average value |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6028608A (en) * | 1997-05-09 | 2000-02-22 | Jenkins; Barry | System and method of perception-based image generation and encoding |
US6733138B2 (en) * | 2001-08-15 | 2004-05-11 | Mitsubishi Electric Research Laboratories, Inc. | Multi-projector mosaic with automatic registration |
US8355539B2 (en) * | 2007-09-07 | 2013-01-15 | Sri International | Radar guided vision system for vehicle validation and vehicle motion characterization |
US8013904B2 (en) * | 2008-12-09 | 2011-09-06 | Seiko Epson Corporation | View projection matrix based high performance low latency display pipeline |
- 2008-12-09: US application US12/331,281 granted as patent US8013904B2 (status: not active, Expired - Fee Related)
- 2011-08-02: US application US13/196,401 granted as patent US8508615B2 (status: Active)
Cited By (241)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10298834B2 (en) | 2006-12-01 | 2019-05-21 | Google Llc | Video refocusing |
US10142560B2 (en) | 2008-05-20 | 2018-11-27 | Fotonation Limited | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US9055213B2 (en) | 2008-05-20 | 2015-06-09 | Pelican Imaging Corporation | Systems and methods for measuring depth using images captured by monolithic camera arrays including at least one bayer camera |
US11792538B2 (en) | 2008-05-20 | 2023-10-17 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US11412158B2 (en) | 2008-05-20 | 2022-08-09 | Fotonation Limited | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US12022207B2 (en) | 2008-05-20 | 2024-06-25 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US9749547B2 (en) | 2008-05-20 | 2017-08-29 | Fotonation Cayman Limited | Capturing and processing of images using camera array incorperating Bayer cameras having different fields of view |
US9077893B2 (en) | 2008-05-20 | 2015-07-07 | Pelican Imaging Corporation | Capturing and processing of images captured by non-grid camera arrays |
US9060120B2 (en) | 2008-05-20 | 2015-06-16 | Pelican Imaging Corporation | Systems and methods for generating depth maps using images captured by camera arrays |
US9060142B2 (en) | 2008-05-20 | 2015-06-16 | Pelican Imaging Corporation | Capturing and processing of images captured by camera arrays including heterogeneous optics |
US9060124B2 (en) | 2008-05-20 | 2015-06-16 | Pelican Imaging Corporation | Capturing and processing of images using non-monolithic camera arrays |
US9712759B2 (en) | 2008-05-20 | 2017-07-18 | Fotonation Cayman Limited | Systems and methods for generating depth maps using a camera arrays incorporating monochrome and color cameras |
US9060121B2 (en) | 2008-05-20 | 2015-06-16 | Pelican Imaging Corporation | Capturing and processing of images captured by camera arrays including cameras dedicated to sampling luma and cameras dedicated to sampling chroma |
US9576369B2 (en) | 2008-05-20 | 2017-02-21 | Fotonation Cayman Limited | Systems and methods for generating depth maps using images captured by camera arrays incorporating cameras having different fields of view |
US8866920B2 (en) | 2008-05-20 | 2014-10-21 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US9055233B2 (en) | 2008-05-20 | 2015-06-09 | Pelican Imaging Corporation | Systems and methods for synthesizing higher resolution images using a set of images containing a baseline image |
US8885059B1 (en) | 2008-05-20 | 2014-11-11 | Pelican Imaging Corporation | Systems and methods for measuring depth using images captured by camera arrays |
US8896719B1 (en) | 2008-05-20 | 2014-11-25 | Pelican Imaging Corporation | Systems and methods for parallax measurement using camera arrays incorporating 3 x 3 camera configurations |
US8902321B2 (en) | 2008-05-20 | 2014-12-02 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US9049381B2 (en) | 2008-05-20 | 2015-06-02 | Pelican Imaging Corporation | Systems and methods for normalizing image data captured by camera arrays |
US10027901B2 (en) | 2008-05-20 | 2018-07-17 | Fotonation Cayman Limited | Systems and methods for generating depth maps using a camera arrays incorporating monochrome and color cameras |
US9049367B2 (en) | 2008-05-20 | 2015-06-02 | Pelican Imaging Corporation | Systems and methods for synthesizing higher resolution images using images captured by camera arrays |
US9049390B2 (en) | 2008-05-20 | 2015-06-02 | Pelican Imaging Corporation | Capturing and processing of images captured by arrays including polychromatic cameras |
US12041360B2 (en) | 2008-05-20 | 2024-07-16 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US9485496B2 (en) | 2008-05-20 | 2016-11-01 | Pelican Imaging Corporation | Systems and methods for measuring depth using images captured by a camera array including cameras surrounding a central camera |
US9124815B2 (en) | 2008-05-20 | 2015-09-01 | Pelican Imaging Corporation | Capturing and processing of images including occlusions captured by arrays of luma and chroma cameras |
US9188765B2 (en) | 2008-05-20 | 2015-11-17 | Pelican Imaging Corporation | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US9049391B2 (en) | 2008-05-20 | 2015-06-02 | Pelican Imaging Corporation | Capturing and processing of near-IR images including occlusions using camera arrays incorporating near-IR light sources |
US9094661B2 (en) | 2008-05-20 | 2015-07-28 | Pelican Imaging Corporation | Systems and methods for generating depth maps using a set of images containing a baseline image |
US9049411B2 (en) | 2008-05-20 | 2015-06-02 | Pelican Imaging Corporation | Camera arrays incorporating 3×3 imager configurations |
US9235898B2 (en) | 2008-05-20 | 2016-01-12 | Pelican Imaging Corporation | Systems and methods for generating depth maps using light focused on an image sensor by a lens element array |
US9191580B2 (en) | 2008-05-20 | 2015-11-17 | Pelican Imaging Corporation | Capturing and processing of images including occlusions captured by camera arrays |
US9041829B2 (en) | 2008-05-20 | 2015-05-26 | Pelican Imaging Corporation | Capturing and processing of high dynamic range images using camera arrays |
US9041823B2 (en) | 2008-05-20 | 2015-05-26 | Pelican Imaging Corporation | Systems and methods for performing post capture refocus using images captured by camera arrays |
US8106949B2 (en) * | 2009-03-26 | 2012-01-31 | Seiko Epson Corporation | Small memory footprint light transport matrix capture |
US20100299642A1 (en) * | 2009-05-22 | 2010-11-25 | Thomas Merrell | Electronic Device with Sensing Assembly and Method for Detecting Basic Gestures |
US8619029B2 (en) | 2009-05-22 | 2013-12-31 | Motorola Mobility Llc | Electronic device with sensing assembly and method for interpreting consecutive gestures |
US20100299390A1 (en) * | 2009-05-22 | 2010-11-25 | Rachid Alameh | Method and System for Controlling Data Transmission to or From a Mobile Device |
US8788676B2 (en) * | 2009-05-22 | 2014-07-22 | Motorola Mobility Llc | Method and system for controlling data transmission to or from a mobile device |
US8970486B2 (en) | 2009-05-22 | 2015-03-03 | Google Technology Holdings LLC | Mobile device with user interaction capability and method of operating same |
US8344325B2 (en) | 2009-05-22 | 2013-01-01 | Motorola Mobility Llc | Electronic device with sensing assembly and method for detecting basic gestures |
US8391719B2 (en) | 2009-05-22 | 2013-03-05 | Motorola Mobility Llc | Method and system for conducting communication between mobile devices |
US8542186B2 (en) | 2009-05-22 | 2013-09-24 | Motorola Mobility Llc | Mobile device with user interaction capability and method of operating same |
US8861089B2 (en) | 2009-11-20 | 2014-10-14 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US9264610B2 (en) | 2009-11-20 | 2016-02-16 | Pelican Imaging Corporation | Capturing and processing of images including occlusions captured by heterogeneous camera arrays |
US10306120B2 (en) | 2009-11-20 | 2019-05-28 | Fotonation Limited | Capturing and processing of images captured by camera arrays incorporating cameras with telephoto and conventional lenses to generate depth maps |
US20110242332A1 (en) * | 2010-04-01 | 2011-10-06 | Mcfadyen Doug | Method and Apparatus for Calibrating a Projector for Image Warping |
US8212945B2 (en) * | 2010-04-01 | 2012-07-03 | Seiko Epson Corporation | Method and apparatus for calibrating a projector for image warping |
US8963845B2 (en) | 2010-05-05 | 2015-02-24 | Google Technology Holdings LLC | Mobile device with temperature sensing capability and method of operating same |
US10455168B2 (en) | 2010-05-12 | 2019-10-22 | Fotonation Limited | Imager array interfaces |
US9936148B2 (en) | 2010-05-12 | 2018-04-03 | Fotonation Cayman Limited | Imager array interfaces |
US8928793B2 (en) | 2010-05-12 | 2015-01-06 | Pelican Imaging Corporation | Imager array interfaces |
US8751056B2 (en) | 2010-05-25 | 2014-06-10 | Motorola Mobility Llc | User computer device with temperature sensing capabilities and method of operating same |
US9103732B2 (en) | 2010-05-25 | 2015-08-11 | Google Technology Holdings LLC | User computer device with temperature sensing capabilities and method of operating same |
US9361662B2 (en) | 2010-12-14 | 2016-06-07 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
US8878950B2 (en) | 2010-12-14 | 2014-11-04 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using super-resolution processes |
US11423513B2 (en) | 2010-12-14 | 2022-08-23 | Fotonation Limited | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
US11875475B2 (en) | 2010-12-14 | 2024-01-16 | Adeia Imaging Llc | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
US10366472B2 (en) | 2010-12-14 | 2019-07-30 | Fotonation Limited | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
US12243190B2 (en) | 2010-12-14 | 2025-03-04 | Adeia Imaging Llc | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
US9041824B2 (en) | 2010-12-14 | 2015-05-26 | Pelican Imaging Corporation | Systems and methods for dynamic refocusing of high resolution images generated using images captured by a plurality of imagers |
US9047684B2 (en) | 2010-12-14 | 2015-06-02 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using a set of geometrically registered images |
US9866739B2 (en) | 2011-05-11 | 2018-01-09 | Fotonation Cayman Limited | Systems and methods for transmitting and receiving array camera image data |
US10742861B2 (en) | 2011-05-11 | 2020-08-11 | Fotonation Limited | Systems and methods for transmitting and receiving array camera image data |
US9197821B2 (en) | 2011-05-11 | 2015-11-24 | Pelican Imaging Corporation | Systems and methods for transmitting and receiving array camera image data |
US10218889B2 (en) | 2011-05-11 | 2019-02-26 | Fotonation Limited | Systems and methods for transmitting and receiving array camera image data |
US9578237B2 (en) | 2011-06-28 | 2017-02-21 | Fotonation Cayman Limited | Array cameras incorporating optics with modulation transfer functions greater than sensor Nyquist frequency for capture of images used in super-resolution processing |
US9128228B2 (en) | 2011-06-28 | 2015-09-08 | Pelican Imaging Corporation | Optical arrangements for use with an array camera |
US9516222B2 (en) | 2011-06-28 | 2016-12-06 | Kip Peli P1 Lp | Array cameras incorporating monolithic array camera modules with high MTF lens stacks for capture of images used in super-resolution processing |
US10375302B2 (en) | 2011-09-19 | 2019-08-06 | Fotonation Limited | Systems and methods for controlling aliasing in images captured by an array camera for use in super resolution processing using pixel apertures |
US9794476B2 (en) | 2011-09-19 | 2017-10-17 | Fotonation Cayman Limited | Systems and methods for controlling aliasing in images captured by an array camera for use in super resolution processing using pixel apertures |
US9031335B2 (en) | 2011-09-28 | 2015-05-12 | Pelican Imaging Corporation | Systems and methods for encoding light field image files having depth and confidence maps |
US12052409B2 (en) | 2011-09-28 | 2024-07-30 | Adeia Imaging Llc | Systems and methods for encoding image files containing depth maps stored as metadata |
US9036928B2 (en) | 2011-09-28 | 2015-05-19 | Pelican Imaging Corporation | Systems and methods for encoding structured light field image files |
US8831367B2 (en) | 2011-09-28 | 2014-09-09 | Pelican Imaging Corporation | Systems and methods for decoding light field image files |
US9036931B2 (en) | 2011-09-28 | 2015-05-19 | Pelican Imaging Corporation | Systems and methods for decoding structured light field image files |
US9031342B2 (en) | 2011-09-28 | 2015-05-12 | Pelican Imaging Corporation | Systems and methods for encoding refocusable light field image files |
US9129183B2 (en) | 2011-09-28 | 2015-09-08 | Pelican Imaging Corporation | Systems and methods for encoding light field image files |
US9811753B2 (en) | 2011-09-28 | 2017-11-07 | Fotonation Cayman Limited | Systems and methods for encoding light field image files |
US11729365B2 (en) | 2011-09-28 | 2023-08-15 | Adeia Imaging Llc | Systems and methods for encoding image files containing depth maps stored as metadata |
US9031343B2 (en) | 2011-09-28 | 2015-05-12 | Pelican Imaging Corporation | Systems and methods for encoding light field image files having a depth map |
US10275676B2 (en) | 2011-09-28 | 2019-04-30 | Fotonation Limited | Systems and methods for encoding image files containing depth maps stored as metadata |
US9864921B2 (en) | 2011-09-28 | 2018-01-09 | Fotonation Cayman Limited | Systems and methods for encoding image files containing depth maps stored as metadata |
US9042667B2 (en) | 2011-09-28 | 2015-05-26 | Pelican Imaging Corporation | Systems and methods for decoding light field image files using a depth map |
US9025895B2 (en) | 2011-09-28 | 2015-05-05 | Pelican Imaging Corporation | Systems and methods for decoding refocusable light field image files |
US9025894B2 (en) | 2011-09-28 | 2015-05-05 | Pelican Imaging Corporation | Systems and methods for decoding light field image files having depth and confidence maps |
US10430682B2 (en) | 2011-09-28 | 2019-10-01 | Fotonation Limited | Systems and methods for decoding image files containing depth maps stored as metadata |
US20180197035A1 (en) | 2011-09-28 | 2018-07-12 | Fotonation Cayman Limited | Systems and Methods for Encoding Image Files Containing Depth Maps Stored as Metadata |
US9536166B2 (en) | 2011-09-28 | 2017-01-03 | Kip Peli P1 Lp | Systems and methods for decoding image files containing depth maps stored as metadata |
US10019816B2 (en) | 2011-09-28 | 2018-07-10 | Fotonation Cayman Limited | Systems and methods for decoding image files containing depth maps stored as metadata |
US10984276B2 (en) | 2011-09-28 | 2021-04-20 | Fotonation Limited | Systems and methods for encoding image files containing depth maps stored as metadata |
US9412206B2 (en) | 2012-02-21 | 2016-08-09 | Pelican Imaging Corporation | Systems and methods for the manipulation of captured light field image data |
US10311649B2 (en) | 2012-02-21 | 2019-06-04 | Fotonation Limited | Systems and method for performing depth based image editing |
US9754422B2 (en) | 2012-02-21 | 2017-09-05 | Fotonation Cayman Limited | Systems and method for performing depth based image editing |
US9210392B2 (en) | 2012-05-01 | 2015-12-08 | Pelican Imaging Corporation | Camera modules patterned with pi filter groups |
US9706132B2 (en) | 2012-05-01 | 2017-07-11 | Fotonation Cayman Limited | Camera modules patterned with pi filter groups |
US10552947B2 (en) | 2012-06-26 | 2020-02-04 | Google Llc | Depth-based image blurring |
US9100635B2 (en) | 2012-06-28 | 2015-08-04 | Pelican Imaging Corporation | Systems and methods for detecting defective camera arrays and optic arrays |
US9807382B2 (en) | 2012-06-28 | 2017-10-31 | Fotonation Cayman Limited | Systems and methods for detecting defective camera arrays and optic arrays |
US10334241B2 (en) | 2012-06-28 | 2019-06-25 | Fotonation Limited | Systems and methods for detecting defective camera arrays and optic arrays |
US10261219B2 (en) | 2012-06-30 | 2019-04-16 | Fotonation Limited | Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors |
US11022725B2 (en) | 2012-06-30 | 2021-06-01 | Fotonation Limited | Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors |
US9766380B2 (en) | 2012-06-30 | 2017-09-19 | Fotonation Cayman Limited | Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors |
US9129377B2 (en) | 2012-08-21 | 2015-09-08 | Pelican Imaging Corporation | Systems and methods for measuring depth based upon occlusion patterns in images |
US10380752B2 (en) | 2012-08-21 | 2019-08-13 | Fotonation Limited | Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints |
US12002233B2 (en) | 2012-08-21 | 2024-06-04 | Adeia Imaging Llc | Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints |
US9123118B2 (en) | 2012-08-21 | 2015-09-01 | Pelican Imaging Corporation | System and methods for measuring depth using an array camera employing a bayer filter |
US9235900B2 (en) | 2012-08-21 | 2016-01-12 | Pelican Imaging Corporation | Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints |
US9858673B2 (en) | 2012-08-21 | 2018-01-02 | Fotonation Cayman Limited | Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints |
US9123117B2 (en) | 2012-08-21 | 2015-09-01 | Pelican Imaging Corporation | Systems and methods for generating depth maps and corresponding confidence maps indicating depth estimation reliability |
US9147254B2 (en) | 2012-08-21 | 2015-09-29 | Pelican Imaging Corporation | Systems and methods for measuring depth in the presence of occlusions using a subset of images |
US9240049B2 (en) | 2012-08-21 | 2016-01-19 | Pelican Imaging Corporation | Systems and methods for measuring depth using an array of independently controllable cameras |
US9813616B2 (en) | 2012-08-23 | 2017-11-07 | Fotonation Cayman Limited | Feature based high resolution motion estimation from low resolution images captured using an array source |
US10462362B2 (en) | 2012-08-23 | 2019-10-29 | Fotonation Limited | Feature based high resolution motion estimation from low resolution images captured using an array source |
US9214013B2 (en) | 2012-09-14 | 2015-12-15 | Pelican Imaging Corporation | Systems and methods for correcting user identified artifacts in light field images |
US10390005B2 (en) | 2012-09-28 | 2019-08-20 | Fotonation Limited | Generating images from light fields utilizing virtual viewpoints |
US9143711B2 (en) | 2012-11-13 | 2015-09-22 | Pelican Imaging Corporation | Systems and methods for array camera focal plane control |
US9749568B2 (en) | 2012-11-13 | 2017-08-29 | Fotonation Cayman Limited | Systems and methods for array camera focal plane control |
US9001226B1 (en) * | 2012-12-04 | 2015-04-07 | Lytro, Inc. | Capturing and relighting images using multiple devices |
US10009538B2 (en) | 2013-02-21 | 2018-06-26 | Fotonation Cayman Limited | Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information |
US9462164B2 (en) | 2013-02-21 | 2016-10-04 | Pelican Imaging Corporation | Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information |
US9774831B2 (en) | 2013-02-24 | 2017-09-26 | Fotonation Cayman Limited | Thin form factor computational array cameras and modular array cameras |
US9253380B2 (en) | 2013-02-24 | 2016-02-02 | Pelican Imaging Corporation | Thin form factor computational array cameras and modular array cameras |
US9374512B2 (en) | 2013-02-24 | 2016-06-21 | Pelican Imaging Corporation | Thin form factor computational array cameras and modular array cameras |
US9743051B2 (en) | 2013-02-24 | 2017-08-22 | Fotonation Cayman Limited | Thin form factor computational array cameras and modular array cameras |
US9638883B1 (en) | 2013-03-04 | 2017-05-02 | Fotonation Cayman Limited | Passive alignment of array camera modules constructed from lens stack arrays and sensors based upon alignment information obtained during manufacture of array camera modules using an active alignment process |
US9774789B2 (en) | 2013-03-08 | 2017-09-26 | Fotonation Cayman Limited | Systems and methods for high dynamic range imaging using array cameras |
US9917998B2 (en) | 2013-03-08 | 2018-03-13 | Fotonation Cayman Limited | Systems and methods for measuring scene information while capturing images using array cameras |
US11985293B2 (en) | 2013-03-10 | 2024-05-14 | Adeia Imaging Llc | System and methods for calibration of an array camera |
US11272161B2 (en) | 2013-03-10 | 2022-03-08 | Fotonation Limited | System and methods for calibration of an array camera |
US9124864B2 (en) * | 2013-03-10 | 2015-09-01 | Pelican Imaging Corporation | System and methods for calibration of an array camera |
US10560684B2 (en) * | 2013-03-10 | 2020-02-11 | Fotonation Limited | System and methods for calibration of an array camera |
US20200252597A1 (en) * | 2013-03-10 | 2020-08-06 | Fotonation Limited | System and Methods for Calibration of an Array Camera |
US10958892B2 (en) * | 2013-03-10 | 2021-03-23 | Fotonation Limited | System and methods for calibration of an array camera |
US9986224B2 (en) | 2013-03-10 | 2018-05-29 | Fotonation Cayman Limited | System and methods for calibration of an array camera |
US10225543B2 (en) * | 2013-03-10 | 2019-03-05 | Fotonation Limited | System and methods for calibration of an array camera |
US20220239890A1 (en) * | 2013-03-10 | 2022-07-28 | Fotonation Limited | System and Methods for Calibration of an Array Camera |
US8866912B2 (en) * | 2013-03-10 | 2014-10-21 | Pelican Imaging Corporation | System and methods for calibration of an array camera using a single captured image |
US20150035992A1 (en) * | 2013-03-10 | 2015-02-05 | Pelican Imaging Corporation | System and Methods for Calibration of an Array Camera |
US20240348765A1 (en) * | 2013-03-10 | 2024-10-17 | Adeia Imaging Llc | System and Methods for Calibration of an Array Camera |
US11570423B2 (en) * | 2013-03-10 | 2023-01-31 | Adeia Imaging Llc | System and methods for calibration of an array camera |
US9521416B1 (en) | 2013-03-11 | 2016-12-13 | Kip Peli P1 Lp | Systems and methods for image data compression |
US9733486B2 (en) | 2013-03-13 | 2017-08-15 | Fotonation Cayman Limited | Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing |
US9888194B2 (en) | 2013-03-13 | 2018-02-06 | Fotonation Cayman Limited | Array camera architecture implementing quantum film image sensors |
US9741118B2 (en) | 2013-03-13 | 2017-08-22 | Fotonation Cayman Limited | System and methods for calibration of an array camera |
US9124831B2 (en) | 2013-03-13 | 2015-09-01 | Pelican Imaging Corporation | System and methods for calibration of an array camera |
US9800856B2 (en) | 2013-03-13 | 2017-10-24 | Fotonation Cayman Limited | Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies |
US9106784B2 (en) | 2013-03-13 | 2015-08-11 | Pelican Imaging Corporation | Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing |
US10127682B2 (en) | 2013-03-13 | 2018-11-13 | Fotonation Limited | System and methods for calibration of an array camera |
US9519972B2 (en) | 2013-03-13 | 2016-12-13 | Kip Peli P1 Lp | Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies |
US10412314B2 (en) | 2013-03-14 | 2019-09-10 | Fotonation Limited | Systems and methods for photometric normalization in array cameras |
US9100586B2 (en) | 2013-03-14 | 2015-08-04 | Pelican Imaging Corporation | Systems and methods for photometric normalization in array cameras |
US10547772B2 (en) | 2013-03-14 | 2020-01-28 | Fotonation Limited | Systems and methods for reducing motion blur in images or video in ultra low light with array cameras |
US10091405B2 (en) | 2013-03-14 | 2018-10-02 | Fotonation Cayman Limited | Systems and methods for reducing motion blur in images or video in ultra low light with array cameras |
US9787911B2 (en) | 2013-03-14 | 2017-10-10 | Fotonation Cayman Limited | Systems and methods for photometric normalization in array cameras |
US9578259B2 (en) | 2013-03-14 | 2017-02-21 | Fotonation Cayman Limited | Systems and methods for reducing motion blur in images or video in ultra low light with array cameras |
US9800859B2 (en) | 2013-03-15 | 2017-10-24 | Fotonation Cayman Limited | Systems and methods for estimating depth using stereo array cameras |
US9445003B1 (en) | 2013-03-15 | 2016-09-13 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information |
US9633442B2 (en) | 2013-03-15 | 2017-04-25 | Fotonation Cayman Limited | Array cameras including an array camera module augmented with a separate camera |
US9497429B2 (en) | 2013-03-15 | 2016-11-15 | Pelican Imaging Corporation | Extended color processing on pelican array cameras |
US10122993B2 (en) | 2013-03-15 | 2018-11-06 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
US10542208B2 (en) | 2013-03-15 | 2020-01-21 | Fotonation Limited | Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information |
US9438888B2 (en) | 2013-03-15 | 2016-09-06 | Pelican Imaging Corporation | Systems and methods for stereo imaging with camera arrays |
US10638099B2 (en) | 2013-03-15 | 2020-04-28 | Fotonation Limited | Extended color processing on pelican array cameras |
US10182216B2 (en) | 2013-03-15 | 2019-01-15 | Fotonation Limited | Extended color processing on pelican array cameras |
US10674138B2 (en) | 2013-03-15 | 2020-06-02 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
US10455218B2 (en) | 2013-03-15 | 2019-10-22 | Fotonation Limited | Systems and methods for estimating depth using stereo array cameras |
US9602805B2 (en) | 2013-03-15 | 2017-03-21 | Fotonation Cayman Limited | Systems and methods for estimating depth using ad hoc stereo array cameras |
US9955070B2 (en) | 2013-03-15 | 2018-04-24 | Fotonation Cayman Limited | Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information |
US9497370B2 (en) | 2013-03-15 | 2016-11-15 | Pelican Imaging Corporation | Array camera architecture implementing quantum dot color filters |
US10334151B2 (en) | 2013-04-22 | 2019-06-25 | Google Llc | Phase detection autofocus using subaperture images |
US9898856B2 (en) | 2013-09-27 | 2018-02-20 | Fotonation Cayman Limited | Systems and methods for depth-assisted perspective distortion correction |
US10540806B2 (en) | 2013-09-27 | 2020-01-21 | Fotonation Limited | Systems and methods for depth-assisted perspective distortion correction |
US9185276B2 (en) | 2013-11-07 | 2015-11-10 | Pelican Imaging Corporation | Methods of manufacturing array camera modules incorporating independently aligned lens stacks |
US9426343B2 (en) | 2013-11-07 | 2016-08-23 | Pelican Imaging Corporation | Array cameras incorporating independently aligned lens stacks |
US9924092B2 (en) | 2013-11-07 | 2018-03-20 | Fotonation Cayman Limited | Array cameras incorporating independently aligned lens stacks |
US9264592B2 (en) | 2013-11-07 | 2016-02-16 | Pelican Imaging Corporation | Array camera modules incorporating independently aligned lens stacks |
US11486698B2 (en) | 2013-11-18 | 2022-11-01 | Fotonation Limited | Systems and methods for estimating depth from projected texture using camera arrays |
US10767981B2 (en) | 2013-11-18 | 2020-09-08 | Fotonation Limited | Systems and methods for estimating depth from projected texture using camera arrays |
US10119808B2 (en) | 2013-11-18 | 2018-11-06 | Fotonation Limited | Systems and methods for estimating depth from projected texture using camera arrays |
US9426361B2 (en) | 2013-11-26 | 2016-08-23 | Pelican Imaging Corporation | Array camera configurations incorporating multiple constituent array cameras |
US9813617B2 (en) | 2013-11-26 | 2017-11-07 | Fotonation Cayman Limited | Array camera configurations incorporating constituent array cameras and constituent cameras |
US10708492B2 (en) | 2013-11-26 | 2020-07-07 | Fotonation Limited | Array camera configurations incorporating constituent array cameras and constituent cameras |
US9456134B2 (en) | 2013-11-26 | 2016-09-27 | Pelican Imaging Corporation | Array camera configurations incorporating constituent array cameras and constituent cameras |
US10089740B2 (en) | 2014-03-07 | 2018-10-02 | Fotonation Limited | System and methods for depth regularization and semiautomatic interactive matting using RGB-D images |
US10574905B2 (en) | 2014-03-07 | 2020-02-25 | Fotonation Limited | System and methods for depth regularization and semiautomatic interactive matting using RGB-D images |
US9247117B2 (en) | 2014-04-07 | 2016-01-26 | Pelican Imaging Corporation | Systems and methods for correcting for warpage of a sensor array in an array camera module by introducing warpage into a focal plane of a lens stack array |
US9521319B2 (en) | 2014-06-18 | 2016-12-13 | Pelican Imaging Corporation | Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor |
US11546576B2 (en) | 2014-09-29 | 2023-01-03 | Adeia Imaging Llc | Systems and methods for dynamic calibration of array cameras |
US10250871B2 (en) | 2014-09-29 | 2019-04-02 | Fotonation Limited | Systems and methods for dynamic calibration of array cameras |
US10275898B1 (en) | 2015-04-15 | 2019-04-30 | Google Llc | Wedge-based light-field video capture |
US10419737B2 (en) | 2015-04-15 | 2019-09-17 | Google Llc | Data structures and delivery methods for expediting virtual reality playback |
US10341632B2 (en) | 2015-04-15 | 2019-07-02 | Google Llc | Spatial random access enabled video system with a three-dimensional viewing volume |
US10546424B2 (en) | 2015-04-15 | 2020-01-28 | Google Llc | Layered content delivery for virtual and augmented reality experiences |
US10469873B2 (en) | 2015-04-15 | 2019-11-05 | Google Llc | Encoding and decoding virtual reality video |
US10540818B2 (en) | 2015-04-15 | 2020-01-21 | Google Llc | Stereo image generation and interactive playback |
US10412373B2 (en) | 2015-04-15 | 2019-09-10 | Google Llc | Image capture for virtual reality displays |
US11328446B2 (en) | 2015-04-15 | 2022-05-10 | Google Llc | Combining light-field data with active depth data for depth map generation |
US10567464B2 (en) | 2015-04-15 | 2020-02-18 | Google Llc | Video compression with adaptive view-dependent lighting removal |
US10565734B2 (en) | 2015-04-15 | 2020-02-18 | Google Llc | Video capture, processing, calibration, computational fiber artifact removal, and light-field pipeline |
US9942474B2 (en) | 2015-04-17 | 2018-04-10 | Fotonation Cayman Limited | Systems and methods for performing high speed video capture and depth estimation using array cameras |
US10205896B2 (en) | 2015-07-24 | 2019-02-12 | Google Llc | Automatic lens flare detection and correction for light-field images |
US20190347814A1 (en) * | 2016-06-02 | 2019-11-14 | Verily Life Sciences Llc | System and method for 3d scene reconstruction with dual complementary pattern illumination |
US10937179B2 (en) * | 2016-06-02 | 2021-03-02 | Verily Life Sciences Llc | System and method for 3D scene reconstruction with dual complementary pattern illumination |
US10275892B2 (en) | 2016-06-09 | 2019-04-30 | Google Llc | Multi-view scene segmentation and propagation |
US10679361B2 (en) | 2016-12-05 | 2020-06-09 | Google Llc | Multi-view rotoscope contour propagation |
US10594945B2 (en) | 2017-04-03 | 2020-03-17 | Google Llc | Generating dolly zoom effect using light field image data |
US10444931B2 (en) | 2017-05-09 | 2019-10-15 | Google Llc | Vantage generation and interactive playback |
US10440407B2 (en) | 2017-05-09 | 2019-10-08 | Google Llc | Adaptive control for immersive experience delivery |
US10474227B2 (en) | 2017-05-09 | 2019-11-12 | Google Llc | Generation of virtual reality with 6 degrees of freedom from limited viewer data |
US10354399B2 (en) | 2017-05-25 | 2019-07-16 | Google Llc | Multi-view back-projection to a light-field |
US11562498B2 (en) | 2017-08-21 | 2023-01-24 | Adeia Imaging LLC | Systems and methods for hybrid depth regularization |
US10818026B2 (en) | 2017-08-21 | 2020-10-27 | Fotonation Limited | Systems and methods for hybrid depth regularization |
US11983893B2 (en) | 2017-08-21 | 2024-05-14 | Adeia Imaging Llc | Systems and methods for hybrid depth regularization |
US10482618B2 (en) | 2017-08-21 | 2019-11-19 | Fotonation Limited | Systems and methods for hybrid depth regularization |
US10545215B2 (en) | 2017-09-13 | 2020-01-28 | Google Llc | 4D camera tracking and optical stabilization |
US10965862B2 (en) | 2018-01-18 | 2021-03-30 | Google Llc | Multi-camera navigation interface |
US11962946B2 (en) * | 2018-02-20 | 2024-04-16 | Canon Kabushiki Kaisha | Image processing apparatus, display system, image processing method, and medium |
US20200358992A1 (en) * | 2018-02-20 | 2020-11-12 | Canon Kabushiki Kaisha | Image processing apparatus, display system, image processing method, and medium |
US11270110B2 (en) | 2019-09-17 | 2022-03-08 | Boston Polarimetrics, Inc. | Systems and methods for surface modeling using polarization cues |
US11699273B2 (en) | 2019-09-17 | 2023-07-11 | Intrinsic Innovation Llc | Systems and methods for surface modeling using polarization cues |
US12099148B2 (en) | 2019-10-07 | 2024-09-24 | Intrinsic Innovation Llc | Systems and methods for surface normals sensing with polarization |
US11525906B2 (en) | 2019-10-07 | 2022-12-13 | Intrinsic Innovation Llc | Systems and methods for augmentation of sensor systems and imaging systems with polarization |
US11982775B2 (en) | 2019-10-07 | 2024-05-14 | Intrinsic Innovation Llc | Systems and methods for augmentation of sensor systems and imaging systems with polarization |
US20220405891A1 (en) * | 2019-11-25 | 2022-12-22 | Microsoft Technology Licensing, Llc | Image sharpening for subjects imaged through display |
US12125179B2 (en) * | 2019-11-25 | 2024-10-22 | Microsoft Technology Licensing, Llc | Image sharpening for subjects imaged through display |
US11842495B2 (en) | 2019-11-30 | 2023-12-12 | Intrinsic Innovation Llc | Systems and methods for transparent object segmentation using polarization cues |
US11302012B2 (en) | 2019-11-30 | 2022-04-12 | Boston Polarimetrics, Inc. | Systems and methods for transparent object segmentation using polarization cues |
US11580667B2 (en) | 2020-01-29 | 2023-02-14 | Intrinsic Innovation Llc | Systems and methods for characterizing object pose detection and measurement systems |
US11797863B2 (en) | 2020-01-30 | 2023-10-24 | Intrinsic Innovation Llc | Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images |
US11953700B2 (en) | 2020-05-27 | 2024-04-09 | Intrinsic Innovation Llc | Multi-aperture polarization optical systems using beam splitters |
US12069227B2 (en) | 2021-03-10 | 2024-08-20 | Intrinsic Innovation Llc | Multi-modal and multi-spectral stereo camera arrays |
US12020455B2 (en) | 2021-03-10 | 2024-06-25 | Intrinsic Innovation Llc | Systems and methods for high dynamic range image reconstruction |
US11290658B1 (en) | 2021-04-15 | 2022-03-29 | Boston Polarimetrics, Inc. | Systems and methods for camera exposure control |
US11683594B2 (en) | 2021-04-15 | 2023-06-20 | Intrinsic Innovation Llc | Systems and methods for camera exposure control |
US11954886B2 (en) | 2021-04-15 | 2024-04-09 | Intrinsic Innovation Llc | Systems and methods for six-degree of freedom pose estimation of deformable objects |
US12067746B2 (en) | 2021-05-07 | 2024-08-20 | Intrinsic Innovation Llc | Systems and methods for using computer vision to pick up small objects |
US12175741B2 (en) | 2021-06-22 | 2024-12-24 | Intrinsic Innovation Llc | Systems and methods for a vision guided end effector |
US12172310B2 (en) | 2021-06-29 | 2024-12-24 | Intrinsic Innovation Llc | Systems and methods for picking objects using 3-D geometry and segmentation |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
US12293535B2 (en) | 2021-08-03 | 2025-05-06 | Intrinsic Innovation Llc | Systems and methods for training pose estimators in computer vision |
Also Published As
Publication number | Publication date |
---|---|
US20110298960A1 (en) | 2011-12-08 |
US20100141780A1 (en) | 2010-06-10 |
US8508615B2 (en) | 2013-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8013904B2 (en) | View projection matrix based high performance low latency display pipeline | |
US8106949B2 (en) | Small memory footprint light transport matrix capture | |
US8310525B2 (en) | One-touch projector alignment for 3D stereo display | |
US8042954B2 (en) | Mosaicing of view projections | |
US8189957B2 (en) | View projection for dynamic configurations | |
US7901095B2 (en) | Resolution scalable view projection | |
US8218003B2 (en) | Optimization strategies for GPU view projection matrix implementation | |
US7901093B2 (en) | Modeling light transport in complex display systems | |
US8197070B2 (en) | Color-based feature identification | |
US10701324B2 (en) | Gestural control of visual projectors | |
Raskar et al. | A low-cost projector mosaic with fast registration | |
Bimber et al. | Embedded entertainment with smart projectors | |
US8201951B2 (en) | Catadioptric projectors | |
US9137504B2 (en) | System and method for projecting multiple image streams | |
US7800628B2 (en) | System and method for generating scale maps | |
US9052575B2 (en) | Determining correspondence mappings from infrared patterns projected during the projection of visual content | |
US7018050B2 (en) | System and method for correcting luminance non-uniformity of obliquely projected images | |
Harville et al. | Practical methods for geometric and photometric correction of tiled projector displays on curved surfaces | |
US20070291184A1 (en) | System and method for displaying images | |
WO2003017679A1 (en) | Multi-projector mosaic with automatic registration | |
JP2008171431A (en) | Method for generating an estimated inverse matrix of a reference matrix, method for simulating a first projection image from a first projector using a second projection image from a second projector, method for generating a projection image, and projection system | |
Majumder | Large format displays | |
Sajadi | Auto-registration techniques for immersive multi-projector displays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SEIKO EPSON CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONEHARA, TOMIO;REEL/FRAME:021950/0377 Effective date: 20081024 Owner name: EPSON RESEARCH AND DEVELOPMENT, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TAN, KAR-HAN;REEL/FRAME:021950/0389 Effective date: 20081203 |
|
AS | Assignment |
Owner name: SEIKO EPSON CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EPSON RESEARCH AND DEVELOPMENT, INC.;REEL/FRAME:022106/0888 Effective date: 20081211 |
|
ZAAA | Notice of allowance and fees due |
Free format text: ORIGINAL CODE: NOA |
|
ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20230906 |