US5699497A - Rendering global macro texture, for producing a dynamic image, as on computer generated terrain, seen from a moving viewpoint - Google Patents
- Publication number
- US5699497A (application US08/613,893)
- Authority
- US
- United States
- Prior art keywords
- texture
- terrain
- data
- levels
- viewpoint
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
Definitions
- the present invention relates to a system with improved memory organization for generating real time dynamic images on a display device representative of large-surface perspective scenes, such as the terrain beneath an aircraft.
- a computer image generator combines elevation and texture data to drive a display that reveals dynamic perspective images. While practical systems have been accomplished, a need continues to exist for improved systems of greater economy and capable of providing displays that are substantially void of unwanted artifacts. That is, in relation to the generation of dynamic terrain images, a need exists for a system for organizing and selectively paging portions of an extremely large continuous terrain map with a reasonably sized active memory while assuring that map data is available as needed.
- elevation and planimetric data for use by an image generator to provide a dynamic terrain display is organized as a rectangular-coordinate field, e.g., a height field of digitized grid post data.
- the height field may consist of elevation samples arrayed at coordinate points in a rectangular grid aligned with X-Y datum. Elevation values at coordinate X-Y points then manifest heights (Z) of the terrain or skin.
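- purely as an illustration of that organization (a sketch with assumed names, dimensions and grid spacing, not drawn from the patent), such a height field can be held as a two-dimensional array of Z samples indexed by X-Y grid posts:

```c
/* Hypothetical height-field layout: Z samples on a regular X-Y grid.
   Grid dimensions and spacing are illustrative assumptions only. */
#define GRID_NX 512
#define GRID_NY 512

typedef struct {
    float spacing;                 /* distance between grid posts, e.g. in feet */
    float z[GRID_NY][GRID_NX];     /* elevation (Z) at each X-Y grid post */
} HeightField;

/* Elevation at integer grid post (ix, iy); callers clamp to the modeled range. */
static float height_at(const HeightField *hf, int ix, int iy)
{
    return hf->z[iy][ix];
}
```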
- With a terrain representation so defined, it can be processed to obtain data for individual picture elements (pixels) of a display in relation to a selected viewpoint.
- the data can be computed in eye space to generate a basic form that is textured, shaded and scan converted for the display.
- anti-aliasing (getting rid of the unwanted artifacts created by the projection process) is also involved.
- texturing has been accomplished utilizing MIP maps of different texture resolutions for the same area.
- While traditional MIP map techniques are effective for texturing certain objects, a difficulty arises in attempting to texture terrain polygons. That is, because terrain is a continuous surface made up of many relatively small polygons, the requirement is for either an extremely large number of texture maps or one extremely large texture map. Small maps create problems in matching the texture across all the boundaries. Generally, in accordance herewith, an extremely large continuous terrain texture map is organized and paged in such a manner as to require only a reasonable amount of active memory and to assure the availability of texture data. In an unconventional format, as described in detail below, for each projected area of an image, depending on distance from the viewpoint, forms of MIP maps are selected and interpolated to derive the intensity and color components.
- the system hereof involves utilizing active memory to contain select portions of terrain data as needed for creating an image from a current eye point. It is recognized that only the areas surrounding the eye point can have any effect on the viewed image, and in that regard, the highest level texture map (the map with the most detail) need only be available for the immediate area. As the projected pixel size increases relative to the projected map texture element (texel) size, lower level maps are used.
- the active portion of each level map, that is, the parts that may be needed at any instant, may be the same size. Accordingly, with increasing distance from the viewpoint, a larger area is embraced by texels but less detail is provided. Thus, substantially the same amount of data is involved for each map level.
- a series of texture maps is maintained available to represent various distance levels, each of which has the same number of texels but embraces an area of a different size with a compensating reduction in detail.
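- a minimal sketch of such a series of level maps follows; the square, viewpoint-centered coverage, the 1024-texel side, and the names are assumptions used only to make the doubling relationship concrete:

```c
/* Hypothetical descriptors for the active portion of each texture level map.
   Every level keeps the same texel count; the ground footprint of a texel
   doubles from one level to the next, so each level reaches twice as far. */
#define LEVEL_TEXELS   1024          /* texels per side at every level (assumption) */
#define BASE_RANGE_FT  1000.0f       /* level-1 reach, per the example in the text */

typedef struct {
    int   level;                     /* 1 = most detailed */
    float texel_size_ft;             /* ground size of one texel */
    float reach_ft;                  /* distance from the viewpoint this level serves */
} LevelMapInfo;

static LevelMapInfo level_info(int level)
{
    LevelMapInfo m;
    m.level = level;
    m.reach_ft = BASE_RANGE_FT * (float)(1 << (level - 1));  /* 1000, 2000, 4000, ... */
    m.texel_size_ft = 2.0f * m.reach_ft / LEVEL_TEXELS;      /* same texel count per level */
    return m;
}
```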
- FIG. 1 is a diagrammatic illustration of a height field related to a viewpoint and texture mapping;
- FIG. 2 is a vertical section diagram illustrating a terrain profile in relation to a viewpoint for graphics display;
- FIG. 3 is a graphic representation of sample picture element (pixel) patterns as occur in accordance herewith;
- FIG. 4 is a diagrammatic vertical sectional view illustrating texture mapping in relation to distance in accordance herewith;
- FIG. 5 is a perspective view illustrating multiple exemplary texture maps as used herein;
- FIG. 6 is a top plan view of the maps of FIG. 5;
- FIG. 7 is a diagrammatic perspective view of components of the maps of FIG. 5;
- FIG. 8 is an enlarged circular fragment of FIG. 6, illustrating exemplary texture mapping operations in accordance herewith;
- FIG. 9 is a chart illustrating exemplary data paging into active memory in accordance herewith.
- FIG. 10 is a block diagram of a system in accordance herewith for executing the operations as illustrated in the above-mentioned FIGURES.
- a significant aspect of the system of the present invention is based on recognizing that it is possible to predict or anticipate those portions of an extensive terrain texturing map that will be needed for creating an image from any specific viewpoint or eye point. Accordingly, the disclosed system exemplifies creating a terrain display from a digitized grid post and texturing the display using a plurality of prepared maps (somewhat distinct from conventional MIP maps) of different resolution for the same area. Some basic considerations of the graphics processing will be presented initially.
- Referring initially to FIG. 1, a rectangular fragment of a height field or digitized grid post is shown in a cube 12 and exemplified by a single vertical line 10 extending from a reference grid 14 to a skin 16 representing a continuous terrain or surface.
- the grid 14 is designated in X (east) and Y (north) coordinate dimensions, with the height dimension indicated in terms of Z. It is to be understood that FIG. 1 is a graphic illustration, not to scale or in proportion, but rather distorted and fragmentary in the interests of simplification and explanation.
- While the fragment 12 showing the skin 16 is illustrative, it is to be understood that height field data will exist from an eye point or viewpoint 20 out in all directions to the extent of the modeled range. However, the illustrated fragment 12 serves to illustrate the manner in which a represented texture map 18 is projected onto the skin 16 to attain an enhanced and effective image display.
- the skin 16 is displayed in relation to the selected viewpoint 20, a rectangular component of the view being represented by a pyramid of vision 22.
- systems are well known in the art wherein texture from a map, e.g., the map 18, is projected onto a skin, as the skin 16, to accomplish a realistic display.
- the viewpoint 20 is selected and data is processed to provide a dynamic image showing a changing terrain for the moving viewpoint 20, as would appear from an aircraft.
- rays extend from the viewpoint 20 to the skin 16 to define individual picture elements (pixels).
- the pixels are processed to impart form, texture and shading in a composite display. Such operations are disclosed in U.S. patent application Ser. No. 08/137,907 entitled DIRECT RENDERING OF TEXTURED HEIGHT FIELDS, Michael A. Cosman, hereby incorporated by reference herein.
- the area of the terrain or skin 16 embraced by a pixel varies with remoteness from the viewpoint 20.
- scales and size relationships are somewhat disregarded in FIG. 1; however, at opposed corners of a pyramid of vision 22, pixel areas 24 and 26 are represented impinging on the terrain or skin 16 at significantly different distances from the viewpoint 20.
- the pixel area 24 is much closer to the viewpoint 20 than the pixel area 26. Accordingly, it will be apparent that the area 24 is smaller than the area 26.
- pixels representative of areas on the skin 16 will embrace larger areas with remoteness from the viewpoint 20. That is, the size of an area represented by a pixel will vary with the distance from the viewpoint 20 to the terrain skin 16.
- a related consideration involves texturing the terrain areas with the texture map 18.
- Generally, detailed texture elements 27 (texels) of the texture map 18 are not appropriate for pixel areas of all ranges. That is, as indicated above, one detailed texture map is not appropriate for pixels of all sizes (embracing areas of different size). Consider the texture map 18 in greater detail.
- the texture data, represented by the map 18, consists of an array of data that describes the surface (intensity and color) of the skin 16 at regular positions of the grid 14 in terms of X-Y coordinates.
- the map 18 is much like the height field defining the skin 16, and some processes applicable to the height field also may be used to process the texture map.
- the purpose of the texture map 18 is to "paint" the skin 16 in detail to be representative of a terrain in the display. Accordingly, as indicated, it is desirable that the texture space be of higher resolution than the height field. As explained above, with progressive remoteness from the viewpoint 20, it is appropriate to sacrifice detail in the texture as the embraced area of pixels grows.
- Such operations have been implemented using techniques of so-called “MIP” maps.
- MIP stands for "multum in parvo", Latin for "many things in a small space". Essentially, reference is to the fact that each simpler, coarser representation of the data for texturing a more remote area is the result of filtering the previous matrix of data to eliminate changes that cannot be represented at the coarser resolution.
- the filtering process averages or smoothes the high-frequency detail to derive a more slowly varying representation that can be adequately represented with fewer samples.
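- one common filtering choice (an assumption here; the patent does not prescribe a particular filter) is a simple box average, combining each 2×2 block of the finer level into one texel of the next coarser level:

```c
/* Build one coarser MIP level by averaging 2x2 blocks of the finer level.
   'src' is size x size samples; 'dst' must hold (size/2) x (size/2) samples.
   Single-channel intensity only, for brevity. */
static void mip_downsample(const float *src, float *dst, int size)
{
    int half = size / 2;
    for (int y = 0; y < half; ++y) {
        for (int x = 0; x < half; ++x) {
            const float *row0 = src + (2 * y) * size + 2 * x;  /* upper pair */
            const float *row1 = row0 + size;                   /* lower pair */
            dst[y * half + x] = 0.25f * (row0[0] + row0[1] + row1[0] + row1[1]);
        }
    }
}
```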
- a detailed treatment of MIP strategy and its implementation is disclosed in a publication entitled Computer Graphics, Volume 17, No. 3, Jul. 1983, specifically in an article by Lance Williams entitled "Pyramidal Parametrics".
- MIP map techniques involve storing a number of maps, each of a different resolution for the same area. For each projected pixel area of an image, depending on its remoteness from the viewpoint, an interpolation is made from the maps to derive intensity and color components. As the projected pixel size increases relative to the projected map size, lower level, that is, less detailed, maps are used. However, difficulty arises in utilizing the conventional techniques for applying texture to terrain polygons.
- texture MIP maps are loaded, map by map, into active working memory as they are needed. Because terrain is a continuous surface, and is treated as many relatively small polygons, the operation requires either an extremely large number of basic texture maps or one extremely large map. In general, as disclosed below, the present system organizes and pages MIP maps constituting portions of an extremely large continuous terrain map so as to require only a moderate amount of active memory and to assure that the map portions needed are available at any time.
- the system is based on the realization that it is possible to anticipate and select those portions of a terrain texture map that are needed for creating the image from a given eye point. Essentially, only those areas surrounding the viewpoint have any effect on the image.
- the highest level MIP map (the map with the most detail) need only be available for the area out to, say, one thousand feet.
- the next level map embraces a distance out to two thousand feet, and the next level map to four thousand feet and so on. The reason is that as the projected pixel size increases relative to the projected map texel size, lower level MIP maps are used.
- rays are extended from a viewpoint to the height field skin 16 for defining individual pixels.
- view triangles 28 and 29 illustrate the projection embraced by a pair of rays from the viewpoint 20 to the terrain skin 16.
- the bases 30 and 31 of the triangles 28 and 29, respectively, indicate the pixel areas to be represented. Note, as explained above, that the base 31 is larger than the base 30 as a result of its greater distance from the viewpoint 20.
- the range from the eye point to the surface establishes an "upper limit" on the texture level map to be used. As the orientation of the surface becomes more oblique, a lower level of texture map will be used. That is, the appropriate level of the texture map is determined by the ratio of the projected pixel size to the projected texel size. This ratio is a function of both range from the eye point and the surface orientation to the eye point.
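- expressed as a sketch (the names, clamping behavior, and fractional-level treatment are assumptions), the ratio test amounts to taking the base-2 logarithm of the projected-pixel to base-texel size ratio, with a fractional result so the two bracketing maps can be blended:

```c
#include <math.h>

/* Choose a texture level from the ratio of projected pixel size to the size
   of a level-1 texel.  Returns a fractional level so that the fraction can
   drive interpolation between the two bracketing maps. */
static float texture_level(float projected_pixel_size_ft,
                           float base_texel_size_ft,   /* level-1 texel footprint */
                           int   max_level)
{
    float ratio = projected_pixel_size_ft / base_texel_size_ft;
    float level = 1.0f + log2f(ratio > 1.0f ? ratio : 1.0f);  /* ratio 1 -> 1, 2 -> 2, 4 -> 3 ... */
    if (level > (float)max_level)
        level = (float)max_level;
    return level;
}
```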
- FIG. 3 shows a sectional image-space view through sampling rays 33 and 35.
- FIG. 3 is somewhat a sectional view through rectangular patterns 37 and 38 of rays 33 and 35 as would be centered in the projection triangles 28 and 29, respectively. Accordingly, the spacing of the patterns 37 and 38 illustrates the enlarged area embraced by pixels as distance increases from a viewpoint.
- the skin 16 is shown as a profile from a different angle to illustrate vertical extensions of view triangles 28 and 29 from the viewpoint 20.
- the triangles 28 and 29 are shown to extend vertically to simply illustrate an exemplary positioning of map levels L1, L2, L3 and L4 for texture maps of varying detail.
- the map L1 is displaced from the viewpoint 20 by a distance of one thousand feet. Beyond the map level L1, levels are displaced as follows: level L2--two thousand feet, level L3--four thousand feet, level L4--eight thousand feet and so on.
- FIGS. 5 and 6 illustrate rectangular arrays of texels for maps of levels L1-L4 (FIG. 4). It is to be understood that the basic height field and texture database (in disk storage) are very extensive; however, only portions of each are carried in active memory, as described below.
- FIGS. 5 and 6 illustrate representative level maps L1-L4 that are simplified and compromised for purposes of explanation. Though greatly reduced in the number of texels represented, the drawings can be thought of as showing active portions of level maps L1-L4 currently contained in active memory.
- each of the first twelve layers contains approximately one million texels (1024×1024).
- the remaining level maps contain fewer texels, as described below. As the viewpoint moves, data is paged in to replace rows and/or columns in the grid maps.
- a ray 41 is shown extending from the viewpoint 20 through the maps L1 and L2 to intersect the profile of terrain 16 at a point 43.
- the point 43 specifies a pixel area that is to be textured by interpolating as between the maps L2 and L3, the map levels being derived and stored from the texture database fragmentarily represented by the texture map 18 (FIG. 1).
- polygons are marked for global texture and, on the basis of the X-Y coordinates of their vertices, texture is mapped from conventionally identified U, V coordinates.
- texture is applied to the polygons on a pixel-by-pixel basis to accomplish data for a display.
- the operation of the present system involves interpolation between texels of the map levels L2 and L3.
- the ray 41 pierces the texels 54 and 56.
- the texel 56 embraces a substantially larger area than the texel 54; however, note that the texel 54 is more detailed. Accordingly, as indicated above, the texels 54 and 56 may contain substantially similar amounts of data.
- the consequence of processing the terrain intersection point 52 is an interpolation involving the four surrounding coordinate corners or points of the texels 54 and 56 at the two map levels, specifically points 58, 59, 60 and 61 of the texel 54 in map L2 and points 62, 63, 64 and 65 of the texel 56 in map L3.
- Interpolation (usually but not necessarily linear, as well known in the art) is used to calculate a texture value from the eight surrounding values of the points 58-65. Accordingly, a value (intensity and color) is determined for texturing the pixel identified by the impact point 52.
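- a sketch of that calculation under the usual linear assumption (bilinear weighting within each map, then a linear blend between the two levels; names and parameter layout are illustrative):

```c
/* Bilinear interpolation of the four corner values of one texel.  fx, fy are
   the fractional position of the terrain intersection within that texel (0..1). */
static float bilerp(float v00, float v10, float v01, float v11, float fx, float fy)
{
    float a = v00 + (v10 - v00) * fx;   /* along one texel edge */
    float b = v01 + (v11 - v01) * fx;   /* along the opposite edge */
    return a + (b - a) * fy;
}

/* Blend the values obtained from the two bracketing map levels (e.g. L2 and L3).
   'level_frac' is the fractional part of the selected texture level. */
static float texture_value(const float finer[4], const float coarser[4],
                           float fx_fine, float fy_fine,
                           float fx_coarse, float fy_coarse,
                           float level_frac)
{
    float f = bilerp(finer[0],   finer[1],   finer[2],   finer[3],   fx_fine,   fy_fine);
    float c = bilerp(coarser[0], coarser[1], coarser[2], coarser[3], fx_coarse, fy_coarse);
    return f + (c - f) * level_frac;    /* eight surrounding values -> one result */
}
```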
- FIG. 8 illustrates the demarcations.
- the map L1 (highest level of detail) is applied to the extent of a range R1 as illustrated by a vector. Beyond the range R1 to a range R2, transition occurs from the map L1 to the map L2 utilizing interpolation techniques. As illustrated, a lower level of detail is then introduced between the ranges R2 and R3.
- the ranges R1, R2 and R3 define annular rings A1, A2, A3 that are aligned within the maps L1 and L2.
- the corners of the square maps L1-Ln complete grid arrays for the convenience of paging data into active memory as will be explained in detail below. Note that the corner data is used if the eye point moves such that the data falls within the annular ring.
- texture maps L1-Ln are carried in active memory for a terrain.
- the different level maps are maintained in active memory for an area about the viewpoint 20.
- the viewpoint 20 may be exemplified as an aircraft in flight.
- maps are maintained indicative of the terrain currently visible beneath the aircraft.
- the active terrain texture maps reflect the terrain (represented by the height field data) beneath the aircraft.
- relative height field data also is paged into active memory for processing in terms of polygons as well known in the prior art.
- the texture data is far more dense than the height field data; consequently, the demands attendant to texture data are far greater than for height field data. In accordance herewith, various arrangements can be utilized.
- height field data can be paged into active memory using the same techniques as described below for paging texture map data, or various known paging techniques can be employed for the height field data in view of the lesser volume.
- texture data is paged by columns and rows of texels into arrays defined in active memory to be available as needed.
- it is possible to predict which portions of the terrain map are needed for creating the image from any given viewpoint.
- the eye point or viewpoint 20 may take the form of an aircraft traversing a terrain represented by the height field data (skin 16, FIG. 1) and correlated to the level maps L1-Ln (FIG. 5).
- the system maintains a traveling envelope of active data in memory including height field data and terrain texture data in the form of level maps L1-Ln.
- the system of paging data into memory for maps L1-Ln in accordance with movement of the viewpoint 20 will now be considered with reference to FIG. 9.
- the viewpoint 20 (FIG. 5) is symbolized in plan view with respect to the level map L1 by an aircraft silhouette 70 (FIG. 9). Specifically, two representations of the map L1 are shown, designated at times T1 and T2, reflecting two different points in the travel of the aircraft 70 over the coordinate grid. Note that the active level map L1 is simplified indicating only an eight-by-eight grid of sixty-four texels.
- the representations at times T1 and T2 illustrate the contents of active memory for the map L1 at two distinct times, corresponding to specific positions of the viewpoint 20 symbolized by the aircraft 70.
- Texels in maps represented at the times T1 and T2 are designated by column and row, specifically, columns C12 through C19 and rows R1 through R8.
- the representation of time T1 locates the aircraft 70, traveling to the right in row R4, at column C15. Accordingly, in the representation of time T1, a forward view from the aircraft 70 encompasses the texels located in columns C16, C17, C18 and C19. Consequently, those texels are utilized to map the terrain as explained and illustrated above.
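- one way to realize that replacement in code is wrap-around (modular) column indexing over a fixed-size array, so advancing one column requires writing only the single new column; the modular addressing is an assumption of this sketch, not something stated in the text, but it matches the row/column replacement described above:

```c
/* Fixed-size active window of one level map, paged by whole columns as the
   viewpoint advances eastward.  Wrap-around indexing is an illustrative choice. */
#define ACTIVE_COLS 8
#define ACTIVE_ROWS 8

typedef struct {
    float texel[ACTIVE_ROWS][ACTIVE_COLS];
    int   west_column;            /* world column index of the oldest column held */
} ActiveLevelMap;

/* Called when the viewpoint advances one column: the newly needed column
   (world column west_column + ACTIVE_COLS) overwrites the storage slot of the
   column that just fell behind (world column west_column). */
static void page_in_next_column(ActiveLevelMap *m,
                                const float new_column[ACTIVE_ROWS])
{
    int slot = m->west_column % ACTIVE_COLS;   /* old west and new east share this slot */
    for (int row = 0; row < ACTIVE_ROWS; ++row)
        m->texel[row][slot] = new_column[row];
    m->west_column += 1;                       /* window now starts one column further on */
}
```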
- data for each of the other level maps L1-Ln are similarly available, and, as indicated above, a substantially greater number of texels likely would be present in an operating system.
- because the next-to-highest level map has texels representing twice the area of those in the highest level map, it need be paged only half as often.
- the third level map need be paged only one-fourth as often as the highest level map and thus all levels can be paged in twice the time required for the highest level map only. Consequently, ordinary disk bandwidths are fast enough to move through the basic database accommodating travel of a very fast aircraft.
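- the factor of two follows from the geometric series 1 + 1/2 + 1/4 + ...; a small sketch (illustrative only) that tallies the paging cost of all levels relative to the highest level alone:

```c
/* Total paging work relative to the finest level alone: level k is paged
   1/2^k as often as level 0, so the sum approaches 2 as levels increase. */
static float relative_paging_cost(int levels)
{
    float total = 0.0f;
    for (int k = 0; k < levels; ++k)
        total += 1.0f / (float)(1 << k);   /* 1 + 1/2 + 1/4 + ... */
    return total;                          /* -> 2.0 for many levels */
}
```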
- the paging operation will now be considered in further detail.
- Referring to FIG. 10, a comprehensive height-field database (FIG. 1) is contained in a storage 102 (left, FIG. 10) while a texture database is contained in a storage 104.
- the storage 102 contains an extensive height field defining an extremely large area of terrain.
- the texture data may take the form of texture maps as explained above or may involve theme data accommodating the use of theme cells to construct texture data in accordance with previously proposed systems.
- a theme cell texturing system is disclosed in U.S. patent application Ser. No. 08/088,447 entitled IMAGE TEXTURING SYSTEM HAVING THEME CELLS, Michael Cosman and Thomas Brown, hereby incorporated by reference. Essentially, in theme cell texturing, texture is synthesized based on characteristics and generic terrain.
- the storage units 102 and 104 may be variously implemented and alternatively, a single storage structure or disk may be utilized.
- height field data and texture data of current interest (depending on eye point location) is selectively paged into active memory within an image generator 106. Accordingly, data is kept accessible for a moving eye point to maintain a dynamic display.
- the image generator 106 may take a multitude of different forms; however, for the disclosed embodiment, a front end processor 108 and a pixel processor 110 are illustrated. The division is imposed for purposes of illustration and it is to be understood that a single processor or various other arrangements could accomplish both operations.
- the front end processor 108 accomplishes the traversal of the database, performs field of view culls, level of detail determination and geometric transformations. Such operations are well known in the field of computer graphics, for example, see the book Fundamentals of Interactive Computer Graphics, Foley and Van Dam, Addison-Wesley Publishing Co., 1984, also, Principles of Interactive Computer Graphics, Newman and Sproull, McGraw-Hill Book Company, 1979. In the final analysis, the front end processor 108 generates the terrain skin from height field data.
- the pixel processor 110 (back end of the image generator) performs the occultation, and the intensity/color calculations for each pixel of the display.
- the texturing of features and terrain is part of the process.
- an important consideration in accordance herewith, is that texture can be applied to any non-vertical surface, e.g., building tops, airport runways, parking lots and so on. The operation is not limited to surface terrain.
- height field data is fetched from the storage 102 by a data pager 112 for placement in an active memory 114 of the front end processor 108.
- the pager 112 is controlled by the front end processor 108 to maintain data of current interest available to the operating subsystems.
- an active memory 114 is maintained and utilized by a control 116, coupled through a bus 118 to a front end processing unit 120, a terrain generation unit 122 and a geometric transformation structure 124. Signals indicative of the accomplished transformations are supplied from the front end processor 108 to the pixel processor 110 for further processing.
- Texture data of current interest is fetched from the storage 104 by a data pager 126 under control of the pixel processor 110. Specifically, a control 128 actuates the data pager 126 to maintain texture data of current interest in an active texture memory 130 organized and maintained as explained with reference to FIGS. 5, 6 and 9. So organized, the memory 130 is coupled through a bus 132 to a pixel intensity calculation unit 134. Processed pixels comprising a data image are developed in the frame buffer 136 utilizing processes as well known in the art. Pixel data from the frame buffer 136 provides an output image to a display unit 138.
- a viewpoint 20 (FIGS. 2 and 3) is maintained in the front end processor 108 of the image generator 106. From the stored viewpoint 20, projected rays define pixels for a desired image. Accordingly, terrain form is computed from the height field (FIG. 1). Height field data for such calculations are preserved in the active memory 114 as explained above. With the terrain representations developed, appropriate signals are supplied to the pixel processor 110 (FIG. 10) for pixel calculations including texturing operations. As explained above, texture data is maintained in the active memory 130 by the data pager 126 writing fresh column or row data as illustrated with reference to FIG. 9. Accordingly, data of current interest is preserved in active memory.
- the maintenance of the active memory 130 in accordance herewith and as described above affords an effective basis for texturing global terrains. Maintaining current MIP-type maps in the active memory 130, as illustrated in FIGS. 5 and 6, is effective, as illustrated in FIG. 9.
- the texture maps as described herein, for different levels, include texels of increasingly greater area. As stressed above, moving away from the viewpoint, the texels progressively embrace larger areas but are less detailed. Thus, the volume of data is somewhat balanced.
- texture data is interpolated as explained with reference to FIG. 7, to accomplish the final color and intensity for each pixel of a display as accumulated in the frame buffer 136. Accordingly, frame-by-frame, composite images are completed in sequence for a dynamic display by the unit 138.
- texture data may not always be available at every level for a given terrain. In instances where higher levels of detail are not available, lower levels may be adapted to accomplish satisfactory images.
- theme cell operation may be employed as disclosed in detail in the above-referenced patent application, IMAGE TEXTURING SYSTEM HAVING THEME CELLS. Accordingly, a combination of techniques may be employed combining theme cells with terrain data to accomplish a composite display.
- a system may well utilize either one or many processors configured in a parallel mode where each of the processors does all of the tasks for a part of the scene or a part of the database.
- the database storage may be variously contained and arranged as may the pagers. Accordingly, in view of the possibilities, the scope hereof should be determined with reference to the claims as set out below.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Image Generation (AREA)
Description
CHART 1

VIEWPOINT (AIRCRAFT 70) LOCATION (COLUMN) | FORWARD VIEW TEXELS
---|---
C15 | C16, C17, C18, C19
C16 | C17, C18, C19, C20
C17 | C18, C19, C20, C21
C18 | C19, C20, C21, C22
C19 | C20, C21, C22, C23
C20 | C21, C22, C23, C24
Claims (14)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/613,893 US5699497A (en) | 1994-02-17 | 1996-03-11 | Rendering global macro texture, for producing a dynamic image, as on computer generated terrain, seen from a moving viewpoint |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US19795794A | 1994-02-17 | 1994-02-17 | |
US08/613,893 US5699497A (en) | 1994-02-17 | 1996-03-11 | Rendering global macro texture, for producing a dynamic image, as on computer generated terrain, seen from a moving viewpoint |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US19795794A Continuation | 1994-02-17 | 1994-02-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
US5699497A true US5699497A (en) | 1997-12-16 |
Family
ID=22731430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/613,893 Expired - Lifetime US5699497A (en) | 1994-02-17 | 1996-03-11 | Rendering global macro texture, for producing a dynamic image, as on computer generated terrain, seen from a moving viewpoint |
Country Status (1)
Country | Link |
---|---|
US (1) | US5699497A (en) |
Cited By (80)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5841440A (en) * | 1996-12-17 | 1998-11-24 | Apple Computer, Inc. | System and method for using a pointing device to indicate movement through three-dimensional space |
US6020893A (en) * | 1997-04-11 | 2000-02-01 | Novalogic, Inc. | System and method for realistic terrain simulation |
WO2000011607A1 (en) * | 1998-08-20 | 2000-03-02 | Apple Computer, Inc. | Deferred shading graphics pipeline processor |
WO2000039755A2 (en) * | 1998-12-23 | 2000-07-06 | Silicon Graphics, Inc. | Method, system, and computer program product for modified blending between clip-map tiles |
US6157386A (en) * | 1997-10-10 | 2000-12-05 | Cirrus Logic, Inc | MIP map blending in a graphics processor |
US6243099B1 (en) * | 1996-11-14 | 2001-06-05 | Ford Oxaal | Method for interactive viewing full-surround image data and apparatus therefor |
US6288721B1 (en) | 1999-07-07 | 2001-09-11 | Litton Systems, Inc. | Rendering process and method for digital map illumination intensity shading |
EP1139293A2 (en) * | 2000-03-24 | 2001-10-04 | Konami Computer Entertainment Japan Inc. | Game system and computer readable storage medium game program |
US20020060685A1 (en) * | 2000-04-28 | 2002-05-23 | Malcolm Handley | Method, system, and computer program product for managing terrain rendering information |
US6504550B1 (en) | 1998-05-21 | 2003-01-07 | Mitsubishi Electric & Electronics Usa, Inc. | System for graphics processing employing semiconductor device |
US6525731B1 (en) | 1999-11-09 | 2003-02-25 | Ibm Corporation | Dynamic view-dependent texture mapping |
US6535218B1 (en) | 1998-05-21 | 2003-03-18 | Mitsubishi Electric & Electronics Usa, Inc. | Frame buffer memory for graphic processing |
US20030059743A1 (en) * | 2001-08-29 | 2003-03-27 | The Boeing Company | Method and apparatus for automatically generating a terrain model for display during flight simulation |
US6549201B1 (en) * | 1999-11-23 | 2003-04-15 | Center For Advanced Science And Technology Incubation, Ltd. | Method for constructing a 3D polygonal surface from a 2D silhouette by using computer, apparatus thereof and storage medium |
WO2003032127A2 (en) * | 2001-10-10 | 2003-04-17 | Sony Computer Entertainment America Inc. | Dynamically loaded game software for smooth play. |
US6559851B1 (en) | 1998-05-21 | 2003-05-06 | Mitsubishi Electric & Electronics Usa, Inc. | Methods for semiconductor systems for graphics processing |
US6577317B1 (en) | 1998-08-20 | 2003-06-10 | Apple Computer, Inc. | Apparatus and method for geometry operations in a 3D-graphics pipeline |
US6597363B1 (en) | 1998-08-20 | 2003-07-22 | Apple Computer, Inc. | Graphics processor with deferred shading |
US6661421B1 (en) | 1998-05-21 | 2003-12-09 | Mitsubishi Electric & Electronics Usa, Inc. | Methods for operation of semiconductor memory |
US6700573B2 (en) | 2001-11-07 | 2004-03-02 | Novalogic, Inc. | Method for rendering realistic terrain simulation |
US20050052462A1 (en) * | 2000-03-17 | 2005-03-10 | Kiyomi Sakamoto | Map display device and navigation device |
EP1596339A1 (en) * | 2004-05-14 | 2005-11-16 | Microsoft Corporation | Terrain rendering using nested regular grids |
US20050268044A1 (en) * | 2004-06-01 | 2005-12-01 | Arcas Blaise A Y | Efficient data cache |
US20060104545A1 (en) * | 2004-11-18 | 2006-05-18 | Ziosoft, Inc. | Computer readable medium for image processing and image processing method |
US20060176305A1 (en) * | 2003-03-05 | 2006-08-10 | Arcas Blaise A Y | System and method for managing communication and/or storage of image data |
US20060235941A1 (en) * | 2005-03-29 | 2006-10-19 | Microsoft Corporation | System and method for transferring web page data |
US20060267982A1 (en) * | 2003-03-05 | 2006-11-30 | Seadragon Software, Inc. | System and method for exact rendering in a zooming user interface |
US20070047101A1 (en) * | 2004-03-17 | 2007-03-01 | Seadragon Software, Inc. | Methods and apparatus for navigating an image |
US20070094083A1 (en) * | 2005-10-25 | 2007-04-26 | Podbridge, Inc. | Matching ads to content and users for time and space shifted media network |
US20070094363A1 (en) * | 2005-10-25 | 2007-04-26 | Podbridge, Inc. | Configuration for ad and content delivery in time and space shifted media network |
US20070182743A1 (en) * | 2003-05-30 | 2007-08-09 | Microsoft Corporation | Displaying visual content using multiple nodes |
US20080031527A1 (en) * | 2004-10-08 | 2008-02-07 | Arcas Blaise Aguera Y | System and method for efficiently encoding data |
US20080050024A1 (en) * | 2003-03-05 | 2008-02-28 | Seadragon Software, Inc. | Method for encoding and serving geospatial or other vector data as images |
US7433191B2 (en) | 2005-09-30 | 2008-10-07 | Apple Inc. | Thermal contact arrangement |
US20080252641A1 (en) * | 2007-04-11 | 2008-10-16 | Fujiflm Corporation | Projection image generation apparatus and program |
US20090046095A1 (en) * | 2007-08-16 | 2009-02-19 | Southwest Research Institute | Image Analogy Filters For Terrain Modeling |
WO2009040154A2 (en) * | 2007-09-24 | 2009-04-02 | Robert Bosch Gmbh | Method and system for representing geographic objects on a display device |
US7577930B2 (en) | 2005-06-23 | 2009-08-18 | Apple Inc. | Method and apparatus for analyzing integrated circuit operations |
US7599044B2 (en) | 2005-06-23 | 2009-10-06 | Apple Inc. | Method and apparatus for remotely detecting presence |
US7598711B2 (en) | 2005-11-23 | 2009-10-06 | Apple Inc. | Power source switchover apparatus and method |
US20100033480A1 (en) * | 1995-11-15 | 2010-02-11 | Halo Vision Ltd. | Method for Interactively Viewing Full-Surround Image Data and Apparatus Therefor |
US20100177098A1 (en) * | 2009-01-14 | 2010-07-15 | Cellius, Inc. | Image generation system, image generation method, and computer program product |
US7895076B2 (en) | 1995-06-30 | 2011-02-22 | Sony Computer Entertainment Inc. | Advertisement insertion, profiling, impression, and feedback |
US7891818B2 (en) | 2006-12-12 | 2011-02-22 | Evans & Sutherland Computer Corporation | System and method for aligning RGB light in a single modulator projector |
US8077378B1 (en) | 2008-11-12 | 2011-12-13 | Evans & Sutherland Computer Corporation | Calibration system and method for light modulation device |
US8133115B2 (en) | 2003-10-22 | 2012-03-13 | Sony Computer Entertainment America Llc | System and method for recording and displaying a graphical path in a video game |
US8204272B2 (en) | 2006-05-04 | 2012-06-19 | Sony Computer Entertainment Inc. | Lighting control of a user environment via a display device |
US8243089B2 (en) | 2006-05-04 | 2012-08-14 | Sony Computer Entertainment Inc. | Implementing lighting control of a user environment |
US8267783B2 (en) | 2005-09-30 | 2012-09-18 | Sony Computer Entertainment America Llc | Establishing an impression area |
US8284310B2 (en) | 2005-06-22 | 2012-10-09 | Sony Computer Entertainment America Llc | Delay matching in audio/video systems |
US8289325B2 (en) | 2004-10-06 | 2012-10-16 | Sony Computer Entertainment America Llc | Multi-pass shading |
US8358317B2 (en) | 2008-05-23 | 2013-01-22 | Evans & Sutherland Computer Corporation | System and method for displaying a planar image on a curved surface |
US20130321407A1 (en) * | 2012-06-02 | 2013-12-05 | Schlumberger Technology Corporation | Spatial data services |
US8626584B2 (en) | 2005-09-30 | 2014-01-07 | Sony Computer Entertainment America Llc | Population of an advertisement reference list |
US8645992B2 (en) | 2006-05-05 | 2014-02-04 | Sony Computer Entertainment America Llc | Advertisement rotation |
US8676900B2 (en) | 2005-10-25 | 2014-03-18 | Sony Computer Entertainment America Llc | Asynchronous advertising placement based on metadata |
US8702248B1 (en) | 2008-06-11 | 2014-04-22 | Evans & Sutherland Computer Corporation | Projection method for reducing interpixel gaps on a viewing surface |
US20140152664A1 (en) * | 2012-11-30 | 2014-06-05 | Thales | Method of rendering a terrain stored in a massive database |
US8751310B2 (en) | 2005-09-30 | 2014-06-10 | Sony Computer Entertainment America Llc | Monitoring advertisement impressions |
US8763157B2 (en) | 2004-08-23 | 2014-06-24 | Sony Computer Entertainment America Llc | Statutory license restricted digital media playback on portable devices |
US8763090B2 (en) | 2009-08-11 | 2014-06-24 | Sony Computer Entertainment America Llc | Management of ancillary content delivery and presentation |
US8769558B2 (en) | 2008-02-12 | 2014-07-01 | Sony Computer Entertainment America Llc | Discovery and analytics for episodic downloaded media |
US20140189598A1 (en) * | 1999-07-22 | 2014-07-03 | Tavusi Data Solutions Llc | Graphic-information flow method and system for visually analyzing patterns and relationships |
US8892495B2 (en) | 1991-12-23 | 2014-11-18 | Blanding Hovenweep, Llc | Adaptive pattern recognition based controller apparatus and method and human-interface therefore |
US9272203B2 (en) | 2007-10-09 | 2016-03-01 | Sony Computer Entertainment America, LLC | Increasing the number of advertising impressions in an interactive environment |
US9298311B2 (en) | 2005-06-23 | 2016-03-29 | Apple Inc. | Trackpad sensitivity compensation |
US9342817B2 (en) | 2011-07-07 | 2016-05-17 | Sony Interactive Entertainment LLC | Auto-creating groups for sharing photos |
US9347792B2 (en) | 2011-10-31 | 2016-05-24 | Honeywell International Inc. | Systems and methods for displaying images with multi-resolution integration |
US9535563B2 (en) | 1999-02-01 | 2017-01-03 | Blanding Hovenweep, Llc | Internet appliance system and method |
US9641826B1 (en) | 2011-10-06 | 2017-05-02 | Evans & Sutherland Computer Corporation | System and method for displaying distant 3-D stereo on a dome surface |
US9864998B2 (en) | 2005-10-25 | 2018-01-09 | Sony Interactive Entertainment America Llc | Asynchronous advertising |
US10218814B2 (en) * | 2000-12-27 | 2019-02-26 | Bradium Technologies Llc | Optimized image delivery over limited bandwidth communication channels |
US10657538B2 (en) | 2005-10-25 | 2020-05-19 | Sony Interactive Entertainment LLC | Resolution of advertising rules |
US10786736B2 (en) | 2010-05-11 | 2020-09-29 | Sony Interactive Entertainment LLC | Placement of user information in a game space |
US10846779B2 (en) | 2016-11-23 | 2020-11-24 | Sony Interactive Entertainment LLC | Custom product categorization of digital media content |
US10860987B2 (en) | 2016-12-19 | 2020-12-08 | Sony Interactive Entertainment LLC | Personalized calendar for digital media content-related events |
US10931991B2 (en) | 2018-01-04 | 2021-02-23 | Sony Interactive Entertainment LLC | Methods and systems for selectively skipping through media content |
US11004089B2 (en) | 2005-10-25 | 2021-05-11 | Sony Interactive Entertainment LLC | Associating media content files with advertisements |
US11315327B1 (en) | 2018-11-02 | 2022-04-26 | Facebook Technologies, Llc. | Beam-racing fallbacks in a display engine |
US11429363B2 (en) * | 2017-07-31 | 2022-08-30 | Sony Interactive Entertainment Inc. | Information processing apparatus and file copying method |
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4343037A (en) * | 1979-06-15 | 1982-08-03 | Redifon Simulation Limited | Visual display systems of the computer generated image type |
US4727365A (en) * | 1983-08-30 | 1988-02-23 | General Electric Company | Advanced video object generator |
US4727365B1 (en) * | 1983-08-30 | 1999-10-05 | Lockheed Corp | Advanced video object generator |
US4586038A (en) * | 1983-12-12 | 1986-04-29 | General Electric Company | True-perspective texture/shading processor |
US4821212A (en) * | 1984-08-08 | 1989-04-11 | General Electric Company | Three dimensional texture generator for computed terrain images |
US4952922A (en) * | 1985-07-18 | 1990-08-28 | Hughes Aircraft Company | Predictive look ahead memory management for computer image generation in simulators |
US4692880A (en) * | 1985-11-15 | 1987-09-08 | General Electric Company | Memory efficient cell texturing for advanced video object generator |
US5317689A (en) * | 1986-09-11 | 1994-05-31 | Hughes Aircraft Company | Digital visual and sensor simulation system for generating realistic scenes |
US4807158A (en) * | 1986-09-30 | 1989-02-21 | Daleco/Ivex Partners, Ltd. | Method and apparatus for sampling images to simulate movement within a multidimensional space |
US4855934A (en) * | 1986-10-03 | 1989-08-08 | Evans & Sutherland Computer Corporation | System for texturing computer graphics images |
US4780084A (en) * | 1987-05-08 | 1988-10-25 | General Electric Company | Landmass simulator |
US4974176A (en) * | 1987-12-18 | 1990-11-27 | General Electric Company | Microtexture for close-in detail |
US5097427A (en) * | 1988-07-06 | 1992-03-17 | Hewlett-Packard Company | Texture mapping for computer graphics display controller system |
US5495563A (en) * | 1990-01-15 | 1996-02-27 | U.S. Philips Corporation | Apparatus for converting pyramidal texture coordinates into corresponding physical texture memory addresses |
US5222205A (en) * | 1990-03-16 | 1993-06-22 | Hewlett-Packard Company | Method for generating addresses to textured graphics primitives stored in rip maps |
US5579456A (en) * | 1993-10-15 | 1996-11-26 | Evans & Sutherland Computer Corp. | Direct rendering of textured height fields |
US5480305A (en) * | 1993-10-29 | 1996-01-02 | Southwest Research Institute | Weather simulation system |
Cited By (144)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8892495B2 (en) | 1991-12-23 | 2014-11-18 | Blanding Hovenweep, Llc | Adaptive pattern recognition based controller apparatus and method and human-interface therefore |
US7895076B2 (en) | 1995-06-30 | 2011-02-22 | Sony Computer Entertainment Inc. | Advertisement insertion, profiling, impression, and feedback |
US8077176B2 (en) | 1995-11-15 | 2011-12-13 | Grandeye Ltd. | Method for interactively viewing full-surround image data and apparatus therefor |
US20100033480A1 (en) * | 1995-11-15 | 2010-02-11 | Halo Vision Ltd. | Method for Interactively Viewing Full-Surround Image Data and Apparatus Therefor |
US6243099B1 (en) * | 1996-11-14 | 2001-06-05 | Ford Oxaal | Method for interactive viewing full-surround image data and apparatus therefor |
US5841440A (en) * | 1996-12-17 | 1998-11-24 | Apple Computer, Inc. | System and method for using a pointing device to indicate movement through three-dimensional space |
US6020893A (en) * | 1997-04-11 | 2000-02-01 | Novalogic, Inc. | System and method for realistic terrain simulation |
US6157386A (en) * | 1997-10-10 | 2000-12-05 | Cirrus Logic, Inc | MIP map blending in a graphics processor |
US6504550B1 (en) | 1998-05-21 | 2003-01-07 | Mitsubishi Electric & Electronics Usa, Inc. | System for graphics processing employing semiconductor device |
US6661421B1 (en) | 1998-05-21 | 2003-12-09 | Mitsubishi Electric & Electronics Usa, Inc. | Methods for operation of semiconductor memory |
US6559851B1 (en) | 1998-05-21 | 2003-05-06 | Mitsubishi Electric & Electronics Usa, Inc. | Methods for semiconductor systems for graphics processing |
US6535218B1 (en) | 1998-05-21 | 2003-03-18 | Mitsubishi Electric & Electronics Usa, Inc. | Frame buffer memory for graphic processing |
US7808503B2 (en) | 1998-08-20 | 2010-10-05 | Apple Inc. | Deferred shading graphics pipeline processor having advanced features |
US6552723B1 (en) | 1998-08-20 | 2003-04-22 | Apple Computer, Inc. | System, apparatus and method for spatially sorting image data in a three-dimensional graphics pipeline |
US6476807B1 (en) | 1998-08-20 | 2002-11-05 | Apple Computer, Inc. | Method and apparatus for performing conservative hidden surface removal in a graphics processor with deferred shading |
US7167181B2 (en) | 1998-08-20 | 2007-01-23 | Apple Computer, Inc. | Deferred shading graphics pipeline processor having advanced features |
US6717576B1 (en) | 1998-08-20 | 2004-04-06 | Apple Computer, Inc. | Deferred shading graphics pipeline processor having advanced features |
WO2000011607A1 (en) * | 1998-08-20 | 2000-03-02 | Apple Computer, Inc. | Deferred shading graphics pipeline processor |
US6525737B1 (en) | 1998-08-20 | 2003-02-25 | Apple Computer, Inc. | Graphics processor with pipeline state storage and retrieval |
US6693639B2 (en) | 1998-08-20 | 2004-02-17 | Apple Computer, Inc. | Graphics processor with pipeline state storage and retrieval |
US6268875B1 (en) | 1998-08-20 | 2001-07-31 | Apple Computer, Inc. | Deferred shading graphics pipeline processor |
US7164426B1 (en) | 1998-08-20 | 2007-01-16 | Apple Computer, Inc. | Method and apparatus for generating texture |
US6664959B2 (en) | 1998-08-20 | 2003-12-16 | Apple Computer, Inc. | Method and apparatus for culling in a graphics processor with deferred shading |
US6771264B1 (en) | 1998-08-20 | 2004-08-03 | Apple Computer, Inc. | Method and apparatus for performing tangent space lighting and bump mapping in a deferred shading graphics processor |
US6229553B1 (en) * | 1998-08-20 | 2001-05-08 | Apple Computer, Inc. | Deferred shading graphics pipeline processor |
US6577317B1 (en) | 1998-08-20 | 2003-06-10 | Apple Computer, Inc. | Apparatus and method for geometry operations in a 3D-graphics pipeline |
US6577305B1 (en) | 1998-08-20 | 2003-06-10 | Apple Computer, Inc. | Apparatus and method for performing setup operations in a 3-D graphics pipeline using unified primitive descriptors |
US6597363B1 (en) | 1998-08-20 | 2003-07-22 | Apple Computer, Inc. | Graphics processor with deferred shading |
US6614444B1 (en) | 1998-08-20 | 2003-09-02 | Apple Computer, Inc. | Apparatus and method for fragment operations in a 3D-graphics pipeline |
WO2000039755A2 (en) * | 1998-12-23 | 2000-07-06 | Silicon Graphics, Inc. | Method, system, and computer program product for modified blending between clip-map tiles |
US6373482B1 (en) | 1998-12-23 | 2002-04-16 | Microsoft Corporation | Method, system, and computer program product for modified blending between clip-map tiles |
WO2000039755A3 (en) * | 1998-12-23 | 2002-09-19 | Silicon Graphics Inc | Method, system, and computer program product for modified blending between clip-map tiles |
US9535563B2 (en) | 1999-02-01 | 2017-01-03 | Blanding Hovenweep, Llc | Internet appliance system and method |
US6288721B1 (en) | 1999-07-07 | 2001-09-11 | Litton Systems, Inc. | Rendering process and method for digital map illumination intensity shading |
US20140189598A1 (en) * | 1999-07-22 | 2014-07-03 | Tavusi Data Solutions Llc | Graphic-information flow method and system for visually analyzing patterns and relationships |
US6525731B1 (en) | 1999-11-09 | 2003-02-25 | Ibm Corporation | Dynamic view-dependent texture mapping |
US6549201B1 (en) * | 1999-11-23 | 2003-04-15 | Center For Advanced Science And Technology Incubation, Ltd. | Method for constructing a 3D polygonal surface from a 2D silhouette by using computer, apparatus thereof and storage medium |
US9015747B2 (en) | 1999-12-02 | 2015-04-21 | Sony Computer Entertainment America Llc | Advertisement rotation |
US10390101B2 (en) | 1999-12-02 | 2019-08-20 | Sony Interactive Entertainment America Llc | Advertisement rotation |
US20050052462A1 (en) * | 2000-03-17 | 2005-03-10 | Kiyomi Sakamoto | Map display device and navigation device |
EP1139293A2 (en) * | 2000-03-24 | 2001-10-04 | Konami Computer Entertainment Japan Inc. | Game system and computer readable storage medium game program |
US6831656B2 (en) * | 2000-03-24 | 2004-12-14 | Konami Computer Entertainment Japan, Inc. | Game system and computer readable storage medium storing game program |
US20010030652A1 (en) * | 2000-03-24 | 2001-10-18 | Takashi Kitao | Game system and computer readable storage medium storing game program |
EP1139293A3 (en) * | 2000-03-24 | 2003-02-12 | Konami Computer Entertainment Japan Inc. | Game system and computer readable storage medium game program |
US20020060685A1 (en) * | 2000-04-28 | 2002-05-23 | Malcolm Handley | Method, system, and computer program product for managing terrain rendering information |
US8272964B2 (en) | 2000-07-04 | 2012-09-25 | Sony Computer Entertainment America Llc | Identifying obstructions in an impression area |
US10218814B2 (en) * | 2000-12-27 | 2019-02-26 | Bradium Technologies Llc | Optimized image delivery over limited bandwidth communication channels |
US9195991B2 (en) | 2001-02-09 | 2015-11-24 | Sony Computer Entertainment America Llc | Display of user selected advertising content in a digital environment |
US9984388B2 (en) | 2001-02-09 | 2018-05-29 | Sony Interactive Entertainment America Llc | Advertising impression determination |
US9466074B2 (en) | 2001-02-09 | 2016-10-11 | Sony Interactive Entertainment America Llc | Advertising impression determination |
US20030059743A1 (en) * | 2001-08-29 | 2003-03-27 | The Boeing Company | Method and apparatus for automatically generating a terrain model for display during flight simulation |
US20030190587A1 (en) * | 2001-08-29 | 2003-10-09 | The Boeing Company | Automated flight simulation mission profiler and associated method |
US6910892B2 (en) | 2001-08-29 | 2005-06-28 | The Boeing Company | Method and apparatus for automatically collecting terrain source data for display during flight simulation |
US20040229701A1 (en) * | 2001-10-10 | 2004-11-18 | Gavin Andrew Scott | System and method for dynamically loading game software for smooth game play |
US9138648B2 (en) | 2001-10-10 | 2015-09-22 | Sony Computer Entertainment America Llc | System and method for dynamically loading game software for smooth game play |
US10322347B2 (en) | 2001-10-10 | 2019-06-18 | Sony Interactive Entertainment America Llc | System and method for dynamicaly loading game software for smooth game play |
WO2003032127A2 (en) * | 2001-10-10 | 2003-04-17 | Sony Computer Entertainment America Inc. | Dynamically loaded game software for smooth play. |
WO2003032127A3 (en) * | 2001-10-10 | 2004-03-04 | Sony Comp Emtertainment Us | Dynamically loaded game software for smooth play. |
US6764403B2 (en) | 2001-10-10 | 2004-07-20 | Sony Computer Entertainment America Inc. | System and method for dynamically loading game software for smooth game play |
US6700573B2 (en) | 2001-11-07 | 2004-03-02 | Novalogic, Inc. | Method for rendering realistic terrain simulation |
US20060176305A1 (en) * | 2003-03-05 | 2006-08-10 | Arcas Blaise A Y | System and method for managing communication and/or storage of image data |
US20060267982A1 (en) * | 2003-03-05 | 2006-11-30 | Seadragon Software, Inc. | System and method for exact rendering in a zooming user interface |
US7554543B2 (en) * | 2003-03-05 | 2009-06-30 | Microsoft Corporation | System and method for exact rendering in a zooming user interface |
US20080050024A1 (en) * | 2003-03-05 | 2008-02-28 | Seadragon Software, Inc. | Method for encoding and serving geospatial or other vector data as images |
US7724965B2 (en) | 2003-03-05 | 2010-05-25 | Microsoft Corporation | Method for encoding and serving geospatial or other vector data as images |
US7930434B2 (en) | 2003-03-05 | 2011-04-19 | Microsoft Corporation | System and method for managing communication and/or storage of image data |
US20070182743A1 (en) * | 2003-05-30 | 2007-08-09 | Microsoft Corporation | Displaying visual content using multiple nodes |
US8133115B2 (en) | 2003-10-22 | 2012-03-13 | Sony Computer Entertainment America Llc | System and method for recording and displaying a graphical path in a video game |
US20070047101A1 (en) * | 2004-03-17 | 2007-03-01 | Seadragon Software, Inc. | Methods and apparatus for navigating an image |
US7436405B2 (en) | 2004-05-14 | 2008-10-14 | Microsoft Corporation | Terrain rendering using nested regular grids |
US20050253843A1 (en) * | 2004-05-14 | 2005-11-17 | Microsoft Corporation | Terrain rendering using nested regular grids |
KR101130400B1 (en) * | 2004-05-14 | 2012-03-27 | 마이크로소프트 코포레이션 | Terrain rendering using nested regular grids |
EP1596339A1 (en) * | 2004-05-14 | 2005-11-16 | Microsoft Corporation | Terrain rendering using nested regular grids |
US20050268044A1 (en) * | 2004-06-01 | 2005-12-01 | Arcas Blaise A Y | Efficient data cache |
US7546419B2 (en) | 2004-06-01 | 2009-06-09 | Aguera Y Arcas Blaise | Efficient data cache |
US9531686B2 (en) | 2004-08-23 | 2016-12-27 | Sony Interactive Entertainment America Llc | Statutory license restricted digital media playback on portable devices |
US10042987B2 (en) | 2004-08-23 | 2018-08-07 | Sony Interactive Entertainment America Llc | Statutory license restricted digital media playback on portable devices |
US8763157B2 (en) | 2004-08-23 | 2014-06-24 | Sony Computer Entertainment America Llc | Statutory license restricted digital media playback on portable devices |
US8289325B2 (en) | 2004-10-06 | 2012-10-16 | Sony Computer Entertainment America Llc | Multi-pass shading |
US20080031527A1 (en) * | 2004-10-08 | 2008-02-07 | Arcas Blaise Aguera Y | System and method for efficiently encoding data |
US7912299B2 (en) | 2004-10-08 | 2011-03-22 | Microsoft Corporation | System and method for efficiently encoding data |
US7796835B2 (en) * | 2004-11-18 | 2010-09-14 | Ziosoft, Inc. | Computer readable medium for image processing and image processing method |
US20060104545A1 (en) * | 2004-11-18 | 2006-05-18 | Ziosoft, Inc. | Computer readable medium for image processing and image processing method |
US20060235941A1 (en) * | 2005-03-29 | 2006-10-19 | Microsoft Corporation | System and method for transferring web page data |
US8284310B2 (en) | 2005-06-22 | 2012-10-09 | Sony Computer Entertainment America Llc | Delay matching in audio/video systems |
US9298311B2 (en) | 2005-06-23 | 2016-03-29 | Apple Inc. | Trackpad sensitivity compensation |
US7577930B2 (en) | 2005-06-23 | 2009-08-18 | Apple Inc. | Method and apparatus for analyzing integrated circuit operations |
US7599044B2 (en) | 2005-06-23 | 2009-10-06 | Apple Inc. | Method and apparatus for remotely detecting presence |
US8795076B2 (en) | 2005-09-30 | 2014-08-05 | Sony Computer Entertainment America Llc | Advertising impression determination |
US9129301B2 (en) | 2005-09-30 | 2015-09-08 | Sony Computer Entertainment America Llc | Display of user selected advertising content in a digital environment |
US10467651B2 (en) | 2005-09-30 | 2019-11-05 | Sony Interactive Entertainment America Llc | Advertising impression determination |
US8626584B2 (en) | 2005-09-30 | 2014-01-07 | Sony Computer Entertainment America Llc | Population of an advertisement reference list |
US11436630B2 (en) | 2005-09-30 | 2022-09-06 | Sony Interactive Entertainment LLC | Advertising impression determination |
US9873052B2 (en) | 2005-09-30 | 2018-01-23 | Sony Interactive Entertainment America Llc | Monitoring advertisement impressions |
US8574074B2 (en) | 2005-09-30 | 2013-11-05 | Sony Computer Entertainment America Llc | Advertising impression determination |
US10789611B2 (en) | 2005-09-30 | 2020-09-29 | Sony Interactive Entertainment LLC | Advertising impression determination |
US8751310B2 (en) | 2005-09-30 | 2014-06-10 | Sony Computer Entertainment America Llc | Monitoring advertisement impressions |
US8267783B2 (en) | 2005-09-30 | 2012-09-18 | Sony Computer Entertainment America Llc | Establishing an impression area |
US7433191B2 (en) | 2005-09-30 | 2008-10-07 | Apple Inc. | Thermal contact arrangement |
US10046239B2 (en) | 2005-09-30 | 2018-08-14 | Sony Interactive Entertainment America Llc | Monitoring advertisement impressions |
US20070094083A1 (en) * | 2005-10-25 | 2007-04-26 | Podbridge, Inc. | Matching ads to content and users for time and space shifted media network |
US10657538B2 (en) | 2005-10-25 | 2020-05-19 | Sony Interactive Entertainment LLC | Resolution of advertising rules |
US20070094363A1 (en) * | 2005-10-25 | 2007-04-26 | Podbridge, Inc. | Configuration for ad and content delivery in time and space shifted media network |
US8676900B2 (en) | 2005-10-25 | 2014-03-18 | Sony Computer Entertainment America Llc | Asynchronous advertising placement based on metadata |
US9864998B2 (en) | 2005-10-25 | 2018-01-09 | Sony Interactive Entertainment America Llc | Asynchronous advertising |
US11004089B2 (en) | 2005-10-25 | 2021-05-11 | Sony Interactive Entertainment LLC | Associating media content files with advertisements |
US11195185B2 (en) | 2005-10-25 | 2021-12-07 | Sony Interactive Entertainment LLC | Asynchronous advertising |
US10410248B2 (en) | 2005-10-25 | 2019-09-10 | Sony Interactive Entertainment America Llc | Asynchronous advertising placement based on metadata |
US9367862B2 (en) | 2005-10-25 | 2016-06-14 | Sony Interactive Entertainment America Llc | Asynchronous advertising placement based on metadata |
US7598711B2 (en) | 2005-11-23 | 2009-10-06 | Apple Inc. | Power source switchover apparatus and method |
US8243089B2 (en) | 2006-05-04 | 2012-08-14 | Sony Computer Entertainment Inc. | Implementing lighting control of a user environment |
US8204272B2 (en) | 2006-05-04 | 2012-06-19 | Sony Computer Entertainment Inc. | Lighting control of a user environment via a display device |
US8645992B2 (en) | 2006-05-05 | 2014-02-04 | Sony Computer Entertainment America Llc | Advertisement rotation |
US7891818B2 (en) | 2006-12-12 | 2011-02-22 | Evans & Sutherland Computer Corporation | System and method for aligning RGB light in a single modulator projector |
US20080252641A1 (en) * | 2007-04-11 | 2008-10-16 | Fujifilm Corporation | Projection image generation apparatus and program |
US20090046095A1 (en) * | 2007-08-16 | 2009-02-19 | Southwest Research Institute | Image Analogy Filters For Terrain Modeling |
US8289326B2 (en) * | 2007-08-16 | 2012-10-16 | Southwest Research Institute | Image analogy filters for terrain modeling |
WO2009040154A2 (en) * | 2007-09-24 | 2009-04-02 | Robert Bosch Gmbh | Method and system for representing geographic objects on a display device |
WO2009040154A3 (en) * | 2007-09-24 | 2009-09-17 | Robert Bosch Gmbh | Method and system for representing geographic objects on a display device |
US9272203B2 (en) | 2007-10-09 | 2016-03-01 | Sony Computer Entertainment America, LLC | Increasing the number of advertising impressions in an interactive environment |
US9525902B2 (en) | 2008-02-12 | 2016-12-20 | Sony Interactive Entertainment America Llc | Discovery and analytics for episodic downloaded media |
US8769558B2 (en) | 2008-02-12 | 2014-07-01 | Sony Computer Entertainment America Llc | Discovery and analytics for episodic downloaded media |
US8358317B2 (en) | 2008-05-23 | 2013-01-22 | Evans & Sutherland Computer Corporation | System and method for displaying a planar image on a curved surface |
US8702248B1 (en) | 2008-06-11 | 2014-04-22 | Evans & Sutherland Computer Corporation | Projection method for reducing interpixel gaps on a viewing surface |
US8077378B1 (en) | 2008-11-12 | 2011-12-13 | Evans & Sutherland Computer Corporation | Calibration system and method for light modulation device |
US20100177098A1 (en) * | 2009-01-14 | 2010-07-15 | Cellius, Inc. | Image generation system, image generation method, and computer program product |
US10298703B2 (en) | 2009-08-11 | 2019-05-21 | Sony Interactive Entertainment America Llc | Management of ancillary content delivery and presentation |
US9474976B2 (en) | 2009-08-11 | 2016-10-25 | Sony Interactive Entertainment America Llc | Management of ancillary content delivery and presentation |
US8763090B2 (en) | 2009-08-11 | 2014-06-24 | Sony Computer Entertainment America Llc | Management of ancillary content delivery and presentation |
US11478706B2 (en) | 2010-05-11 | 2022-10-25 | Sony Interactive Entertainment LLC | Placement of user information in a game space |
US10786736B2 (en) | 2010-05-11 | 2020-09-29 | Sony Interactive Entertainment LLC | Placement of user information in a game space |
US9342817B2 (en) | 2011-07-07 | 2016-05-17 | Sony Interactive Entertainment LLC | Auto-creating groups for sharing photos |
US9641826B1 (en) | 2011-10-06 | 2017-05-02 | Evans & Sutherland Computer Corporation | System and method for displaying distant 3-D stereo on a dome surface |
US10110876B1 (en) | 2011-10-06 | 2018-10-23 | Evans & Sutherland Computer Corporation | System and method for displaying images in 3-D stereo |
US9347792B2 (en) | 2011-10-31 | 2016-05-24 | Honeywell International Inc. | Systems and methods for displaying images with multi-resolution integration |
US20130321407A1 (en) * | 2012-06-02 | 2013-12-05 | Schlumberger Technology Corporation | Spatial data services |
US20140152664A1 (en) * | 2012-11-30 | 2014-06-05 | Thales | Method of rendering a terrain stored in a massive database |
US10846779B2 (en) | 2016-11-23 | 2020-11-24 | Sony Interactive Entertainment LLC | Custom product categorization of digital media content |
US10860987B2 (en) | 2016-12-19 | 2020-12-08 | Sony Interactive Entertainment LLC | Personalized calendar for digital media content-related events |
US11429363B2 (en) * | 2017-07-31 | 2022-08-30 | Sony Interactive Entertainment Inc. | Information processing apparatus and file copying method |
US10931991B2 (en) | 2018-01-04 | 2021-02-23 | Sony Interactive Entertainment LLC | Methods and systems for selectively skipping through media content |
US11315327B1 (en) | 2018-11-02 | 2022-04-26 | Facebook Technologies, Llc. | Beam-racing fallbacks in a display engine |
US11721307B1 (en) | 2018-11-02 | 2023-08-08 | Meta Platforms Technologies, Llc | Beam-racing pixel generation in a display engine |
US12020368B1 (en) * | 2018-11-02 | 2024-06-25 | Meta Platforms Technologies, Llc | Pixel replication and interpolation for foveated rendering in a display engine |
Similar Documents
Publication | Title |
---|---|
US5699497A (en) | Rendering global macro texture, for producing a dynamic image, as on computer generated terrain, seen from a moving viewpoint | |
US6747649B1 (en) | Terrain rendering in a three-dimensional environment | |
US5742749A (en) | Method and apparatus for shadow generation through depth mapping | |
US5140532A (en) | Digital map generator and display system | |
DE69130545T2 (en) | System for creating a textured perspective view | |
EP0321095B1 (en) | Polygon priority resolving system with antialiasing | |
US7123260B2 (en) | System and method for synthetic vision terrain display | |
CA2174090C (en) | Weather simulation system | |
EP0812447B1 (en) | Computer graphics system for creating and enhancing texture maps | |
EP0463700B1 (en) | Method of and apparatus for generating an image | |
US5872572A (en) | Method and apparatus for generating non-uniform resolution image data | |
US5409379A (en) | Weather simulation system | |
US7884825B2 (en) | Drawing method, image generating device, and electronic information apparatus | |
WO2007087538A2 (en) | System and method for asynchronous continuous-level-of-detail texture mapping for large-scale terrain rendering | |
US5719599A (en) | Method and apparatus for efficient digital modeling and texture mapping | |
US5719598A (en) | Graphics processor for parallel processing a plurality of fields of view for multiple video displays | |
GB2245805A (en) | Generating an anti-aliased image | |
EP1058912B1 (en) | Subsampled texture edge antialiasing | |
KR100429092B1 (en) | Graphic image processing method and apparatus | |
CA2617373C (en) | Improved terrain rendering in a three-dimensional environment | |
US20070257936A1 (en) | Image generator | |
GB2130854A (en) | Display system | |
GB2288304A (en) | Computer graphics | |
GB2245806A (en) | Generating an image | |
JP4624617B2 (en) | Improved S-buffer anti-aliasing method |
Legal Events
Code | Title | Description |
---|---|---|
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
AS | Assignment | Owner name: FOOTHILL CAPITAL CORPORATION, CALIFORNIA. Free format text: SECURITY INTEREST;ASSIGNOR:EVAN & SUTHERLAND COMPUTER CORPORATION;REEL/FRAME:011369/0944. Effective date: 20001214 |
FPAY | Fee payment | Year of fee payment: 4 |
FEPP | Fee payment procedure | Free format text: PAT HOLDER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: LTOS); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
FPAY | Fee payment | Year of fee payment: 8 |
AS | Assignment | Owner name: EVANS & SUTHERLAND COMPUTER CORPORATION, UTAH. Free format text: RELEASE OF SECURITY INTERESTS;ASSIGNOR:FOOTHILL CAPITAL CORPORATION;REEL/FRAME:017015/0428. Effective date: 20050517 |
AS | Assignment | Owner name: ROCKWELL COLLINS SIMULATION AND TRAINING SOLUTIONS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EVANS & SUTHERLAND COMPUTER CORPORATION;REEL/FRAME:018972/0259. Effective date: 20060525 |
FEPP | Fee payment procedure | Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
FPAY | Fee payment | Year of fee payment: 12 |
SULP | Surcharge for late payment | |