US20070242131A1 - Location Based Wireless Collaborative Environment With A Visual User Interface - Google Patents
- Publication number
- US20070242131A1 (application Ser. No. 11/618,672)
- Authority
- US
- United States
- Prior art keywords
- message
- location
- current position
- location data
- server
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04L67/148—Migration or transfer of sessions
- H04L51/222—Monitoring or handling of messages using geographical location information, e.g. messages transmitted or received in proximity of a certain spot or area
- H04L67/14—Session management
- H04L67/52—Network services specially adapted for the location of the user terminal
- H04L67/75—Indicating network or usage conditions on the user display
- H04W4/02—Services making use of location information
- H04W4/185—Information format or content conversion by embedding added-value information into content, e.g. geo-tagging
- H04W64/006—Locating users or terminals for network management purposes, with additional information processing, e.g. for direction or speed determination
- H04W8/24—Transfer of terminal data
- H04W92/18—Interfaces between hierarchically similar devices, between terminal devices
Description
- This application claims the benefit of U.S. Provisional Application Ser. No. 60/755,732, filed on Dec. 29, 2005, which is incorporated by reference herein in its entirety.
- The present invention relates generally to the field of computer graphics and networked wireless communication.
- The World Wide Web, e-mail and instant messaging have revolutionized team collaboration, together with applications such as Lotus Notes and Microsoft Groove.
- The availability of mobile computing devices, together with access to the Global Positioning System (GPS) network, is starting to bring the tools available for fixed-location team collaboration to any place at any time.
- Devices such as the RIM Blackberry and the Palm Treo have brought e-mail to mobile devices with great success, but their effectiveness is limited compared to systems designed for use in fixed locations, in many cases due to interface and usability problems.
- Tools that work exceptionally well in an office environment prove grossly inadequate for the same tasks when those tasks must be performed in the field and, in many cases, under pressure and adverse circumstances, as is often the case for rescue teams, military operations, law enforcement, infrastructure repair crews, and other teams that need to get a job done quickly and with efficient coordination. Currently, such teams typically rely on radio or cellular network communications without the capability of storing information to be shared, or they use text-based messaging and electronic mail systems that are hard to integrate with what is happening in the field and where it is taking place.
- In addition, browsing the World Wide Web on mobile devices is a much less rewarding experience than doing so on larger computers. Small screens, cumbersome interfaces and slow update speeds limit the usability of mobile devices.
- Recent trends in Internet content generation have seen the appearance of geotags—XML fields added to a web page that provide exact latitude and longitude coordinates. All of this is fostering developments in Internet mapping and cartography, from the original Internet maps to advanced applications such as Google Maps, Google Earth and Microsoft's TerraServer.
- These applications use traditional maps and computer graphics renderings of real-world satellite imagery to allow users to view and navigate locations, access content and interact with the information available on the web with geographic locality.
- The appearance of geolocation tags on web content is enabling applications to be used not just for mapping but to display Internet search results for localized areas—services such as Yahoo! and Google Local allow the user to search for restaurants close to a given location, and display the returned results on a map.
- Maps are ideal for fixed-location computing with large displays, but the small screen sizes and interfacing constraints of mobile devices can limit their usability in mobile applications.
- In addition, a map has to be interpreted by the user and reconciled with her actual position in the real world, sometimes requiring significant effort to fully understand the information represented on the map.
- Military aircraft have long incorporated a different type of display, the Heads Up Display (HUD), in which a representation of the aircraft instruments is displayed on a see-through mirror and superimposed over the out-the-window scene the pilot sees through the aircraft's canopy. HUD systems have repeatedly proven to increase pilot effectiveness and response time. Recently, HUD systems have appeared in civil aircraft and even in automobiles.
- Augmented reality is a branch of computer graphics that focuses on the incorporation of interactively-rendered imagery into real-world scenes. In most cases, it is implemented using see-through head-mounted displays (HMDs), where the user can see both the real world surrounding her and a perspective-matched computer graphics rendering of objects in the scene. The field was pioneered by Ivan Sutherland, who introduced the first see-through HMD in 1968.
- Augmented reality has been used for applications such as aircraft maintenance training and navigation in complex environments such as a factory floor, where the user can see information displayed over the real scene, annotating the real world.
- Some recent projects, such as “A Touring Machine” developed at Columbia University in 1997, allow annotation of real-world locations and interaction with geographically tagged database content on a transportable computing device.
- While some existing wireless data communications tools such as text messaging, e-mail and instant messaging can be useful, making use of them while deployed in the field is cumbersome and inefficient. A limitation of these systems is that even though the information shared might have relevance to a specific physical location, they do not adapt the presentation of the information to the perspective from one's location. Representing geographically tagged data on a map can improve efficiency, and has been used by certain DARPA military unit test wireless communication systems, but this forces team members to constantly re-interpret the map and its correspondence to the real-world scenario around them as they move, something made harder by the small screen real estate available on mobile devices.
- The present invention provides a system having advantages associated with a heads-up display as well as augmented reality technology, allowing interaction within a collaborative environment similar to e-mail or instant messaging but with geographic locality. This enables teams to share information while on location with the same flexibility and immediacy that e-mail and instant messaging have brought to fixed-location, office-based teams.
- A system in accordance with the present invention includes a server in communication with one or more client devices over a network such as a cellular telephone network.
- A client device includes a video capture device such as a video camera, which displays a live captured image on a screen of the client device. Data received from the server or other devices is overlaid on the live image in real time. In this way, the client device functions as a window between the real world and the virtual world of a networked collaborative environment by fusing data from the virtual world with live video from the device's camera.
- Users of this system can gaze through this window by pointing their device at areas of interest in their real environment and viewing the scene on the device's screen as with a common video camera's viewfinder, but with messages and data from the virtual world overlaid on the real scene.
- The user can interact with others in the collaborative environment by accessing and creating messages and data presented via the client device window.
- The present invention simplifies team collaboration on mobile devices by allowing users to access and create geographically tagged information. A system in one embodiment uses a wireless computing device as its delivery platform, connected to a server system and/or other wireless devices over a network.
- The wireless device is also equipped with a high-resolution display capable of rendering real-time graphics, a video camera, a geo-location device that provides its current position (such as a GPS receiver or a radiolocation device using triangulation of cell phone network base station signals), and a view tracking system (such as an inertial tracker or a software-based image tracker) that determines the orientation of its camera in real time.
- In one embodiment, the present invention includes a networked client device and a server-side application; alternatively, the functionality provided by the server can be carried out by client devices in the case of a peer-to-peer network.
- FIG. 1 illustrates wireless clients in communication with a server in accordance with an embodiment of the present invention.
- FIG. 2 is a block diagram of a message server in accordance with an embodiment of the present invention.
- FIG. 3 illustrates a wireless client device in accordance with an embodiment of the present invention.
- FIG. 4 illustrates an emergency response application in accordance with an embodiment of the present invention.
- FIG. 5 illustrates a message viewed in an emergency response application in accordance with an embodiment of the present invention.
- FIG. 6 is a block diagram of a wireless client device in accordance with an embodiment of the present invention.
- FIG. 7 is a flow chart illustrating a method of operation of a message server in accordance with an embodiment of the present invention.
- FIG. 8 is a flow chart illustrating a main loop flow for a client device in accordance with an embodiment of the present invention.
- FIG. 9 illustrates airport departure procedures provided as an example of an embodiment of the present invention.
- The figures depict preferred embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
- FIG. 1 illustrates a system 100 for providing wireless collaboration in accordance with an embodiment of the present invention.
- System 100 includes a server 102 and wireless client devices 300a, 300b, 300c.
- Server 102 is in contact with client devices 300a, 300b and 300c via a wireless interface, for example a cellular network.
- Multiple client devices 300a, 300b, and 300c are illustrated to indicate that server 102 may be in contact with a plurality of client devices.
- For clarity of description, we refer generally to client device 300, though any number of client devices may be in operation and in communication with server 102.
- The operation of and interaction between server 102 and client device 300 is described further below.
- In one embodiment, geographically tagged messages are received and sent by client device 300 via its wireless network interface 310.
- Messages in one embodiment include latitude, longitude and elevation coordinates, and in one embodiment Extensible Markup Language (XML) standard geo-location tags are used.
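The patent does not publish a concrete message schema, but a minimal sketch of what such an XML geo-tagged message and its client-side parsing might look like follows; all tag names and sample values are illustrative assumptions, not taken from the specification.

```python
# Hypothetical geo-tagged message layout; tag names are assumptions, since
# the patent specifies only that latitude, longitude and elevation (plus
# time/date information) travel with each message as XML geo-location tags.
import xml.etree.ElementTree as ET

SAMPLE = """\
<message id="42" active="true">
  <sender>joe@rescue1</sender>
  <subject>Gas Leak</subject>
  <lat>29.9511</lat>
  <lon>-90.0715</lon>
  <elev>2.0</elev>
  <sent>2005-12-29T15:40:03Z</sent>
  <body>Strong smell of gas near the intersection.</body>
</message>
"""

def parse_message(xml_text):
    """Extract the fields a client needs to place a message in the scene."""
    root = ET.fromstring(xml_text)
    return {
        "id": int(root.get("id")),
        "active": root.get("active") == "true",
        "sender": root.findtext("sender"),
        "subject": root.findtext("subject"),
        "lat": float(root.findtext("lat")),
        "lon": float(root.findtext("lon")),
        "elev": float(root.findtext("elev")),
        "sent": root.findtext("sent"),
        "body": root.findtext("body"),
    }

print(parse_message(SAMPLE)["subject"])  # -> Gas Leak
```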
- Client device 300 presents those messages to a user on screen 302 as a graphics overlay on top of input from video camera 306 .
- As the user pans the device around her environment, she can see basic information about each message at its actual physical location on screen 302, combined with the real-world image captured by the camera 306.
- Such information can include, for example, a user-selected icon, message subject, coordinates, range and time information for each message.
- Messages in one embodiment are color- and size-coded and filtered according to distance, time, priority level, category, sender and other user-set criteria to facilitate navigation.
- Device 300 can also determine occlusion information for each message, and not present occluded messages or present them in an attenuated fashion by making them transparent when drawn or using a different color coding.
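A minimal sketch of how such user-set filtering and range-based color coding might be applied on the client follows; the thresholds, field names and color ramp are assumptions for illustration only.

```python
def display_subset(messages, max_range_m=5000.0, max_age_s=3600.0, min_priority=0):
    """Keep only messages that pass the user-set filter criteria."""
    return [m for m in messages
            if m["range_m"] <= max_range_m
            and m["age_s"] <= max_age_s
            and m["priority"] >= min_priority]

def icon_color(message, max_range_m=5000.0):
    """Color-code by range (near = red, far = blue); draw occluded
    messages translucent rather than hiding them outright."""
    t = min(message["range_m"] / max_range_m, 1.0)
    r, b = int(255 * (1.0 - t)), int(255 * t)
    alpha = 96 if message.get("occluded") else 255
    return (r, 64, b, alpha)
```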
- At any time, a user can expand a particular message and see its contents in more detail overlaid in front of the scene, for example by centering the desired message icon on the screen 302 (by pointing the camera 306 at it) and pushing a button 314 while the message is contained inside a selection target box located at the center of the screen 302.
- Other UI implementations for selecting and displaying a message can be used, as will be apparent to those of skill in the art.
- Similar to e-mail messages, the geographically tagged messages can contain, for example, text, audio, video, pictures or a hyperlinked URL to additional content, in addition to their coordinates and time and date information.
- After receiving a message, the user can add information to the message at a specific location for other users to see, edit its content, reply to the user who sent the message from her current location, send a new message to a given user, or post a message at a specific location for a given group of target users to see.
- In addition, if she desires, her current location can be sent as a continuously updated message, so other users can find her by simply panning their devices around from their current position—or she can find the location of any other members who are currently broadcasting their position by panning her camera around and seeing their icons over the real-world camera image.
- To facilitate input on the go, in one embodiment data in addition to text is supported: a voice message can be recorded as audio, and preset simple messages that cover commonly conveyed information, or a predetermined icon with a given contextual meaning, can be attached to a message without requiring keyboard input. Once read, the message can be closed to return to the camera input message interface, and deleted if desired.
- Client wireless device 300 captures input from video camera 306 in real time and paints it as a background on screen 302 .
- Client device 300 also determines its current location using geo-location device 308 , and its current view direction using view tracking device 312 .
- As noted, messages (including text or other data items) preferably include geo-location information and may be marked as active or not active.
- In one embodiment, server 102 determines the active status of each message and communicates the status to device 300.
- For each received message that is active and meets display criteria selected by the device user (e.g., based on the time the message was sent, the message's recipients, or a minimum importance), a screen space position is determined from the received coordinates and client device's 300 current location and view direction.
- If the computed screen space position is contained on screen 302, client device 300 renders the message source, subject, time and date at the proper screen position, in one embodiment using color to represent range, priority, age or other attributes of the message. As the user pans and moves the camera, the message locations follow their real-world screen-projected positions, allowing the user to associate each message with its location by looking around with the device.
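The screen-space computation is not spelled out in the text beyond the mention of a camera transformation; the following is a minimal flat-earth sketch that converts the coordinate difference to local east/north/up meters, rotates it by the tracked heading and pitch, and applies a perspective projection. The small-angle approximation and the 60-degree field of view are assumptions.

```python
import math

EARTH_R = 6371000.0  # mean Earth radius in meters

def enu_offset(dev, msg):
    """East/north/up offset in meters from device to message, using a
    flat-earth approximation that is adequate at typical message ranges."""
    north = math.radians(msg["lat"] - dev["lat"]) * EARTH_R
    east = (math.radians(msg["lon"] - dev["lon"]) * EARTH_R
            * math.cos(math.radians(dev["lat"])))
    return east, north, msg["elev"] - dev["elev"]

def project(dev, msg, heading_deg, pitch_deg, screen_w, screen_h, fov_deg=60.0):
    """Return the (x, y) pixel position of a message, or None if it is
    behind the camera or outside the current view."""
    e, n, u = enu_offset(dev, msg)
    h = math.radians(heading_deg)            # heading clockwise from north
    x = e * math.cos(h) - n * math.sin(h)    # camera-space right
    z = e * math.sin(h) + n * math.cos(h)    # camera-space forward
    p = math.radians(pitch_deg)
    zc = z * math.cos(p) + u * math.sin(p)   # forward after pitch
    yc = u * math.cos(p) - z * math.sin(p)   # up after pitch
    if zc <= 0.0:
        return None                          # behind the camera
    f = (screen_w / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    sx = screen_w / 2.0 + f * x / zc
    sy = screen_h / 2.0 - f * yc / zc        # screen y grows downward
    if 0.0 <= sx < screen_w and 0.0 <= sy < screen_h:
        return sx, sy
    return None
```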
- In one embodiment, messages are sent to client device 300 either when the user requests an update, at periodic intervals, or whenever a certain threshold for positional change is exceeded—for example, whenever device 300 moves more than 20 meters in any direction.
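A sketch of that update-triggering policy follows; the 20-meter figure comes from the text, while the periodic interval is an assumed placeholder.

```python
import math
import time

class UpdatePolicy:
    """Request a server update on demand, periodically, or whenever the
    device has moved beyond a positional-change threshold."""
    def __init__(self, distance_m=20.0, interval_s=30.0):
        self.distance_m = distance_m   # 20 m threshold from the text
        self.interval_s = interval_s   # periodic fallback (assumed value)
        self.last_pos = None
        self.last_time = 0.0

    def should_update(self, east_m, north_m, now=None):
        now = time.time() if now is None else now
        due = (self.last_pos is None
               or now - self.last_time >= self.interval_s
               or math.hypot(east_m - self.last_pos[0],
                             north_m - self.last_pos[1]) > self.distance_m)
        if due:
            self.last_pos = (east_m, north_m)
            self.last_time = now
        return due
```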
- In the embodiment illustrated in FIG. 1, messages are received from server 102; in a peer-to-peer environment, server 102 is not present, and messages are received from other client devices.
- A peer-to-peer embodiment is described further below.
- FIG. 2 illustrates an additional view of server 102 .
- Server 102 maintains a database 208 of all active messages and their coordinates.
- When client device 300 requests an update and sends its current location, server 102 creates an active message list for that device by determining the range of all messages targeted for that client device 300 or any of the user groups it belongs to, and adding the messages that are proximately located, i.e. that are closer than a reception radius threshold, to a range-sorted list.
- The reception radius threshold may be selected by a user of device 300 or by an operator of server 102, or some combination of the two.
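Sketched below is one way the server might build that range-sorted list, using a haversine great-circle distance; the record layout and addressing fields are assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * 6371000.0 * math.asin(math.sqrt(a))

def active_message_list(messages, device, groups, radius_m):
    """Range-sort the messages addressed to this device (or one of its
    user groups) and keep those inside the reception radius threshold."""
    in_range = []
    for m in messages:
        if m["to"] != device["id"] and m["to"] not in groups:
            continue
        d = haversine_m(device["lat"], device["lon"], m["lat"], m["lon"])
        if d <= radius_m:
            in_range.append((d, m))
    in_range.sort(key=lambda pair: pair[0])
    return [dict(m, range_m=d) for d, m in in_range]
```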
- In one embodiment, for each message in the range-sorted list, server 102 performs a line-of-sight query from the device's position to the message coordinates, using geometric database 206 of terrain elevation and three-dimensional models of structures and vegetation specific to that location, and then updates an occlusion attribute for the message that is specific to each device's settings.
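The patent describes the occlusion test as a geometric intersection against database 206 whose resulting range is compared with the point-to-point distance; the sketch below implements an equivalent check by sampling an assumed elevation-lookup function along the sight line.

```python
def line_of_sight(elev_at, p0, p1, steps=64):
    """March from device position p0 to message position p1 (each an
    (east, north, up) tuple in meters) and report False as soon as the
    terrain/structure model rises above the sight line. `elev_at(e, n)`
    stands in for a height query against geometric database 206."""
    for i in range(1, steps):
        t = i / steps
        e = p0[0] + t * (p1[0] - p0[0])
        n = p0[1] + t * (p1[1] - p0[1])
        u = p0[2] + t * (p1[2] - p0[2])
        if elev_at(e, n) > u:
            return False   # sight line passes below the model: occluded
    return True
```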
- Once updated, the message is placed into a message queue 204 to be sent to client device 300 via its wireless network interface 310.
- To economize bandwidth, partial updates are possible, in which only messages that have changed are sent, and indices to currently-stored local messages can be sent to re-order the list according to the current device location and selected filtering criteria.
- Client device 300 in one embodiment can participate in a peer-to-peer network without the presence of server 102 .
- In such an embodiment, client devices pass messages to each other until they reach the desired device(s).
- Each client (peer) device then performs the operations that would otherwise be performed by the server, including range-sorting and filtering messages according to its current location.
- In one embodiment, no occlusion information is generated in the peer-to-peer protocol if the client devices do not contain a geometric database of the area to query against.
- In another embodiment, messages can be sent to the client devices not only from other devices in the collaborative environment, but from any networked computer, by adding geographic tag information to the text of any e-mail, instant messenger post or web mail.
- In this case, server 102 receives the global network traffic and translates the incoming data into the proper message format.
- In a similar fashion, client devices can send geo-located messages to any networked computer as e-mail, and server 102 translates those into POP3, IMAP, SMTP or other suitable network data.
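The exact geotag syntax for ordinary e-mail text is not given; the translation step might look like the following sketch, which assumes a simple inline convention such as geo:lat,lon[,elev].

```python
import re

# Assumed inline convention, e.g. "geo:29.9511,-90.0715,2.0"; the patent
# only says that geographic tag information is added to the message text.
GEOTAG = re.compile(r"geo:(-?\d+\.?\d*),(-?\d+\.?\d*)(?:,(-?\d+\.?\d*))?")

def message_from_email(sender, subject, body):
    """Translate an incoming e-mail into the internal geo-located format."""
    tag = GEOTAG.search(body)
    if tag is None:
        return None  # no location information: cannot be placed in a scene
    return {
        "sender": sender,
        "subject": subject,
        "lat": float(tag.group(1)),
        "lon": float(tag.group(2)),
        "elev": float(tag.group(3)) if tag.group(3) else 0.0,
        "body": GEOTAG.sub("", body).strip(),
    }
```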
- User interface 304 of device 300 can also be used to view web content that includes geographical tags, thus providing a browser interface that can simplify interaction with any data that has inherent locality.
System Architecture
- FIG. 3 is a diagram of a wireless client device 300 in accordance with an embodiment of the present invention.
- Device 300 is a computing device with a graphics-capable screen 302 and a user interface 304 , and preferably includes at least one button 314 , a video camera 306 that can support live video input, and a wireless network interface 310 for supporting a connection such as Wi-Fi, Wi-Max, EDGE or WCDMA.
- Device 300 also preferably has a geo-location subsystem 308 that provides the latitude, longitude, approximate heading and altitude of the device 300 at regular intervals, in one embodiment at least once per second. In one embodiment this is supplied by a GPS receiver; alternatively, device 300 can use radiolocation by triangulating cell tower signals. One example of geo-location technology is Rosum Inc.'s TV-GPS triangulation-based GPS.
- Device 300 also includes a view tracking device 312 that determines the spatial orientation—i.e. which direction the camera is looking—of the device in real time.
- View tracking device 312 in one embodiment includes an inertial three-degree of freedom tracker such as those made by Intersense Inc. of Bedford, Mass.; alternatively a software-based image tracker is used on the captured video; or a magnetic tracker, gyroscope or any other method of determining real world orientation is suitable.
- Device 300 may be a tablet PC, laptop, pocket PC, PDA, smart phone, digital video camera, digital binoculars, laser range finder, GPS navigation device or other equipment that incorporates the described sub-components and functionality, including a graphics-capable screen and user interface.
- User interface 304 supports panning the device around in the same way a handheld camera is used. As the user points the camera 306 in different directions, messages are shown overlaid with the camera input on screen 302 at their actual location in the real world.
- In one embodiment, the message representation includes information on the subject, sender, time and distance.
- FIG. 4 illustrates an example in which device 300 is used as part of an emergency response operation.
- A real-time video display 406 shows a flooded area, with two messages 402, 404 overlaid on the image 406.
- One message 402 indicates that it is from joe@rescue1, sent at 15:40:03, with the text “Gas Leak”, at a distance of 0.1 miles; the other message 404 indicates that it is from mark@rescue1, sent at 12:30:00, reads “Structural Damage” and is located at a distance of 0.5 miles.
- As the user pans the device 300 around the scene, each message icon moves to reflect its actual real-world position in relation to the current location of device 300, as determined from the GPS positioning device 308 and orientation tracking device 312.
- User-determined sorting criteria filter the messages down to only the relevant subset—including distance, type, priority, sender, recipient list, time and other factors.
- In FIG. 4, for example, the full message 404 descriptor can be seen now that the user has panned the device, and reads mark@rescue1 12:30:00 Structural Damage 0.5 miles.
- In one embodiment, by centering a message on the crosshair at the center of the screen and clicking button 314 in the user interface 304, the user can expand the message and see its full contents overlaid on the camera input. This is illustrated, for example, in FIG. 5.
- The user can then view any attached files, edit the message, post a reply (from her current location, at the message location, or at a different place), or remove the message.
- Client device 300 sends update requests containing current geo-location and view data to server 102 , and server 102 responds with updated range sorted message data according to the client device's current location.
- Device 300 can also send message updates for server 102 to store in global message database 208 if required, including new messages added by the user on the client device, or existing message updates or replies.
- Server 102 can also be connected to the Internet for interfacing with other messaging systems and accessing other geo-located web content that can be displayed on the scene as well.
- In one embodiment, server 102 and client device 300 use XML-based message data for networked communications, transmitted using standard Internet protocols such as HTTP, SOAP and WSDL.
- In one embodiment, the delivery system uses an HTTP server such as the Apache HTTP Server.
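An update request over plain HTTP might then be as simple as the following sketch; the URL, payload layout and reply handling are assumptions rather than a protocol defined by the patent.

```python
import urllib.request

def request_update(server_url, lat, lon, elev, heading_deg):
    """POST the device's location and view direction as XML and return
    the server's range-sorted message list, also as XML text."""
    payload = (f"<update><lat>{lat}</lat><lon>{lon}</lon>"
               f"<elev>{elev}</elev><heading>{heading_deg}</heading></update>")
    req = urllib.request.Request(
        server_url,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/xml"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8")
```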
- In one embodiment, client device software components map to the FIG. 3 hardware components as illustrated in FIG. 6.
- FIG. 6 includes a local message database 614 that caches the messages pertinent to each client, and a central message manager 602 that arbitrates the use of all the other components.
- A geo-location manager 608 controls interfacing with the geo-location device 308 of FIG. 3, and a view tracking manager 610 interacts with the view tracking device 312; these, along with a camera image capture manager 604, a graphics rendering engine 606, a user interface manager 612 and a wireless client network manager 616, are all connected to central message manager 602.
- Server 102, in addition to global message database 208 that stores messages for all users, has a message queue 204 specific to each client (described further below with respect to FIG. 7), which is sorted by range from each message's position to the client device's current location, and in which each message is particularized for the client, including view occlusion information.
- In order to determine view occlusion factors, server 102 includes elevation database 206 for terrain, buildings and other cultural features such as bridges and water towers.
- Message management module 202 determines a geometric intersection from a device's location to the coordinate of each message, and by comparing the resulting range with the actual distance between the two points, determines whether the message is visible from the device's position. This can be used to occlude or modify the appearance of the message when displayed.
- Message management module 202 arbitrates the interaction of all the components of the server message system, including building each device's current target message queue 204 , and network server module 210 asynchronously communicates with all the target devices 300 as data becomes available.
- Network server module 210 provides message data updates to the global message database 208 , but uses each device's specific message queue 204 to send data to the client device 300 .
- The role of server 102 can in alternative embodiments be fulfilled by the client devices, building a peer-to-peer network in which clients share all the messages that they have in common, combining their local message databases.
- FIG. 7 illustrates a flow chart for a processing loop executed by server 102 for a given device 300 update.
- Server 102 initially receives 702 an update request from the client device 300 that includes the device's current geographic location coordinates and view direction information, as well as content filtering settings.
- Filtering settings supplied by device 300 may include, for example, a maximum range to display, or a setting to include only messages posted in the last hour, only messages marked as urgent or danger alerts, or only messages addressed directly to the user or her group.
- Server 102 then retrieves 704 all relevant messages from global message database 208 , which includes messages for all users of the system, and proceeds to generate 705 a range-sorted target message list for the particular device, in accordance with the device's filtering settings.
- For each message, a line-of-sight query against elevation database 206 from the device's position to the message's location is performed 724. If 726 the computed range is less than the distance between the two positions, minus a user-determined error threshold (such as 50 meters in one example), the message is marked as not directly visible from the current location.
- The dynamic message attributes (such as range, current relevance ranking or visibility) are then updated 712, and if the message is determined 714 to be active, it is added 716 to an active message queue to be sent to the target device.
- Server 102 then sends 720 the active queue to client device 300 and, if available, receives 722 any message database updates from client device 300 and stores them in global database 208.
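Putting the FIG. 7 steps together, one pass of the server loop for a single request might look like the sketch below, reusing the active_message_list and line_of_sight helpers from the earlier sketches; the request layout, filter fields and send callback are assumptions.

```python
def process_update_request(messages, elev_at, send, req):
    """One pass of the FIG. 7 loop: retrieve, filter, range-sort, test
    line of sight, queue active messages, deliver, and collect client
    updates for storage in the global message database."""
    dev, filt = req["device"], req["filters"]
    queue = []
    for m in active_message_list(messages, dev, req["groups"],
                                 filt["max_range_m"]):
        if m["age_s"] > filt["max_age_s"]:
            continue                           # device filtering settings
        m["visible"] = line_of_sight(elev_at, dev["enu"], m["enu"])  # 724/726
        if m["active"]:
            queue.append(m)                    # active message queue (716)
    send(dev["id"], queue)                     # send the active queue (720)
    return req.get("message_updates", [])      # store in global database (722)
```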
- To economize bandwidth when sending the active queue, server 102 may send only a partial subset of its contents; in one embodiment, only the information that has changed since the last update is sent.
- FIG. 8 illustrates a main loop flow for a client device 300 in message display mode.
- Device 300 determines 802 its current geo-location from geo-location manager 608 and its current view direction from view tracking manager 610. If the location and view direction, or the time elapsed since the last update, are such that 804 a server update should take place, device 300 sends 806 its location and an update request to server 102.
- An update is triggered in one embodiment either by a direct request via user interface 304, or by a pre-determined time or distance threshold being exceeded. Regardless of whether server communication is in progress, camera image capture module 604 continuously captures 812 input from video camera 306 and operation proceeds to step 814, described below.
- While waiting for a response from server 102, device 300 captures 808 the camera video input from video camera 306, and graphics rendering engine 606 displays it as a background on screen 302 to render over.
- Device 300 then receives 810 message data from server 102 via wireless network interface 310, and updates its internal message database 614 with the information. Next, device 300 determines 814 a camera transformation for the current location and view direction, which may be embodied as an affine matrix well known to those skilled in the art; this transformation is used to determine whether messages are currently visible on the screen, and the actual screen position of each message at its real-world location.
- For each message, a 3-D transformation for its position relative to the current device location is determined 828, and its visibility is checked 830 against the current view volume (frustum) defined by the camera transformation.
- If visible, the message is then rendered 832 as an icon with text including the sender, subject, time and range, at the screen-space projected position of the 3-D location determined in the previous step.
- Finally, device 300 checks the user interface 304 inputs and updates 822 the on-screen data overlay display; if 824 message updates have occurred or a new message has been generated, it sends 826 an update to server 102 including any changed data. If a user of device 300 has composed a message destined for another device, the message is in one embodiment included in the update request sent to server 102. Following completion of the update, another iteration begins at step 802.
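The whole FIG. 8 loop can be summarized in one frame function; this is a structural sketch rather than the patent's implementation, in which the device objects (gps, tracker, camera, server, screen) are duck-typed stand-ins for the FIG. 6 managers, and project and UpdatePolicy are reused from the earlier sketches.

```python
def client_frame(state, gps, tracker, camera, server, screen):
    """One iteration of the FIG. 8 main loop (stand-in interfaces)."""
    pos = gps.position()          # 802: {"lat":.., "lon":.., "elev":.., "enu": (e, n, u)}
    view = tracker.orientation()  # 802: {"heading": deg, "pitch": deg}
    if state.policy.should_update(*pos["enu"][:2]):        # 804: threshold test
        server.send_request(pos, view, state.filters)      # 806: update request
        state.local_db = server.receive_messages()         # 810: refresh database 614
    screen.draw_background(camera.capture())               # 808/812: live video
    for m in state.local_db:                               # 814/828: per-message transform
        xy = project(pos, m, view["heading"], view["pitch"],
                     screen.w, screen.h)                   # 830: frustum/visibility check
        if xy is not None:
            screen.draw_icon(m, xy)                        # 832: icon plus text overlay
    screen.handle_input()                                  # 822: UI and overlay update
    if state.outbox:                                       # 824: pending local edits?
        server.post(state.outbox)                          # 826: send changed data
        state.outbox.clear()
```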
- One application of the present invention is as a tool for collaboration by an emergency response team.
- The system can be used to facilitate team collaboration in a large-scale search and rescue operation, such as the one that took place in New Orleans in the wake of Hurricane Katrina, or one that would be needed in the event of a large earthquake hitting the San Francisco Bay Area.
- A rescue team's effectiveness and safety can be greatly increased by enhancing communication between its members.
- The ability for a team member to quickly determine the current location of other team members, to identify the actual physical location and immediate needs of victims who require help in real time, and to share information with other team members about what is happening at a given physical location can be invaluable. This is particularly true when compared to conventional radio communications, which require each team member to keep track of all information as it arrives since there is no storage, or to even more rudimentary methods such as the information spray-painted on the doors of houses and buildings in New Orleans to identify which locations had been searched.
- Each team member carries a wireless computing device 300, connected in one embodiment to server 102 via a cell data network connection or packet radio, or alternatively connected to a laptop installed in a local vehicle and acting as a server, via a local mid-range network such as Wi-Max. An advantage of the mobile laptop server is its total independence from power and communication infrastructure availability.
- By panning the device around, each member can see on the screen both the position and distance of every other member of the team and, if enabled, the physical location of the 911 calls routed to their particular team as they happen in real time, derived either by correlating fixed phone numbers to street addresses and those to geo-position coordinates, or by direct geo-location of cell phone emergency calls.
- A team member in charge of coordinating the effort can see all the message and team member positions on a global map—for example, on the laptop that acts as the local server—and can dispatch instructions to each member of the team to ensure everything is covered with the maximum efficiency and safety possible. These instructions can be displayed as notes on each person's device 300 at a given location, or presented on the screen as an information overlay visible over the camera input.
- Team members can place a message at a specific location for a given team member or group, or for everybody to see—for example, information on accessibility from a given direction, places that have already been searched for survivors in need of help, notes on how changes in the environment or the weather may affect their mission, or any other information that can help other team members.
- The message can be placed at the current location of the device 300, projected into the scene using an elevation database 206 for range determination, or placed at a distance determined by the user along the chosen view direction.
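Placing a message at a user-chosen distance along the current view direction is a small amount of geometry; a sketch in the same local east/north/up frame used in the earlier sketches follows.

```python
import math

def place_along_view(dev_enu, heading_deg, pitch_deg, range_m):
    """Return the (east, north, up) position, in meters, that lies
    range_m along the current view direction from the device."""
    h, p = math.radians(heading_deg), math.radians(pitch_deg)
    east = dev_enu[0] + range_m * math.cos(p) * math.sin(h)
    north = dev_enu[1] + range_m * math.cos(p) * math.cos(h)
    up = dev_enu[2] + range_m * math.sin(p)
    return east, north, up
```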
- These messages act as virtual sticky notes that team members can add for others to see as they gather new information in the field.
- A team member with an overall view of the scene can add notes at different locations with information on things such as accessibility, potential problems ahead, or additional people in need of help who may not be visible from the locations of the team members in the field. These notes then become visible on devices 300 as the relevant locations become viewable.
- Information posted at a given location can in one embodiment be edited by other team members, enabling refinement of the information by combining the observations of multiple members in real time—for example, once a given victim has been successfully reached, that information will be visible to everybody else on the team in real time. And in a disaster relief effort, information about which places have been searched and what is needed at each location is immediately visible as well at all times.
- The present invention thus enables collaboration among location-deployed teams in ways not possible before. It constitutes a collective memory for the entire team that can enhance the team's effectiveness without a significant efficiency impact on each individual member.
- FIG. 9 illustrates an application of the present invention to airport flight procedures.
- Procedure illustrations can be overlaid in real time to assist pilots in flying approaches and departures, complying with noise abatement procedures, and the like.
- In FIG. 9, which illustrates a departure procedure for runway 28 at an airport, the arrows 902 illustrating the actual departure procedure, the departure waypoint 904 and the “residential area” label 906 are rendered, while the remainder of the image 900 is captured by the camera.
- As the aircraft moves off-axis, the overlays move in real time so that they stay registered with the real-world scene.
- The claimed invention thus presents an improved mobile-device collaborative environment that simplifies interaction for field operations.
- The present invention has been described in particular detail with respect to a limited number of embodiments. Those of skill in the art will appreciate that the invention may additionally be practiced in other embodiments.
- The particular naming of the components, capitalization of terms, attributes, data structures, or any other programming or structural aspect is not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, formats, or protocols.
- The system may be implemented via a combination of hardware and software, as described, or entirely in hardware elements.
- The particular division of functionality between the various system components described herein is merely exemplary, and not mandatory; functions performed by a single system component may instead be performed by multiple components, and functions performed by multiple components may instead be performed by a single component.
- For example, the particular functions of the map data provider, map image provider and so forth may be provided in one module or many.
- Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by real time network operating systems.
- The present invention also relates to an apparatus for performing the operations herein.
- This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
- A computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
- The computers referred to in the specification may include a single processor, or may employ architectures with multiple processor designs for increased computing capability.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
- This application claims the benefit of U.S. Provisional Application Ser. No. 60/755,732, filed on Dec. 29, 2005, which is incorporated by reference herein in its entirety.
- 1. Field of the Invention
- The present invention relates generally to the field of computer graphics and networked wireless communication.
- 2. Description of the Related Art
- The World-Wide-Web, e-mail and instant messaging have revolutionized team collaboration, together with applications such as Lotus Notes and Microsoft Groove. The availability of mobile computing devices, together with access to the Global Positioning System (GPS) network are starting to bring the tools available for fixed location team collaboration to any place at any time. Devices such as the RIM Blackberry and the Palm Treo have brought e-mail to mobile devices with great success, but their effectiveness is certainly limited compared to the environment available to systems designed for use in fixed locations, in many cases due to interface and usability problems. Tools that work exceptionally well in an office environment prove themselves grossly inadequate for the same tasks when they need to be performed in the field and, in many cases, under pressure and adverse circumstances, as is often the case for rescue teams, military operations, law enforcement, infrastructure repair crews, and other teams that need to get a job done quickly and with efficient coordination. Currently, those teams typically rely on radio or cellular network communications without the capability of storing information to be shared; or they use text-based messaging/electronic mail systems that are hard to integrate with what is happening on the field and where it is taking place.
- In addition, browsing the World Wide Web on mobile devices is a much less rewarding experience than doing so on larger computers. Small screens, cumbersome interfaces and slow update speeds limit the usability of mobile devices.
- Recent trends in Internet content generation have seen the appearance of geotags—XML fields added to a web page that provide exact latitude and longitude coordinates. All of this is fostering developments in Internet mapping and cartography, from the original Internet maps to advanced applications such as Google Maps, Goggle Earth and Microsoft's TerraServer.
- These applications use traditional maps and computer graphics renderings of real world satellite imagery to allow users to view and navigate locations, access content and interact with the information available on the web with geographic locality. The appearance of geolocation tags on web content are enabling applications to be used not just for mapping but to display Internet search results for localized areas—services such as Yahoo! and Google Local allow the user to search for restaurants close to a given location, and display the returned results on a map.
- Maps are ideal for fixed-location computing with large displays, but the small screen sizes and interfacing constraints of mobile devices can limit their usability in mobile applications. In addition, a map has to be interpreted by the user and reconciled with her actual position in the real world, sometimes requiring significant effort to understand fully the information represented on the map.
- Military aircraft have long incorporated a different type of display, the Heads Up Display (HUD), where a representation of the aircraft instruments is displayed on a see-through mirror and superimposed over the out-the-window scene the pilot sees through the aircraft's canopy. HUD systems have repeatedly proven to increase pilot effectiveness and response time. Recently, HUD systems have appeared on civil aircraft and even in automobiles.
- Augmented reality is a branch of computer graphics that focuses on the incorporation of interactively-rendered imagery into real-world scenes. In most cases, it is implemented by using see-through head-mounted displays (HMDs) where the user can see both the real world surrounding her and a perspective-matched computer graphics rendering of objects in the scene. The field was pioneered by Ivan Sutherland, who introduced the first see-through HMD in 1968.
- Augmented reality has been used for applications such as aircraft maintenance training and navigation in complex environments such as a factory floor, where the user can see information displayed over the real scene, annotating the real world. Some recent projects such as “A Touring Machine” developed at Columbia University in 1997 allow annotation of real world locations and interaction with geographically tagged database content on a transportable computing device.
- While some existing wireless data communications tools such as text messaging, e-mail and instant messaging can be useful, making use of those while deployed in the field is cumbersome and inefficient. A limitation of these systems is that even though the information shared might have relevance to a specific physical location, these systems do not adapt the presentation of the information according to the perspective from one's location. Representing geographically tagged data on a map can improve the efficiency and has been used by certain DARPA military unit test wireless communication systems, but this forces the team members to constantly re-interpret the map and its correspondence to the real world scenario around them as they move, something made harder by the small screen real estate available on mobile devices.
- The present invention provides a system having advantages associated with a heads-up display as well as augmented reality technology allowing interaction within a collaborative environment similar to e-mail or instant messaging but with geographic locality, enabling teams to share information while on location with the same flexibility and immediacy that e-mail and instant messaging have brought to fixed location, office-based teams.
- A system in accordance with the present invention includes a server in communication with one or more client devices over a network such as a cellular telephone network. A client device includes a video capture device such as a video camera, which displays a live captured image on a screen of the client device. Data received from the server or other devices is overlaid on the live image in real time. In this way, the client device functions as a window between the real world and the virtual world of a networked collaborative environment by fusing data from the virtual world with live video from the device's camera. Users of this system can gaze through this window by pointing their device at areas of interest in their real environment and viewing the scene on the device's screen as with a common video camera's viewfinder, but with messages and data from the virtual world overlaid on the real scene. The user can interact with others in the collaborative environment by accessing and creating messages and data presented via the client device window.
- The present invention simplifies team collaboration on mobile devices, by allowing users to access and create geographically tagged information. A system in one embodiment uses a wireless computing device as its delivery platform, connected to a server system and/or other wireless devices over a network. The wireless device is also equipped with a high resolution display capable of rendering real time graphics, a video camera, a geo-location device that provides its current position (such as a GPS receiver or a radiolocation device using triangulation of cell phone network base station signals), and a view tracking system (such as an inertial tracker or a software based image tracker) that determines the orientation of its camera in real time. In one embodiment, the present invention includes a networked client device and a server side application; alternatively the functionality provided by the server can be carried out by client devices in the case of a peer-to-peer network.
-
FIG. 1 illustrates wireless clients in communication with a server in accordance with an embodiment of the present invention. -
FIG. 2 is a block diagram of a message server in accordance with an embodiment of the present invention. -
FIG. 3 illustrates a wireless client device in accordance with an embodiment of the present invention. -
FIG. 4 illustrates an emergency response application in accordance with an embodiment of the present invention. -
FIG. 5 illustrates a message viewed in an emergency response application in accordance with an embodiment of the present invention. -
FIG. 6 is a block diagram of a wireless client device in accordance with an embodiment of the present invention. -
FIG. 7 is a flow chart illustrating a method of operation of a message server in accordance with an embodiment of the present invention. -
FIG. 8 is a flow chart illustrating a main loop flow for a client device in accordance with an embodiment of the present invention. -
FIG. 9 illustrates airport departure procedures provided as an example of an embodiment of the present invention - The figures depict preferred embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
-
FIG. 1 illustrates asystem 100 for providing wireless collaboration in accordance with an embodiment of the present invention.System 100 includes aserver 102 andwireless client devices server 102 is in contact withclient devices Multiple client devices server 102 may be in contact with a plurality of client devices. For clarity of description, we refer generally toclient device 300, though any number of client devices may be in operation and communication withserver 102. The operation of and interaction betweenserver 102 andclient device 300 is described further below. - In one embodiment, geographically tagged messages are received and sent by
client device 300 via itswireless network interface 310. Messages in one embodiment include latitude, longitude and elevation coordinates, and in one embodiment Extensible Markup Language (XML) standard geo-locations tags are used. -
Client device 300 presents those messages to a user onscreen 302 as a graphics overlay on top of input fromvideo camera 306. As the user pans the device around her environment, she can see basic information about each message at its actual physical location onscreen 302, combined with the real world image captured by thecamera 306. Such information can include, for example, a user-selected icon, message subject, coordinates, range and time information for each message. Messages in one embodiment are color- and size-coded and filtered according to distance, time, priority level, category, sender and other user-set criteria to facilitate navigation.Device 300 can also determine occlusion information for each message, and not present occluded messages or present them in an attenuated fashion by making them transparent when drawn or using a different color coding. - At any time, a user can expand a particular message and see its contents in more detail overlaid in front of the scene, for example by centering the desired message icon on the
screen 302 by pointing thecamera 306 at it, and pushing abutton 314 while the message is contained inside a selection target box located at the center of thescreen 302. Other UI implementations for selecting and displaying a message can be used, as will be apparent to those of skill in the art. Similar to e-mail messages, the geographically tagged messages can contain, for example, text, audio, video, pictures or a hyperlinked URL to additional content, in addition to their coordinates and time and date information. - After receiving a message, the user can add information to the message at a specific location for other users to see, edit its content, reply to the user who sent the message from her current location, send a new message to a given user, or post a message for a given group of target users at a specific location to see. In addition, if she desires, her current location can be sent as a continuously updated message, so other users can find her location by just panning around their devices from their current position—or she can find the location of any other members who are currently broadcasting their position by panning around her camera and seeing their icons over the real-world camera image.
- To facilitate input on the go, in one embodiment data in addition to text is supported. For example, a voice message can be recorded as audio; preset simple messages that cover commonly conveyed information, or a predetermined icon with a given contextual meaning can also be attached to a message without requiring keyboard input. Once read, the message can be closed to return to the camera input message interface, and deleted if desired.
-
Client wireless device 300 captures input fromvideo camera 306 in real time and paints it as a background onscreen 302.Client device 300 also determines its current location using geo-location device 308, and its current view direction usingview tracking device 312. - As noted, messages (including text or other data items) preferably include geo-location information and may be marked as active or not active. In one embodiment,
server 102 determines the active status of each message and communicates the status todevice 300. For each received message that is active and meets display criteria selected by the device user, e.g., based on a time the message was sent, the message's recipients, messages of a certain importance, etc., a screen space position is determined from the received coordinates and client device's 300 current location and view direction. Using color in one embodiment to represent range, priority, age or other attributes of the message, if the computed screen space position is contained onscreen 302,client device 300 renders the message source, subject, time and date at the proper screen position. As the user pans and moves the camera, the message locations follow their real-world screen projected positions, allowing the user to associate each message with its location by looking around with the device. - In one embodiment, messages are sent to
client device 300 either when the user requests an update, at periodic intervals or whenever a certain threshold for positional change is exceeded—for example, wheneverdevice 300 moves more than 20 meters in any direction. In the embodiment illustrated inFIG. 1 , messages are received fromserver 102; in a peer-to-peer environment,server 102 is not present, and messages are received from other client devices. A peer-to-peer embodiment is described further below. -
FIG. 2 illustrates an additional view ofserver 102.Server 102 maintains adatabase 208 of all active messages and their coordinates. Whenclient device 300 requests an update and sends its current location to theserver 102,server 102 creates an active message list for that device by determining the range of all messages targeted for thatclient device 300 or any of the user groups it belongs to, and adding the messages that are proximately located, i.e. that are closer than a reception radius threshold, to a range sorted list. The reception radius threshold may be selected by a user ofdevice 300 or by an operator ofserver 102, or some combination of the two. - In one embodiment, for each message in the range-sorted list,
server 102 determines a line-of-sight query from the device's position to the message coordinates, usinggeometric database 206 of terrain elevation and three-dimensional models of structures and vegetation specific to that location, and then updates an occlusion attribute for the message that are specific to each device's settings. - Once updated, the message is placed into a
message queue 204 to be sent toclient device 300 via itswireless network interface 310. To economize bandwidth, partial updates are possible where only messages that change are sent, and where indices to currently-stored local messages can be sent to re-order the list according to the current device location and selected filtering criteria. -
Client device 300 in one embodiment can participate in a peer-to-peer network without the presence ofserver 102. In such an embodiment, client devices pass messages to each other until they reach the desired device(s). In such an embodiment, each client (peer) device performs operations that would otherwise be performed by the server, including range-sorting and filtering messages according to its current location. In one embodiment, no occlusion information is generated in the peer-to-peer protocol, if the client devices do not contain a geometric database of the area to query against. - In another embodiment, messages can be sent to the client devices not only from other devices in the collaborative environment, but from any networked computer by adding geographic tag information to the text of any e-mail, instant messenger post or web mail. In this case,
server 102 receives the global network traffic and translates the incoming data into the proper messages format. In a similar fashion, client devices can send geo-located messages to any networked computer as e-mail, andserver 102 translates those into pop3, IMAP, SMTP or other suitable network data. -
User interface 304 ofdevice 300 can also be used to view web content that includes geographical tags, thus providing a browser interface that can simplify interaction with any data that has inherent locality. - System Architecture
-
FIG. 3 is a diagram of awireless client device 300 in accordance with an embodiment of the present invention.Device 300 is a computing device with a graphics-capable screen 302 and auser interface 304, and preferably includes at least onebutton 314, avideo camera 306 that can support live video input, and awireless network interface 310 for supporting a connection such as Wi-Fi, Wi-Max, EDGE or WCDMA.Device 300 also preferably has a geo-location subsystem 308 that provides the latitude, longitude, approximate heading and altitude of thedevice 300 at regular intervals, in one embodiment at least once per second. In one embodiment this is supplied by a GPS receiver, and alternativelydevice 300 can also use radiolocation by triangulating cell tower signals. One example of geo-location technology is Rosum Inc.'s TV-GPS triangulation based GPS.Device 300 also includes aview tracking device 312 that determines the spatial orientation—i.e. which direction the camera is looking—of the device in real time.View tracking device 312 in one embodiment includes an inertial three-degree of freedom tracker such as those made by Intersense Inc. of Bedford, Mass.; alternatively a software-based image tracker is used on the captured video; or a magnetic tracker, gyroscope or any other method of determining real world orientation is suitable. -
Device 300 may be a tablet PC, laptop, pocket PC, PDA, smart phone digital video camera, digital binoculars, laser range finder, GPS navigation device or other equipment that incorporates the described sub-components and functionality, including a graphics-capable screen and user interface. -
User interface 304 supports panning the device around in the same way a handheld camera is used. As the user points thecamera 306 in different directions, messages are shown overlaid with the camera input onscreen 302 at their actual location in the real world. In one embodiment, the message representation includes information on the subject, sender, time and distance. For example,FIG. 4 illustrates an example in whichdevice 300 is used as part of an emergency response operation. A real-time video display 406 shows a flooded area, with twomessages image 406. Onemessage 402 indicates that it is from joe@rescue1, sent at 15:40:03, and having text “Gas Leak” at a distance of 0.1 miles; theother message 404 indicates that it is from mark@rescue1, sent at 12:30:00, reads “Structural Damage” and is located at a distance of 0.5 miles”. - As the user pans the
device 300 around the scene, the message icons move to reflect their actual real-world positions in relation to the current location of device 300, as determined from the GPS positioning device 308 and orientation tracking device 312. User-determined sorting criteria—including distance, type, priority, sender, recipient list, time and other factors—filter the messages down to the relevant subset. In FIG. 4, for example, the full message 404 descriptor can be seen now that the user has panned the device, and reads mark@rescue1 12:30:00 Structural Damage 0.5 miles.
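A compact sketch of such filtering and range sorting appears below; the criteria keys and message fields are assumptions for illustration.

```python
def relevant_messages(messages, criteria):
    """Keep only messages matching the user's criteria, sorted nearest-first."""
    kept = [
        m for m in messages
        if m["range_miles"] <= criteria.get("max_range_miles", float("inf"))
        and (criteria.get("priority") is None or m["priority"] == criteria["priority"])
        and (criteria.get("sender") is None or m["sender"] == criteria["sender"])
    ]
    return sorted(kept, key=lambda m: m["range_miles"])

messages = [
    {"sender": "joe@rescue1", "priority": "urgent", "range_miles": 0.1},
    {"sender": "mark@rescue1", "priority": "normal", "range_miles": 0.5},
]
print(relevant_messages(messages, {"max_range_miles": 1.0, "priority": "urgent"}))
```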
In one embodiment, by centering a message on the crosshair at the center of the screen and clicking button 314 in the user interface 304, the user can expand the message and see its full contents overlaid on the camera input. This is illustrated, for example, in FIG. 5. The user can then view any attached files, edit the message, post a reply (from her current location, at the message location or at a different place), or remove the message. -
Client device 300 sends update requests containing current geo-location and view data to server 102, and server 102 responds with updated, range-sorted message data according to the client device's current location. Device 300 can also send message updates for server 102 to store in global message database 208 if required, including new messages added by the user on the client device, or updates and replies to existing messages. Server 102 can also be connected to the Internet for interfacing with other messaging systems and accessing other geo-located web content that can be displayed on the scene as well. - In one embodiment,
server 102 and client device 300 use XML-based message data for networked communications, transmitted using standard Internet protocols such as HTTP, SOAP and WSDL. In one embodiment, the delivery system uses an HTTP server such as the Apache HTTP Server.
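The fragment below sketches what such an XML update request might look like when built with Python's standard library; the element and attribute names are assumptions, since the patent specifies XML transport but not a schema.

```python
import xml.etree.ElementTree as ET

def build_update_request(device_id, lat, lon, heading, max_range_miles):
    """Serialize a client update request as XML (hypothetical schema)."""
    req = ET.Element("updateRequest", device=device_id)
    ET.SubElement(req, "position", lat=str(lat), lon=str(lon), heading=str(heading))
    ET.SubElement(req, "filter", maxRangeMiles=str(max_range_miles))
    return ET.tostring(req, encoding="unicode")

print(build_update_request("dev-300", 29.9511, -90.0715, 270.0, 5.0))
# <updateRequest device="dev-300"><position ... /><filter maxRangeMiles="5.0" /></updateRequest>
```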
In one embodiment, client device software components map to the FIG. 3 hardware components as illustrated in FIG. 6. FIG. 6 includes a local message database 614 that caches the messages pertinent to each client, and a central message manager 602 that arbitrates the use of all the other components. A geo-location manager 608 interfaces with the geo-location device 308 of FIG. 3; a view tracking manager 610 interacts with the view tracking device 312; and a camera image capture manager 604, a graphics rendering engine 606, a user interface manager 612 and a wireless client network manager 616 are all connected to central message manager 602.
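In code form, that composition might be sketched as follows; the class and attribute names simply mirror the figure and are not prescribed by the patent.

```python
class CentralMessageManager:
    """Arbitrates the FIG. 6 client components (602 in the figure)."""

    def __init__(self, geo, view, camera, renderer, ui, net, local_db):
        self.geo = geo              # geo-location manager 608 -> device 308
        self.view = view            # view tracking manager 610 -> device 312
        self.camera = camera        # camera image capture manager 604
        self.renderer = renderer    # graphics rendering engine 606
        self.ui = ui                # user interface manager 612
        self.net = net              # wireless client network manager 616
        self.local_db = local_db    # local message database 614
```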
Server 102, in addition to global message database 208 that stores messages for all users, has a message queue 204 specific to each client, described further below with respect to FIG. 7. Each queue is sorted by range from each message's position to the client device's current location, and each message in it is particularized for the client, including view occlusion information. - In order to be able to determine view occlusion factors,
server 102 includes elevation database 206 covering terrain, buildings and other cultural features such as bridges and water towers. Message management module 202 computes a geometric intersection along the line from a device's location to the coordinates of each message and, by comparing the resulting range with the actual distance between the two points, determines whether the message is visible from the device's position. This can be used to occlude or modify the appearance of the message when displayed.
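The sketch below illustrates one way such a line-of-sight test could be written; modeling the elevation database as a simple (x, y) → height callable and sampling the ray at fixed steps are simplifying assumptions.

```python
import math

def line_of_sight_range(p0, p1, elevation, steps=100):
    """Walk the sight line from device position p0 to message position p1
    (both (x, y, z) in meters) and return the distance at which terrain
    first blocks it, or the full distance if the line is unobstructed."""
    (x0, y0, z0), (x1, y1, z1) = p0, p1
    total = math.dist(p0, p1)
    for i in range(1, steps + 1):
        t = i / steps
        x = x0 + t * (x1 - x0)
        y = y0 + t * (y1 - y0)
        z = z0 + t * (z1 - z0)
        if elevation(x, y) > z:          # sight line dips below the surface
            return t * total
    return total

def is_occluded(p0, p1, elevation, threshold_m=50.0):
    # Marked not directly visible when the computed range falls short of the
    # true distance by more than a user-determined error threshold (cf. the
    # 50-meter example in the text).
    return line_of_sight_range(p0, p1, elevation) < math.dist(p0, p1) - threshold_m
```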
Message management module 202 arbitrates the interaction of all the components of the server message system, including building each device's current target message queue 204, and network server module 210 asynchronously communicates with all the target devices 300 as data becomes available. Network server module 210 provides message data updates to the global message database 208, but uses each device's specific message queue 204 to send data to the client device 300. - The role of
server 102 can, in alternative embodiments, be fulfilled by the client devices themselves, which build a peer-to-peer network in which clients share all the messages they have in common, combining their local message databases. -
FIG. 7 illustrates a flow chart for a processing loop executed by server 102 for a given device 300 update. Server 102 initially receives 702 an update request from the client device 300 that includes the device's current geographic location coordinates and view direction information, as well as content filtering settings. Filtering settings supplied by device 300 may include, for example, a maximum range to display, or settings to include only messages posted in the last hour, only messages marked urgent or as danger alerts, or only messages addressed directly to the user or the user's group. -
Server 102 then retrieves 704 all relevant messages from global message database 208, which includes messages for all users of the system, and proceeds to generate 705 a range-sorted target message list for the particular device, in accordance with the device's filtering settings. - For each 706 message in the range-sorted target list, if 710 view occlusion is enabled, a line-of-sight query against
elevation database 206 from the device's position to the message's location is computed 724. If 726 the resulting range is less than the distance between the two positions, minus a user-determined error threshold (such as 50 meters in one example), the message is marked as not directly visible from the current location. - The dynamic message attributes (such as range, current relevance ranking and visibility) are then updated 712, and if the message is determined 714 to be active, it is added 716 to an active message queue to be sent to the target device.
- Once the entire range-sorted list has been processed 718,
server 102 sends 720 the active queue to client device 300 and, if available, receives 722 any message database updates from client device 300, storing them in global database 208. In one embodiment, server 102 sends only a partial subset of the queue's contents in order to conserve bandwidth; for example, only the information that has changed since the last update is sent.
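Putting the FIG. 7 steps together, one plausible rendering of the per-device server loop is sketched below; the request and message dictionary fields are assumptions, and line_of_sight_range is the helper sketched earlier.

```python
import math

def process_device_update(request, global_db, elevation, threshold_m=50.0):
    """One pass of the FIG. 7 loop for a single device 300 update."""
    here = request["position"]                    # (x, y, z) device coordinates
    filters = request["filters"]

    # Steps 704-705: retrieve relevant messages and build a range-sorted list.
    candidates = []
    for m in global_db:
        rng = math.dist(here, m["position"])
        if rng <= filters.get("max_range_m", float("inf")):
            candidates.append(dict(m, range_m=rng))
    candidates.sort(key=lambda m: m["range_m"])

    active_queue = []
    for m in candidates:                          # step 706
        if filters.get("view_occlusion", False):  # step 710
            # Steps 724-726: occluded when the sight line stops short of the
            # message by more than the error threshold.
            sight = line_of_sight_range(here, m["position"], elevation)
            m["occluded"] = sight < m["range_m"] - threshold_m
        if m.get("active", True):                 # steps 714-716
            active_queue.append(m)
    return active_queue                           # sent to the client, step 720
```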
FIG. 8 illustrates a main loop flow for a client device 300 in message display mode. First, device 300 determines 802 its current geo-location from geo-location manager 608 and the current view direction from view tracking manager 610. If the location and view direction, or the time elapsed since the last update, are such that 804 a server update should take place, device 300 sends 806 its location and an update request to server 102. An update is triggered in one embodiment either by a direct request via user interface 304, or by a pre-determined time or distance threshold being exceeded. Regardless of whether server communication is in progress, camera image capture module 604 continuously captures 812 input from video camera 306, and operation proceeds to step 814, described below. - While waiting for a response from
server 102, device 300 captures 808 the camera video input from video camera 306, and graphics rendering engine 606 displays it as a background on screen 302 to render over. -
Device 300 then receives 810 message data from server 102 via wireless network interface 310 and updates its internal message database 614 with the information. Next, device 300 determines 814 a camera transformation for the current location and view direction, which may be embodied as an affine matrix well known to those skilled in the art. This transformation is used to determine whether messages are currently visible on the screen, and the actual screen position of each message at its real-world location.
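A simplified version of that transformation is sketched below; a flat local coordinate frame, a heading-only rotation and a pinhole projection are simplifying assumptions standing in for the full affine camera matrix.

```python
import math

def world_to_screen(msg_pos, cam_pos, heading_deg, screen_w, screen_h, fov_deg=60.0):
    """Project a message's world position to pixel coordinates, or return
    None when the message lies behind the camera."""
    dx = msg_pos[0] - cam_pos[0]
    dy = msg_pos[1] - cam_pos[1]
    dz = msg_pos[2] - cam_pos[2]
    h = math.radians(heading_deg)
    right = dx * math.cos(h) - dy * math.sin(h)    # rotate into the camera frame
    forward = dx * math.sin(h) + dy * math.cos(h)  # component along the view axis
    if forward <= 0:
        return None                                # behind the camera
    f = (screen_w / 2) / math.tan(math.radians(fov_deg) / 2)  # focal length, pixels
    px = screen_w / 2 + f * right / forward
    py = screen_h / 2 - f * dz / forward
    return (px, py)

print(world_to_screen((0.0, 100.0, 10.0), (0.0, 0.0, 0.0), 0.0, 800, 600))
```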
Next, for 816 each active message in the local database 614 that is not 818 fully occluded (i.e., its occlusion flag is not set) or otherwise deactivated (i.e., a previously active message that has been marked inactive by the server), a 3-D transformation for its position relative to the current device location is determined 828, and its visibility is checked 830 against the current view volume (frustum) defined by the camera transformation. - If the message is determined to be visible on
screen 302, it is then rendered 832 as an icon with text including the sender, subject, time and range, at the screen-space projected position of the 3-D location determined in the previous step. - Once every message has been processed 820,
device 300 checks the user interface 304 inputs and updates 822 the on-screen data overlay display, and if 824 message updates have occurred or a new message has been generated, it sends 826 an update to server 102 including any changed data. If a user of device 300 has composed a message destined for another device, the message is in one embodiment included in the update request sent to server 102. Following completion of the update, another iteration begins again at step 802.
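The FIG. 8 flow can be condensed into the following structural sketch; the duck-typed manager objects stand in for the FIG. 6 components, so this is pseudocode-shaped Python rather than a complete implementation.

```python
def client_frame(device, server):
    """One iteration of the FIG. 8 display loop for device 300."""
    pos = device.geo.current_position()             # step 802
    view = device.view.current_direction()
    if device.update_needed(pos, view):             # step 804: threshold or UI request
        server.send_update_request(pos, view)       # step 806
    frame = device.camera.capture()                 # steps 808/812
    device.renderer.draw_background(frame)
    for msg in server.poll_messages():              # step 810
        device.local_db.upsert(msg)                 # refresh message database 614
    cam = device.camera_transform(pos, view)        # step 814: affine camera matrix
    for msg in device.local_db.active_messages():   # steps 816-830
        if msg.occluded:
            continue                                # step 818: skip occluded messages
        screen_xy = cam.project(msg.position)       # step 828 plus frustum check 830
        if screen_xy is not None:
            device.renderer.draw_icon(msg, screen_xy)   # step 832
    device.process_ui_and_send_changes(server)      # steps 820-826
    # The caller repeats client_frame, returning to step 802.
```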
- Emergency Response Example
- A rescue team's effectiveness and safety can be greatly increased by enhancing communication between its members. The ability for a team member to quickly determine the current location of other team members, identify the actual physical location and immediate needs of victims who require help in real time, and be able to share information with other team members about what is happening at a given physical location can be invaluable, particularly when compared to conventional radio communications that require each team member to keep track of all the information as it comes, since there is no storage, or even more rudimentary methods such as the information spray painted on the doors of houses and buildings in New Orleans to identify which locations had been searched.
- In the emergency response context, each team member carries a
wireless computing device 300, connected in one embodiment to server 102 via a cell data network connection or packet radio, or alternatively connected to a laptop installed on a local vehicle and acting as a server, via a local mid-range network such as Wi-Max—an advantage of the mobile laptop server being its total independence from the availability of power and communication infrastructure. -
- A team member in charge of coordinating the effort can see on a global map all the message and team member positions—for example, on the laptop that acts as the local server—and can dispatch instructions to each member of the team to ensure everything is covered with the maximum efficiency and safety possible, which can be displayed as notes on each person's
device 300 at a given location, or presented on the screen as an information overlay visible over the camera input. - Team members can place a message at a specific location for a given team member or group, or for everybody to see—for example, information on accessibility from a given direction, places that have already been searched for survivors in need of help, notes on how changes in the environment or the weather may affect the mission, or any other information that can help other team members. The message can be placed at the current location of the
device 300, projected into the scene using an elevation database 206 for range determination, or placed at a distance determined by the user along the chosen view direction. These messages act as virtual sticky notes that team members can add for others to see as they gather new information in the field. - A team member with an overall view of the scene can add notes at different locations with information on things such as accessibility, potential problems ahead, or additional people in need of help who may not be visible from the locations of the team members in the field. These notes then become visible on
devices 300 as the relevant locations become viewable. - Other users can see messages on their screens as they pan their devices around; if a message is targeted to them or their group, they will be alerted once they are within the specified distance of the message location. Information posted at a given location can, in one embodiment, be edited by other team members, enabling refinement of the information by combining the observations of multiple members in real time—for example, once a given victim has been successfully reached, that information becomes visible to everybody else on the team in real time. In a disaster relief effort, information about which places have been searched and what is needed at each location is likewise immediately visible at all times.
- By combining the information gathered by everybody into a common database, storing it at the true physical location where it is most useful, and providing a method to access the data in a simple, unobtrusive fashion without requiring an elaborate user interface, the present invention enables collaboration among location-deployed teams in ways not possible before. It constitutes a collective memory for the entire team that can enhance the team's effectiveness without a significant efficiency cost to each individual member.
- Similar applications are possible in the military and homeland defense space, firefighting, law enforcement, utility infrastructure maintenance crews, field surveys, and other situations where fielded team collaboration is required.
- Example: Airport Approach Procedures
-
FIG. 9 illustrates an application of the present invention to airport flight procedures. The illustrations can be overlaid in real time to assist pilots in flying approaches and departures, complying with noise abatement procedures, and the like. In FIG. 9, which illustrates a departure procedure for runway 28 at an airport, the arrows 902 illustrating the actual departure procedure, the departure waypoint 904 and the “residential area” label 906 are rendered, while the remainder of the image 900 is captured by the camera. Unlike HUD or instrument systems, the overlays move off-axis in real time as the camera pans. - Although the invention has been described in terms of a preferred embodiment comprising a handheld wireless computing device, its functionality can be implemented without substantial differences in other integrated devices such as video cameras, digital binoculars, digital range finders or GPS positioning devices.
- By enabling interaction between system users using a simple panning interface, the claimed invention presents an improved mobile device collaborative environment that simplifies interaction for field operations.
- The present invention has been described in particular detail with respect to a limited number of embodiments. Those of skill in the art will appreciate that the invention may additionally be practiced in other embodiments. First, the particular naming of the components, capitalization of terms, attributes, data structures, or any other programming or structural aspect is not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, formats, or protocols. Further, the system may be implemented via a combination of hardware and software, as described, or entirely in hardware elements. Also, the particular division of functionality between the various system components described herein is merely exemplary, and not mandatory; functions performed by a single system component may instead be performed by multiple components, and functions performed by multiple components may instead be performed by a single component. For example, the particular functions of the map data provider, map image provider and so forth may be provided in many modules or in one.
- Some portions of the above description present the features of the present invention in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are the means used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art. These operations, while described functionally or logically, are understood to be implemented by computer programs. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules or code devices, without loss of generality.
- It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the present discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
- Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by real time network operating systems.
- The present invention also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer-readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application-specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description above. In addition, the present invention is not described with reference to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any references to specific languages are provided for disclosure of enablement and best mode of the present invention.
- The figures depict preferred embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
- Finally, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention.
Claims (7)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2006/062767 WO2007076555A2 (en) | 2005-12-29 | 2006-12-29 | A location based wireless collaborative environment with a visual user interface |
US11/618,672 US8280405B2 (en) | 2005-12-29 | 2006-12-29 | Location based wireless collaborative environment with a visual user interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US75573205P | 2005-12-29 | 2005-12-29 | |
US11/618,672 US8280405B2 (en) | 2005-12-29 | 2006-12-29 | Location based wireless collaborative environment with a visual user interface |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070242131A1 true US20070242131A1 (en) | 2007-10-18 |
US8280405B2 US8280405B2 (en) | 2012-10-02 |
Family
ID=38218898
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/618,672 Active 2030-01-24 US8280405B2 (en) | 2005-12-29 | 2006-12-29 | Location based wireless collaborative environment with a visual user interface |
Country Status (2)
Country | Link |
---|---|
US (1) | US8280405B2 (en) |
WO (1) | WO2007076555A2 (en) |
Cited By (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040189517A1 (en) * | 2001-10-09 | 2004-09-30 | Ashutosh Pande | Method and system for sending location coded images over a wireless network |
US20070208994A1 (en) * | 2006-03-03 | 2007-09-06 | Reddel Frederick A V | Systems and methods for document annotation |
US20070249368A1 (en) * | 2006-04-25 | 2007-10-25 | Google Inc. | Shared Geo-Located Objects |
US20070282792A1 (en) * | 2006-04-25 | 2007-12-06 | Google Inc. | Identifying Geo-Located Objects |
US20080147730A1 (en) * | 2006-12-18 | 2008-06-19 | Motorola, Inc. | Method and system for providing location-specific image information |
US20080243906A1 (en) * | 2007-03-31 | 2008-10-02 | Keith Peters | Online system and method for providing geographic presentations of localities that are pertinent to a text item |
US20090215471A1 (en) * | 2008-02-21 | 2009-08-27 | Microsoft Corporation | Location based object tracking |
US20090276816A1 (en) * | 2008-05-05 | 2009-11-05 | Anh Hao Tran | System And Method For Obtaining And Distributing Video |
US20100009700A1 (en) * | 2008-07-08 | 2010-01-14 | Sony Ericsson Mobile Communications Ab | Methods and Apparatus for Collecting Image Data |
US20100023878A1 (en) * | 2008-07-23 | 2010-01-28 | Yahoo! Inc. | Virtual notes in a reality overlay |
US20100081416A1 (en) * | 2008-09-30 | 2010-04-01 | Microsoft Corporation | Virtual skywriting |
US20100167256A1 (en) * | 2008-02-14 | 2010-07-01 | Douglas Michael Blash | System and method for global historical database |
US20100208033A1 (en) * | 2009-02-13 | 2010-08-19 | Microsoft Corporation | Personal Media Landscapes in Mixed Reality |
US20100214398A1 (en) * | 2009-02-25 | 2010-08-26 | Valerie Goulart | Camera pod that captures images or video when triggered by a mobile device |
US20100238161A1 (en) * | 2009-03-19 | 2010-09-23 | Kenneth Varga | Computer-aided system for 360º heads up display of safety/mission critical data |
US20110007150A1 (en) * | 2009-07-13 | 2011-01-13 | Raytheon Company | Extraction of Real World Positional Information from Video |
US20110007134A1 (en) * | 2009-07-13 | 2011-01-13 | Raytheon Company | Synchronizing video images and three dimensional visualization images |
US20110010674A1 (en) * | 2009-07-13 | 2011-01-13 | Raytheon Company | Displaying situational information based on geospatial data |
US20110007962A1 (en) * | 2009-07-13 | 2011-01-13 | Raytheon Company | Overlay Information Over Video |
US20110014927A1 (en) * | 2009-07-16 | 2011-01-20 | Inventec Appliances Corp. | Method and mobile electronic device for a function of giving notice according to positions |
US20110173074A1 (en) * | 2008-10-27 | 2011-07-14 | Fujitsu Limited | Communication system, advertisement managing device, and wireless base station |
US20110231092A1 (en) * | 2010-03-18 | 2011-09-22 | Sony Corporation | Real-time tracking of digital cameras and wireless capable devices |
KR20110115664A (en) * | 2010-04-16 | 2011-10-24 | 엘지전자 주식회사 | Mobile terminal and its control method |
US20110279446A1 (en) * | 2010-05-16 | 2011-11-17 | Nokia Corporation | Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device |
US20120001939A1 (en) * | 2010-06-30 | 2012-01-05 | Nokia Corporation | Methods, apparatuses and computer program products for automatically generating suggested information layers in augmented reality |
US20120143963A1 (en) * | 2010-12-07 | 2012-06-07 | Aleksandr Kennberg | Determining Message Prominence |
US20120221552A1 (en) * | 2011-02-28 | 2012-08-30 | Nokia Corporation | Method and apparatus for providing an active search user interface element |
US20120236104A1 (en) * | 2010-05-18 | 2012-09-20 | Zte Corporation | Mobile Terminal and Method for Informing User's Location |
US20120246222A1 (en) * | 2011-03-22 | 2012-09-27 | David Martin | System for Group Supervision |
US8412237B1 (en) * | 2011-07-29 | 2013-04-02 | Intuit Inc. | Method and system for launching and preparing applications on mobile computing systems based on geo-location data |
EP2587792A1 (en) * | 2010-06-28 | 2013-05-01 | LG Electronics Inc. | Method and apparatus for providing the operation state of an external device |
US20130162644A1 (en) * | 2011-12-27 | 2013-06-27 | Nokia Corporation | Method and apparatus for providing perspective-based content placement |
US20130196718A1 (en) * | 2010-10-12 | 2013-08-01 | Kyocera Corporation | Portable terminal device |
WO2014074257A1 (en) * | 2012-11-06 | 2014-05-15 | Ripple Inc | Rendering a digital element |
US20140132630A1 (en) * | 2012-11-13 | 2014-05-15 | Samsung Electronics Co., Ltd. | Apparatus and method for providing social network service using augmented reality |
US20140204121A1 (en) * | 2012-12-27 | 2014-07-24 | Schlumberger Technology Corporation | Augmented reality for oilfield |
US8896629B2 (en) | 2009-08-18 | 2014-11-25 | Metaio Gmbh | Method for representing virtual information in a real environment |
US8902254B1 (en) * | 2010-09-02 | 2014-12-02 | The Boeing Company | Portable augmented reality |
US20150124043A1 (en) * | 2013-11-01 | 2015-05-07 | Microsoft Corporation | Controlling Display of Video Data |
US20150169568A1 (en) * | 2012-03-16 | 2015-06-18 | Laura Garcia-Barrio | Method and apparatus for enabling digital memory walls |
US9113050B2 (en) | 2011-01-13 | 2015-08-18 | The Boeing Company | Augmented collaboration system |
JP2015537264A (en) * | 2012-08-27 | 2015-12-24 | エンパイア テクノロジー ディベロップメント エルエルシー | Indicate the geographical source of digitally mediated communications |
US20160035141A1 (en) * | 2010-02-02 | 2016-02-04 | Sony Corporation | Image processing device, image processing method, and program |
US20160220885A1 (en) * | 2005-07-14 | 2016-08-04 | Charles D. Huston | System And Method For Creating Content For An Event Using A Social Network |
US9485285B1 (en) | 2010-02-08 | 2016-11-01 | Google Inc. | Assisting the authoring of posts to an asymmetric social network |
US9619940B1 (en) | 2014-06-10 | 2017-04-11 | Ripple Inc | Spatial filtering trace location |
US9639857B2 (en) | 2011-09-30 | 2017-05-02 | Nokia Technologies Oy | Method and apparatus for associating commenting information with one or more objects |
US9646418B1 (en) | 2014-06-10 | 2017-05-09 | Ripple Inc | Biasing a rendering location of an augmented reality object |
US9681093B1 (en) * | 2011-08-19 | 2017-06-13 | Google Inc. | Geolocation impressions |
US9720228B2 (en) | 2010-12-16 | 2017-08-01 | Lockheed Martin Corporation | Collimating display with pixel lenses |
US9729352B1 (en) | 2010-02-08 | 2017-08-08 | Google Inc. | Assisting participation in a social network |
US9728006B2 (en) | 2009-07-20 | 2017-08-08 | Real Time Companies, LLC | Computer-aided system for 360° heads up display of safety/mission critical data |
US20170227361A1 (en) * | 2014-06-20 | 2017-08-10 | Uti Limited Partnership | Mobile mapping system |
US9836888B2 (en) * | 2016-02-18 | 2017-12-05 | Edx Technologies, Inc. | Systems and methods for augmented reality representations of networks |
US9930096B2 (en) | 2010-02-08 | 2018-03-27 | Google Llc | Recommending posts to non-subscribing users |
US9939650B2 (en) | 2015-03-02 | 2018-04-10 | Lockheed Martin Corporation | Wearable display system |
EP3306445A1 (en) * | 2016-10-04 | 2018-04-11 | Facebook, Inc. | Controls and interfaces for user interactions in virtual spaces |
US20180144524A1 (en) * | 2014-06-10 | 2018-05-24 | Ripple Inc | Dynamic location based digital element |
US9984408B1 (en) * | 2012-05-30 | 2018-05-29 | Amazon Technologies, Inc. | Method, medium, and system for live video cooperative shopping |
US9995936B1 (en) | 2016-04-29 | 2018-06-12 | Lockheed Martin Corporation | Augmented reality systems having a virtual image overlaying an infrared portion of a live scene |
US10026227B2 (en) | 2010-09-02 | 2018-07-17 | The Boeing Company | Portable augmented reality |
US10026226B1 (en) | 2014-06-10 | 2018-07-17 | Ripple Inc | Rendering an augmented reality object |
US20180259958A1 (en) * | 2017-03-09 | 2018-09-13 | Uber Technologies, Inc. | Personalized content creation for autonomous vehicle rides |
US10152851B2 (en) | 2016-11-29 | 2018-12-11 | Microsoft Technology Licensing, Llc | Notification artifact display |
US10165261B2 (en) | 2016-10-04 | 2018-12-25 | Facebook, Inc. | Controls and interfaces for user interactions in virtual spaces |
US10359545B2 (en) | 2010-10-21 | 2019-07-23 | Lockheed Martin Corporation | Fresnel lens with reduced draft facet visibility |
US10495790B2 (en) | 2010-10-21 | 2019-12-03 | Lockheed Martin Corporation | Head-mounted display apparatus employing one or more Fresnel lenses |
US10684476B2 (en) | 2014-10-17 | 2020-06-16 | Lockheed Martin Corporation | Head-wearable ultra-wide field of view display device |
US10754156B2 (en) | 2015-10-20 | 2020-08-25 | Lockheed Martin Corporation | Multiple-eye, single-display, ultrawide-field-of-view optical see-through augmented reality system |
KR20210004918A (en) * | 2019-04-25 | 2021-01-13 | 군산대학교산학협력단 | Method of Discovering Region of Attractions from Geo-tagged Photos and Apparatus Thereof |
US11032677B2 (en) * | 2002-04-24 | 2021-06-08 | Ipventure, Inc. | Method and system for enhanced messaging using sensor input |
US11054527B2 (en) | 2002-04-24 | 2021-07-06 | Ipventure, Inc. | Method and apparatus for intelligent acquisition of position information |
EP3890363A1 (en) * | 2020-04-02 | 2021-10-06 | Cyberschmiede GmbH | Method for transferring information |
US11144760B2 (en) | 2019-06-21 | 2021-10-12 | International Business Machines Corporation | Augmented reality tagging of non-smart items |
US11157089B2 (en) * | 2019-12-27 | 2021-10-26 | Hypori Llc | Character editing on a physical device via interaction with a virtual device user interface |
CN113806644A (en) * | 2021-09-18 | 2021-12-17 | 英华达(上海)科技有限公司 | Message processing method, message display method, message processing device, message display device, terminal and storage medium |
US11238398B2 (en) | 2002-04-24 | 2022-02-01 | Ipventure, Inc. | Tracking movement of objects and notifications therefor |
US11330419B2 (en) | 2000-02-28 | 2022-05-10 | Ipventure, Inc. | Method and system for authorized location monitoring |
US11354864B2 (en) * | 2018-02-21 | 2022-06-07 | Raziq Yaqub | System and method for presenting location based augmented reality road signs on or in a vehicle |
US11368808B2 (en) | 2002-04-24 | 2022-06-21 | Ipventure, Inc. | Method and apparatus for identifying and presenting location and location-related information |
US11411900B2 (en) | 2020-03-30 | 2022-08-09 | Snap Inc. | Off-platform messaging system |
US20220414173A1 (en) * | 2018-08-23 | 2022-12-29 | Newsplug, Inc. | Geographic location based feed |
US11722442B2 (en) | 2019-07-05 | 2023-08-08 | Snap Inc. | Event planning in a content sharing platform |
US11838252B2 (en) * | 2017-08-08 | 2023-12-05 | Snap Inc. | Application-independent messaging system |
US11972450B2 (en) | 2005-07-14 | 2024-04-30 | Charles D. Huston | Spectator and participant system and method for displaying different views of an event |
US11973730B2 (en) | 2022-06-02 | 2024-04-30 | Snap Inc. | External messaging function for an interaction system |
US12008697B2 (en) | 2014-06-10 | 2024-06-11 | Ripple, Inc. Of Delaware | Dynamic location based digital element |
US12126588B2 (en) | 2020-04-23 | 2024-10-22 | Snap Inc. | Event overlay invite messaging system |
US12278791B2 (en) | 2024-02-28 | 2025-04-15 | Snap Inc. | Event planning in a content sharing platform |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8364397B2 (en) * | 2007-08-23 | 2013-01-29 | International Business Machines Corporation | Pictorial navigation |
US10095375B2 (en) * | 2008-07-09 | 2018-10-09 | Apple Inc. | Adding a contact to a home screen |
TWI558199B (en) | 2008-08-08 | 2016-11-11 | 尼康股份有限公司 | Carry information machine and information acquisition system |
EP2389669A1 (en) * | 2009-01-21 | 2011-11-30 | Universiteit Gent | Geodatabase information processing |
DE102009050187A1 (en) * | 2009-10-21 | 2011-04-28 | Gobandit Gmbh | GPS / video data communication system, data communication method, and apparatus for use in a GPS / video data communication system |
US8830267B2 (en) * | 2009-11-16 | 2014-09-09 | Alliance For Sustainable Energy, Llc | Augmented reality building operations tool |
WO2013128078A1 (en) | 2012-02-29 | 2013-09-06 | Nokia Corporation | Method and apparatus for rendering items in a user interface |
EP2582101A1 (en) * | 2011-10-10 | 2013-04-17 | inZair SA | A method, system and apparatus for geolocalized mobile messaging |
WO2013170875A1 (en) * | 2012-05-14 | 2013-11-21 | Abb Research Ltd | Method and mobile terminal for industrial process equipment maintenance |
US10354004B2 (en) | 2012-06-07 | 2019-07-16 | Apple Inc. | Intelligent presentation of documents |
WO2014005066A1 (en) * | 2012-06-28 | 2014-01-03 | Experience Proximity, Inc., Dba Oooii | Systems and methods for navigating virtual structured data relative to real-world locales |
US9154917B2 (en) * | 2013-03-13 | 2015-10-06 | Qualcomm Incorporated | Method and apparatus for geotagging |
WO2015027199A2 (en) | 2013-08-22 | 2015-02-26 | Naqvi Shamim A | Method and system for addressing the problem of discovering relevant services and applications that are available over the internet or other communcations network |
US9613459B2 (en) | 2013-12-19 | 2017-04-04 | Honda Motor Co., Ltd. | System and method for in-vehicle interaction |
US9552587B2 (en) | 2014-07-11 | 2017-01-24 | Sensoriant, Inc. | System and method for mediating representations with respect to preferences of a party not located in the environment |
KR102686073B1 (en) | 2015-10-14 | 2024-07-17 | 엑스-써마 인코포레이티드 | Compositions and methods for reducing ice crystal formation |
US9699606B1 (en) * | 2016-06-24 | 2017-07-04 | Amazon Technologies, Inc. | Delivery confirmation using overlapping geo-fences |
US10123181B1 (en) | 2017-05-03 | 2018-11-06 | Honeywell International Inc. | Systems and methods for collaborative vehicle mission operations |
US10803610B2 (en) | 2018-03-06 | 2020-10-13 | At&T Intellectual Property I, L.P. | Collaborative visual enhancement devices |
Citations (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5884216A (en) * | 1992-10-16 | 1999-03-16 | Mobile Information System, Inc. | Method and apparatus for tracking vehicle location |
US6222583B1 (en) * | 1997-03-27 | 2001-04-24 | Nippon Telegraph And Telephone Corporation | Device and system for labeling sight images |
US20020062246A1 (en) * | 2000-11-21 | 2002-05-23 | Hajime Matsubara | Advertising information transmitting and receiving methods |
US20020065090A1 (en) * | 2000-07-28 | 2002-05-30 | Akio Ohba | Data providing system, method and computer program |
US20020086672A1 (en) * | 2000-01-26 | 2002-07-04 | Mcdowell Mark | Method and apparatus for sharing mobile user event information between wireless networks and fixed IP networks |
US20020140745A1 (en) * | 2001-01-24 | 2002-10-03 | Ellenby Thomas William | Pointing systems for addressing objects |
US20020167536A1 (en) * | 2001-03-30 | 2002-11-14 | Koninklijke Philips Electronics N.V. | Method, system and device for augmented reality |
US6507802B1 (en) * | 2000-02-16 | 2003-01-14 | Hrl Laboratories, Llc | Mobile user collaborator discovery method and apparatus |
US20030027553A1 (en) * | 2001-08-03 | 2003-02-06 | Brian Davidson | Mobile browsing |
US20030104820A1 (en) * | 2001-12-04 | 2003-06-05 | Greene David P. | Location-specific messaging system |
US20030218638A1 (en) * | 2002-02-06 | 2003-11-27 | Stuart Goose | Mobile multimodal user interface combining 3D graphics, location-sensitive speech interaction and tracking technologies |
US6822624B2 (en) * | 2002-09-10 | 2004-11-23 | Universal Avionics Systems Corporation | Display generation system |
US20050032527A1 (en) * | 2003-08-08 | 2005-02-10 | Networks In Motion, Inc. | Method and system for collecting synchronizing, and reporting telecommunication call events and work flow related information |
US6884216B2 (en) * | 2002-08-12 | 2005-04-26 | Kabushiki Kaisha Toshiba | Ultrasound diagnosis apparatus and ultrasound image display method and apparatus |
US6909708B1 (en) * | 1996-11-18 | 2005-06-21 | Mci Communications Corporation | System, method and article of manufacture for a communication system architecture including video conferencing |
US6930715B1 (en) * | 2000-07-21 | 2005-08-16 | The Research Foundation Of The State University Of New York | Method, system and program product for augmenting an image of a scene with information about the scene |
US7057614B2 (en) * | 2002-05-24 | 2006-06-06 | Olympus Corporation | Information display system and portable information terminal |
US7088389B2 (en) * | 2000-09-19 | 2006-08-08 | Olympus Optical Co., Ltd. | System for displaying information in specific region |
US7091852B2 (en) * | 2002-07-02 | 2006-08-15 | Tri-Sentinel, Inc. | Emergency response personnel automated accountability system |
US20060190812A1 (en) * | 2005-02-22 | 2006-08-24 | Geovector Corporation | Imaging systems including hyperlink associations |
US7170632B1 (en) * | 1998-05-20 | 2007-01-30 | Fuji Photo Film Co., Ltd. | Image reproducing method and apparatus, image processing method and apparatus, and photographing support system |
US20070123280A1 (en) * | 2005-07-13 | 2007-05-31 | Mcgary Faith | System and method for providing mobile device services using SMS communications |
US20070162942A1 (en) * | 2006-01-09 | 2007-07-12 | Kimmo Hamynen | Displaying network objects in mobile devices based on geolocation |
US20070237318A1 (en) * | 2006-02-14 | 2007-10-11 | Mcgary Faith | System and method for providing mobile device services using SMS communications |
US20070271035A1 (en) * | 2006-05-22 | 2007-11-22 | Arne Stoschek | Navigation system for a motor vehicle, method for operating a navigation system and motor vehicle including a navigation system |
US20080122871A1 (en) * | 2006-11-27 | 2008-05-29 | Microsoft Corporation | Federated Virtual Graffiti |
US20080243861A1 (en) * | 2007-03-29 | 2008-10-02 | Tomas Karl-Axel Wassingbo | Digital photograph content information service |
US20090049004A1 (en) * | 2007-08-16 | 2009-02-19 | Nokia Corporation | Apparatus, method and computer program product for tying information to features associated with captured media objects |
US20100194782A1 (en) * | 2009-02-04 | 2010-08-05 | Motorola, Inc. | Method and apparatus for creating virtual graffiti in a mobile virtual and augmented reality system |
US7844229B2 (en) * | 2007-09-21 | 2010-11-30 | Motorola Mobility, Inc | Mobile virtual and augmented reality system |
US7917153B2 (en) * | 2004-03-31 | 2011-03-29 | France Telecom | Method and apparatus for creating, directing, storing and automatically delivering a message to an intended recipient upon arrival of a specified mobile object at a designated location |
- 2006
- 2006-12-29: WO PCT/US2006/062767 — WO2007076555A2, active, Application Filing
- 2006-12-29: US US11/618,672 — US8280405B2, active, Active
Patent Citations (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5884216A (en) * | 1992-10-16 | 1999-03-16 | Mobile Information System, Inc. | Method and apparatus for tracking vehicle location |
US6909708B1 (en) * | 1996-11-18 | 2005-06-21 | Mci Communications Corporation | System, method and article of manufacture for a communication system architecture including video conferencing |
US6222583B1 (en) * | 1997-03-27 | 2001-04-24 | Nippon Telegraph And Telephone Corporation | Device and system for labeling sight images |
US7170632B1 (en) * | 1998-05-20 | 2007-01-30 | Fuji Photo Film Co., Ltd. | Image reproducing method and apparatus, image processing method and apparatus, and photographing support system |
US20020086672A1 (en) * | 2000-01-26 | 2002-07-04 | Mcdowell Mark | Method and apparatus for sharing mobile user event information between wireless networks and fixed IP networks |
US6507802B1 (en) * | 2000-02-16 | 2003-01-14 | Hrl Laboratories, Llc | Mobile user collaborator discovery method and apparatus |
US6930715B1 (en) * | 2000-07-21 | 2005-08-16 | The Research Foundation Of The State University Of New York | Method, system and program product for augmenting an image of a scene with information about the scene |
US20020065090A1 (en) * | 2000-07-28 | 2002-05-30 | Akio Ohba | Data providing system, method and computer program |
US7088389B2 (en) * | 2000-09-19 | 2006-08-08 | Olympus Optical Co., Ltd. | System for displaying information in specific region |
US20020062246A1 (en) * | 2000-11-21 | 2002-05-23 | Hajime Matsubara | Advertising information transmitting and receiving methods |
US20020140745A1 (en) * | 2001-01-24 | 2002-10-03 | Ellenby Thomas William | Pointing systems for addressing objects |
US7031875B2 (en) * | 2001-01-24 | 2006-04-18 | Geo Vector Corporation | Pointing systems for addressing objects |
US20020167536A1 (en) * | 2001-03-30 | 2002-11-14 | Koninklijke Philips Electronics N.V. | Method, system and device for augmented reality |
US20030027553A1 (en) * | 2001-08-03 | 2003-02-06 | Brian Davidson | Mobile browsing |
US20030104820A1 (en) * | 2001-12-04 | 2003-06-05 | Greene David P. | Location-specific messaging system |
US20030218638A1 (en) * | 2002-02-06 | 2003-11-27 | Stuart Goose | Mobile multimodal user interface combining 3D graphics, location-sensitive speech interaction and tracking technologies |
US7057614B2 (en) * | 2002-05-24 | 2006-06-06 | Olympus Corporation | Information display system and portable information terminal |
US7091852B2 (en) * | 2002-07-02 | 2006-08-15 | Tri-Sentinel, Inc. | Emergency response personnel automated accountability system |
US6884216B2 (en) * | 2002-08-12 | 2005-04-26 | Kabushiki Kaisha Toshiba | Ultrasound diagnosis apparatus and ultrasound image display method and apparatus |
US6822624B2 (en) * | 2002-09-10 | 2004-11-23 | Universal Avionics Systems Corporation | Display generation system |
US20050032527A1 (en) * | 2003-08-08 | 2005-02-10 | Networks In Motion, Inc. | Method and system for collecting synchronizing, and reporting telecommunication call events and work flow related information |
US7917153B2 (en) * | 2004-03-31 | 2011-03-29 | France Telecom | Method and apparatus for creating, directing, storing and automatically delivering a message to an intended recipient upon arrival of a specified mobile object at a designated location |
US20060190812A1 (en) * | 2005-02-22 | 2006-08-24 | Geovector Corporation | Imaging systems including hyperlink associations |
US20070123280A1 (en) * | 2005-07-13 | 2007-05-31 | Mcgary Faith | System and method for providing mobile device services using SMS communications |
US7720436B2 (en) * | 2006-01-09 | 2010-05-18 | Nokia Corporation | Displaying network objects in mobile devices based on geolocation |
US20070162942A1 (en) * | 2006-01-09 | 2007-07-12 | Kimmo Hamynen | Displaying network objects in mobile devices based on geolocation |
US20070237318A1 (en) * | 2006-02-14 | 2007-10-11 | Mcgary Faith | System and method for providing mobile device services using SMS communications |
US7925243B2 (en) * | 2006-02-14 | 2011-04-12 | Mcgary Faith | System and method for providing mobile device services using SMS communications |
US20070271035A1 (en) * | 2006-05-22 | 2007-11-22 | Arne Stoschek | Navigation system for a motor vehicle, method for operating a navigation system and motor vehicle including a navigation system |
US20080122871A1 (en) * | 2006-11-27 | 2008-05-29 | Microsoft Corporation | Federated Virtual Graffiti |
US20080243861A1 (en) * | 2007-03-29 | 2008-10-02 | Tomas Karl-Axel Wassingbo | Digital photograph content information service |
US20090049004A1 (en) * | 2007-08-16 | 2009-02-19 | Nokia Corporation | Apparatus, method and computer program product for tying information to features associated with captured media objects |
US7844229B2 (en) * | 2007-09-21 | 2010-11-30 | Motorola Mobility, Inc | Mobile virtual and augmented reality system |
US20100194782A1 (en) * | 2009-02-04 | 2010-08-05 | Motorola, Inc. | Method and apparatus for creating virtual graffiti in a mobile virtual and augmented reality system |
Cited By (167)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11330419B2 (en) | 2000-02-28 | 2022-05-10 | Ipventure, Inc. | Method and system for authorized location monitoring |
US20040189517A1 (en) * | 2001-10-09 | 2004-09-30 | Ashutosh Pande | Method and system for sending location coded images over a wireless network |
US7747259B2 (en) * | 2001-10-09 | 2010-06-29 | Sirf Technology, Inc. | Method and system for sending location coded images over a wireless network |
US11249196B2 (en) | 2002-04-24 | 2022-02-15 | Ipventure, Inc. | Method and apparatus for intelligent acquisition of position information |
US11368808B2 (en) | 2002-04-24 | 2022-06-21 | Ipventure, Inc. | Method and apparatus for identifying and presenting location and location-related information |
US11308441B2 (en) | 2002-04-24 | 2022-04-19 | Ipventure, Inc. | Method and system for tracking and monitoring assets |
US11418905B2 (en) | 2002-04-24 | 2022-08-16 | Ipventure, Inc. | Method and apparatus for identifying and presenting location and location-related information |
US11238398B2 (en) | 2002-04-24 | 2022-02-01 | Ipventure, Inc. | Tracking movement of objects and notifications therefor |
US11218848B2 (en) * | 2002-04-24 | 2022-01-04 | Ipventure, Inc. | Messaging enhancement with location information |
US11067704B2 (en) | 2002-04-24 | 2021-07-20 | Ipventure, Inc. | Method and apparatus for intelligent acquisition of position information |
US11054527B2 (en) | 2002-04-24 | 2021-07-06 | Ipventure, Inc. | Method and apparatus for intelligent acquisition of position information |
US11032677B2 (en) * | 2002-04-24 | 2021-06-08 | Ipventure, Inc. | Method and system for enhanced messaging using sensor input |
US11915186B2 (en) | 2002-04-24 | 2024-02-27 | Ipventure, Inc. | Personalized medical monitoring and notifications therefor |
US20160220885A1 (en) * | 2005-07-14 | 2016-08-04 | Charles D. Huston | System And Method For Creating Content For An Event Using A Social Network |
US10512832B2 (en) * | 2005-07-14 | 2019-12-24 | Charles D. Huston | System and method for a golf event using artificial reality |
US20200061435A1 (en) * | 2005-07-14 | 2020-02-27 | Charles D. Huston | System And Method For Creating Content For An Event Using A Social Network |
US11972450B2 (en) | 2005-07-14 | 2024-04-30 | Charles D. Huston | Spectator and participant system and method for displaying different views of an event |
US12190342B2 (en) | 2005-07-14 | 2025-01-07 | Spatial Reality, Llc. | Spectator and participant system and method for displaying different views of an event |
US11087345B2 (en) * | 2005-07-14 | 2021-08-10 | Charles D. Huston | System and method for creating content for an event using a social network |
WO2007103352A3 (en) * | 2006-03-03 | 2008-11-13 | Live Cargo Inc | Systems and methods for document annotation |
US20070208994A1 (en) * | 2006-03-03 | 2007-09-06 | Reddel Frederick A V | Systems and methods for document annotation |
WO2007103352A2 (en) * | 2006-03-03 | 2007-09-13 | Live Cargo, Inc. | Systems and methods for document annotation |
US9418164B2 (en) | 2006-04-25 | 2016-08-16 | Google Inc. | Shared geo-located objects |
US20070282792A1 (en) * | 2006-04-25 | 2007-12-06 | Google Inc. | Identifying Geo-Located Objects |
US8938464B2 (en) | 2006-04-25 | 2015-01-20 | Google Inc. | Identifying geo-located objects |
US9031964B2 (en) * | 2006-04-25 | 2015-05-12 | Google Inc. | Shared geo-located objects |
US20070249368A1 (en) * | 2006-04-25 | 2007-10-25 | Google Inc. | Shared Geo-Located Objects |
US9418163B2 (en) | 2006-04-25 | 2016-08-16 | Google Inc. | Shared geo-located objects |
US20080147730A1 (en) * | 2006-12-18 | 2008-06-19 | Motorola, Inc. | Method and system for providing location-specific image information |
US20080243906A1 (en) * | 2007-03-31 | 2008-10-02 | Keith Peters | Online system and method for providing geographic presentations of localities that are pertinent to a text item |
US20100167256A1 (en) * | 2008-02-14 | 2010-07-01 | Douglas Michael Blash | System and method for global historical database |
US20090215471A1 (en) * | 2008-02-21 | 2009-08-27 | Microsoft Corporation | Location based object tracking |
US8903430B2 (en) * | 2008-02-21 | 2014-12-02 | Microsoft Corporation | Location based object tracking |
US20090276816A1 (en) * | 2008-05-05 | 2009-11-05 | Anh Hao Tran | System And Method For Obtaining And Distributing Video |
US20100009700A1 (en) * | 2008-07-08 | 2010-01-14 | Sony Ericsson Mobile Communications Ab | Methods and Apparatus for Collecting Image Data |
US9509867B2 (en) * | 2008-07-08 | 2016-11-29 | Sony Corporation | Methods and apparatus for collecting image data |
US20120131435A1 (en) * | 2008-07-23 | 2012-05-24 | Yahoo! Inc. | Virtual notes in a reality overlay |
US9288079B2 (en) * | 2008-07-23 | 2016-03-15 | Yahoo! Inc. | Virtual notes in a reality overlay |
US9191238B2 (en) * | 2008-07-23 | 2015-11-17 | Yahoo! Inc. | Virtual notes in a reality overlay |
US20100023878A1 (en) * | 2008-07-23 | 2010-01-28 | Yahoo! Inc. | Virtual notes in a reality overlay |
USRE43545E1 (en) * | 2008-09-30 | 2012-07-24 | Microsoft Corporation | Virtual skywriting |
US7966024B2 (en) * | 2008-09-30 | 2011-06-21 | Microsoft Corporation | Virtual skywriting |
US20100081416A1 (en) * | 2008-09-30 | 2010-04-01 | Microsoft Corporation | Virtual skywriting |
US20110173074A1 (en) * | 2008-10-27 | 2011-07-14 | Fujitsu Limited | Communication system, advertisement managing device, and wireless base station |
US20100208033A1 (en) * | 2009-02-13 | 2010-08-19 | Microsoft Corporation | Personal Media Landscapes in Mixed Reality |
US8264529B2 (en) * | 2009-02-25 | 2012-09-11 | T-Mobile Usa, Inc. | Camera pod that captures images or video when triggered by a mobile device |
US20100214398A1 (en) * | 2009-02-25 | 2010-08-26 | Valerie Goulart | Camera pod that captures images or video when triggered by a mobile device |
US20100238161A1 (en) * | 2009-03-19 | 2010-09-23 | Kenneth Varga | Computer-aided system for 360º heads up display of safety/mission critical data |
WO2011008678A1 (en) * | 2009-07-13 | 2011-01-20 | Raytheon Company | Displaying situational information based on geospatial data |
US20110007134A1 (en) * | 2009-07-13 | 2011-01-13 | Raytheon Company | Synchronizing video images and three dimensional visualization images |
US20110007962A1 (en) * | 2009-07-13 | 2011-01-13 | Raytheon Company | Overlay Information Over Video |
US8331611B2 (en) * | 2009-07-13 | 2012-12-11 | Raytheon Company | Overlay information over video |
WO2011008611A1 (en) * | 2009-07-13 | 2011-01-20 | Raytheon Company | Overlay information over video |
US20110010674A1 (en) * | 2009-07-13 | 2011-01-13 | Raytheon Company | Displaying situational information based on geospatial data |
US8558847B2 (en) | 2009-07-13 | 2013-10-15 | Raytheon Company | Displaying situational information based on geospatial data |
US20110007150A1 (en) * | 2009-07-13 | 2011-01-13 | Raytheon Company | Extraction of Real World Positional Information from Video |
US20110014927A1 (en) * | 2009-07-16 | 2011-01-20 | Inventec Appliances Corp. | Method and mobile electronic device for a function of giving notice according to positions |
US9728006B2 (en) | 2009-07-20 | 2017-08-08 | Real Time Companies, LLC | Computer-aided system for 360° heads up display of safety/mission critical data |
US8896629B2 (en) | 2009-08-18 | 2014-11-25 | Metaio Gmbh | Method for representing virtual information in a real environment |
US11562540B2 (en) | 2009-08-18 | 2023-01-24 | Apple Inc. | Method for representing virtual information in a real environment |
US11651574B2 (en) | 2010-02-02 | 2023-05-16 | Sony Corporation | Image processing device, image processing method, and program |
US10515488B2 (en) | 2010-02-02 | 2019-12-24 | Sony Corporation | Image processing device, image processing method, and program |
US10223837B2 (en) | 2010-02-02 | 2019-03-05 | Sony Corporation | Image processing device, image processing method, and program |
US10810803B2 (en) | 2010-02-02 | 2020-10-20 | Sony Corporation | Image processing device, image processing method, and program |
US9754418B2 (en) * | 2010-02-02 | 2017-09-05 | Sony Corporation | Image processing device, image processing method, and program |
US20160035141A1 (en) * | 2010-02-02 | 2016-02-04 | Sony Corporation | Image processing device, image processing method, and program |
US12002173B2 (en) | 2010-02-02 | 2024-06-04 | Sony Corporation | Image processing device, image processing method, and program |
US10037628B2 (en) | 2010-02-02 | 2018-07-31 | Sony Corporation | Image processing device, image processing method, and program |
US11189105B2 (en) | 2010-02-02 | 2021-11-30 | Sony Corporation | Image processing device, image processing method, and program |
US9729352B1 (en) | 2010-02-08 | 2017-08-08 | Google Inc. | Assisting participation in a social network |
US11394669B2 (en) | 2010-02-08 | 2022-07-19 | Google Llc | Assisting participation in a social network |
US10511652B2 (en) | 2010-02-08 | 2019-12-17 | Google Llc | Recommending posts to non-subscribing users |
US9930096B2 (en) | 2010-02-08 | 2018-03-27 | Google Llc | Recommending posts to non-subscribing users |
US9485285B1 (en) | 2010-02-08 | 2016-11-01 | Google Inc. | Assisting the authoring of posts to an asymmetric social network |
US20110231092A1 (en) * | 2010-03-18 | 2011-09-22 | Sony Corporation | Real-time tracking of digital cameras and wireless capable devices |
KR101674946B1 (en) | 2010-04-16 | 2016-11-22 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR20110115664A (en) * | 2010-04-16 | 2011-10-24 | 엘지전자 주식회사 | Mobile terminal and its control method |
US20110279446A1 (en) * | 2010-05-16 | 2011-11-17 | Nokia Corporation | Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device |
US9916673B2 (en) | 2010-05-16 | 2018-03-13 | Nokia Technologies Oy | Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device |
US20120236104A1 (en) * | 2010-05-18 | 2012-09-20 | Zte Corporation | Mobile Terminal and Method for Informing User's Location |
EP2587792A1 (en) * | 2010-06-28 | 2013-05-01 | LG Electronics Inc. | Method and apparatus for providing the operation state of an external device |
EP2587792A4 (en) * | 2010-06-28 | 2014-06-25 | Lg Electronics Inc | Method and apparatus for providing the operation state of an external device |
US9247142B2 (en) | 2010-06-28 | 2016-01-26 | Lg Electronics Inc. | Method and apparatus for providing the operation state of an external device |
US20120001939A1 (en) * | 2010-06-30 | 2012-01-05 | Nokia Corporation | Methods, apparatuses and computer program products for automatically generating suggested information layers in augmented reality |
US9910866B2 (en) * | 2010-06-30 | 2018-03-06 | Nokia Technologies Oy | Methods, apparatuses and computer program products for automatically generating suggested information layers in augmented reality |
US10026227B2 (en) | 2010-09-02 | 2018-07-17 | The Boeing Company | Portable augmented reality |
US8902254B1 (en) * | 2010-09-02 | 2014-12-02 | The Boeing Company | Portable augmented reality |
US20130196718A1 (en) * | 2010-10-12 | 2013-08-01 | Kyocera Corporation | Portable terminal device |
US9026181B2 (en) * | 2010-10-12 | 2015-05-05 | Kyocera Corporation | Portable terminal device |
US10495790B2 (en) | 2010-10-21 | 2019-12-03 | Lockheed Martin Corporation | Head-mounted display apparatus employing one or more Fresnel lenses |
US10359545B2 (en) | 2010-10-21 | 2019-07-23 | Lockheed Martin Corporation | Fresnel lens with reduced draft facet visibility |
AU2011253646B2 (en) * | 2010-12-07 | 2016-06-30 | Google Llc | Determining message prominence |
AU2011253646A1 (en) * | 2010-12-07 | 2012-06-21 | Google Llc | Determining message prominence |
WO2012078791A1 (en) * | 2010-12-07 | 2012-06-14 | Google Inc. | Determining message prominence |
US8527597B2 (en) * | 2010-12-07 | 2013-09-03 | Google Inc. | Determining message prominence |
US20120143963A1 (en) * | 2010-12-07 | 2012-06-07 | Aleksandr Kennberg | Determining Message Prominence |
US9356901B1 (en) | 2010-12-07 | 2016-05-31 | Google Inc. | Determining message prominence |
US9720228B2 (en) | 2010-12-16 | 2017-08-01 | Lockheed Martin Corporation | Collimating display with pixel lenses |
US9113050B2 (en) | 2011-01-13 | 2015-08-18 | The Boeing Company | Augmented collaboration system |
US20120221552A1 (en) * | 2011-02-28 | 2012-08-30 | Nokia Corporation | Method and apparatus for providing an active search user interface element |
US9973630B2 (en) * | 2011-03-22 | 2018-05-15 | Fmr Llc | System for group supervision |
US20120246222A1 (en) * | 2011-03-22 | 2012-09-27 | David Martin | System for Group Supervision |
US9424579B2 (en) * | 2011-03-22 | 2016-08-23 | Fmr Llc | System for group supervision |
US20170034350A1 (en) * | 2011-03-22 | 2017-02-02 | Fmr Llc | System for Group Supervision |
US8412237B1 (en) * | 2011-07-29 | 2013-04-02 | Intuit Inc. | Method and system for launching and preparing applications on mobile computing systems based on geo-location data |
US9681093B1 (en) * | 2011-08-19 | 2017-06-13 | Google Inc. | Geolocation impressions |
US10956938B2 (en) | 2011-09-30 | 2021-03-23 | Nokia Technologies Oy | Method and apparatus for associating commenting information with one or more objects |
US9639857B2 (en) | 2011-09-30 | 2017-05-02 | Nokia Technologies Oy | Method and apparatus for associating commenting information with one or more objects |
US20130162644A1 (en) * | 2011-12-27 | 2013-06-27 | Nokia Corporation | Method and apparatus for providing perspective-based content placement |
US9672659B2 (en) * | 2011-12-27 | 2017-06-06 | Here Global B.V. | Geometrically and semantically aware proxy for content placement
US9978170B2 (en) | 2011-12-27 | 2018-05-22 | Here Global B.V. | Geometrically and semantically aware proxy for content placement
US20150169568A1 (en) * | 2012-03-16 | 2015-06-18 | Laura Garcia-Barrio | Method and apparatus for enabling digital memory walls |
US9984408B1 (en) * | 2012-05-30 | 2018-05-29 | Amazon Technologies, Inc. | Method, medium, and system for live video cooperative shopping |
JP2015537264A (en) * | 2012-08-27 | 2015-12-24 | Empire Technology Development LLC | Indicating the geographic origin of a digitally-mediated communication
US10535196B2 (en) | 2012-08-27 | 2020-01-14 | Empire Technology Development Llc | Indicating the geographic origin of a digitally-mediated communication |
US9710969B2 (en) | 2012-08-27 | 2017-07-18 | Empire Technology Development Llc | Indicating the geographic origin of a digitally-mediated communication |
WO2014074257A1 (en) * | 2012-11-06 | 2014-05-15 | Ripple Inc | Rendering a digital element |
US9142038B2 (en) | 2012-11-06 | 2015-09-22 | Ripple Inc | Rendering a digital element |
US20140132630A1 (en) * | 2012-11-13 | 2014-05-15 | Samsung Electronics Co., Ltd. | Apparatus and method for providing social network service using augmented reality |
CN103812761A (en) * | 2012-11-13 | 2014-05-21 | Samsung Electronics Co., Ltd. | Apparatus and method for providing social network service using augmented reality
US20140204121A1 (en) * | 2012-12-27 | 2014-07-24 | Schlumberger Technology Corporation | Augmented reality for oilfield |
US20150124043A1 (en) * | 2013-11-01 | 2015-05-07 | Microsoft Corporation | Controlling Display of Video Data |
US9294715B2 (en) * | 2013-11-01 | 2016-03-22 | Microsoft Technology Licensing, Llc | Controlling display of video data |
US9646418B1 (en) | 2014-06-10 | 2017-05-09 | Ripple Inc | Biasing a rendering location of an augmented reality object |
US12008697B2 (en) | 2014-06-10 | 2024-06-11 | Ripple, Inc. Of Delaware | Dynamic location based digital element |
US9619940B1 (en) | 2014-06-10 | 2017-04-11 | Ripple Inc | Spatial filtering trace location |
US11403797B2 (en) * | 2014-06-10 | 2022-08-02 | Ripple, Inc. Of Delaware | Dynamic location based digital element |
US11069138B2 (en) | 2014-06-10 | 2021-07-20 | Ripple, Inc. Of Delaware | Audio content of a digital object associated with a geographical location |
US11532140B2 (en) | 2014-06-10 | 2022-12-20 | Ripple, Inc. Of Delaware | Audio content of a digital object associated with a geographical location |
US12154233B2 (en) | 2014-06-10 | 2024-11-26 | Ripple, Inc. Of Delaware | Audio content of a digital object associated with a geographical location |
US20180144524A1 (en) * | 2014-06-10 | 2018-05-24 | Ripple Inc | Dynamic location based digital element |
US10026226B1 (en) | 2014-06-10 | 2018-07-17 | Ripple Inc | Rendering an augmented reality object |
US10930038B2 (en) * | 2014-06-10 | 2021-02-23 | Lab Of Misfits Ar, Inc. | Dynamic location based digital element |
US11959749B2 (en) * | 2014-06-20 | 2024-04-16 | Profound Positioning Inc. | Mobile mapping system |
US20170227361A1 (en) * | 2014-06-20 | 2017-08-10 | Uti Limited Partnership | Mobile mapping system |
US10684476B2 (en) | 2014-10-17 | 2020-06-16 | Lockheed Martin Corporation | Head-wearable ultra-wide field of view display device |
US9939650B2 (en) | 2015-03-02 | 2018-04-10 | Lockheed Martin Corporation | Wearable display system |
US10754156B2 (en) | 2015-10-20 | 2020-08-25 | Lockheed Martin Corporation | Multiple-eye, single-display, ultrawide-field-of-view optical see-through augmented reality system |
US9836888B2 (en) * | 2016-02-18 | 2017-12-05 | Edx Technologies, Inc. | Systems and methods for augmented reality representations of networks |
US9995936B1 (en) | 2016-04-29 | 2018-06-12 | Lockheed Martin Corporation | Augmented reality systems having a virtual image overlaying an infrared portion of a live scene |
US10931941B2 (en) | 2016-10-04 | 2021-02-23 | Facebook, Inc. | Controls and interfaces for user interactions in virtual spaces |
EP3306445A1 (en) * | 2016-10-04 | 2018-04-11 | Facebook, Inc. | Controls and interfaces for user interactions in virtual spaces |
US10536691B2 (en) | 2016-10-04 | 2020-01-14 | Facebook, Inc. | Controls and interfaces for user interactions in virtual spaces |
US10602133B2 (en) | 2016-10-04 | 2020-03-24 | Facebook, Inc. | Controls and interfaces for user interactions in virtual spaces |
US10165261B2 (en) | 2016-10-04 | 2018-12-25 | Facebook, Inc. | Controls and interfaces for user interactions in virtual spaces |
WO2018098080A3 (en) * | 2016-11-23 | 2020-07-09 | Lab Of Misfits Ar, Inc. | Dynamic location based digital element |
US10152851B2 (en) | 2016-11-29 | 2018-12-11 | Microsoft Technology Licensing, Llc | Notification artifact display |
US20180259958A1 (en) * | 2017-03-09 | 2018-09-13 | Uber Technologies, Inc. | Personalized content creation for autonomous vehicle rides |
US12069017B2 (en) | 2017-08-08 | 2024-08-20 | Snap Inc. | Application-independent messaging system |
US11838252B2 (en) * | 2017-08-08 | 2023-12-05 | Snap Inc. | Application-independent messaging system |
US11354864B2 (en) * | 2018-02-21 | 2022-06-07 | Raziq Yaqub | System and method for presenting location based augmented reality road signs on or in a vehicle |
US20220414173A1 (en) * | 2018-08-23 | 2022-12-29 | Newsplug, Inc. | Geographic location based feed |
KR102308864B1 (en) | 2019-04-25 | 2021-10-06 | Kunsan National University Industry-Academic Cooperation Foundation | Method of Discovering Region of Attractions from Geo-tagged Photos and Apparatus Thereof
KR20210004918A (en) * | 2019-04-25 | 2021-01-13 | Kunsan National University Industry-Academic Cooperation Foundation | Method of Discovering Region of Attractions from Geo-tagged Photos and Apparatus Thereof
US11144760B2 (en) | 2019-06-21 | 2021-10-12 | International Business Machines Corporation | Augmented reality tagging of non-smart items |
US11722442B2 (en) | 2019-07-05 | 2023-08-08 | Snap Inc. | Event planning in a content sharing platform |
US11973728B2 (en) | 2019-07-05 | 2024-04-30 | Snap Inc. | Event planning in a content sharing platform |
US11157089B2 (en) * | 2019-12-27 | 2021-10-26 | Hypori Llc | Character editing on a physical device via interaction with a virtual device user interface |
US20220121293A1 (en) * | 2019-12-27 | 2022-04-21 | Hypori, LLC | Character editing on a physical device via interaction with a virtual device user interface |
US11411900B2 (en) | 2020-03-30 | 2022-08-09 | Snap Inc. | Off-platform messaging system |
US12244549B2 (en) | 2020-03-30 | 2025-03-04 | Snap Inc. | Off-platform messaging system |
US11463867B2 (en) | 2020-04-02 | 2022-10-04 | Cyberschmiede GmbH | Method for transmitting information |
EP3890363A1 (en) * | 2020-04-02 | 2021-10-06 | Cyberschmiede GmbH | Method for transferring information |
US12126588B2 (en) | 2020-04-23 | 2024-10-22 | Snap Inc. | Event overlay invite messaging system |
CN113806644A (en) * | 2021-09-18 | 2021-12-17 | Inventec Appliances (Shanghai) Co., Ltd. | Message processing method, message display method, message processing device, message display device, terminal and storage medium
US11973730B2 (en) | 2022-06-02 | 2024-04-30 | Snap Inc. | External messaging function for an interaction system |
US12278791B2 (en) | 2024-02-28 | 2025-04-15 | Snap Inc. | Event planning in a content sharing platform |
Also Published As
Publication number | Publication date |
---|---|
WO2007076555A3 (en) | 2008-04-17 |
US8280405B2 (en) | 2012-10-02 |
WO2007076555A2 (en) | 2007-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8280405B2 (en) | 2012-10-02 | Location based wireless collaborative environment with a visual user interface |
JP5871976B2 (en) | | Mobile imaging device as navigator |
USRE46737E1 (en) | | Method and apparatus for an augmented reality user interface |
EP2207113B1 (en) | | Automated annotation of a view |
CN111540059B (en) | | Enhanced video system providing enhanced environmental awareness |
US20080114543A1 (en) | | Mobile phone based navigation system |
EP2533214A2 (en) | | Method for providing information on object within view of terminal device, terminal device for same and computer-readable recording medium |
EP1692863B1 (en) | | Device, system, method and computer software product for displaying additional information in association with the image of an object |
CA2662810A1 (en) | | GPS explorer |
Qadeer et al. | | Design and implementation of location awareness and sharing system using GPS and 3G/GPRS |
Ling et al. | | A hybrid RTK GNSS and SLAM outdoor augmented reality system |
Tokusho et al. | | Prototyping an outdoor mobile augmented reality street view application |
JP4710217B2 (en) | | Information presenting apparatus, information presenting method, information presenting system, and computer program |
JP4733343B2 (en) | | Navigation system, navigation device, navigation method, and navigation program |
US8159337B2 (en) | | Systems and methods for identification of locations |
KR20060117140A (en) | | Service for providing constellation information using a mobile communication terminal |
Matos et al. | | A GPS-based mobile coordinated positioning system for firefighting scenarios |
JP2019045958A (en) | | Spot information display system |
Fröhlich et al. | | Adding space to location in mobile emergency response technologies |
US8599066B1 (en) | | System, method, and apparatus for obtaining information of a visually acquired aircraft in flight |
KR20060064458A (en) | | Position detection device and method |
Forward et al. | | Overarching research challenges |
Santos et al. | | A navigation and registration system for mobile and augmented environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: AECHELON TECHNOLOGY, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SANZ-PASTOR, IGNACIO;MORGAN, DAVID L., III;CASTELLAR, JAVIER;SIGNING DATES FROM 20070220 TO 20070401;REEL/FRAME:019149/0617 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2552); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 8 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2553); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 12 |
|
AS | Assignment |
Owner name: PENNANTPARK LOAN AGENCY SERVICING, LLC, AS ADMINISTRATIVE AGENT, FLORIDA Free format text: SECURITY INTEREST;ASSIGNOR:AECHELON TECHNOLOGY, INC.;REEL/FRAME:068315/0560 Effective date: 20240816 |