US20110163944A1 - Intuitive, gesture-based communications with physics metaphors
- Publication number: US20110163944A1 (application Ser. No. 12/652,719)
- Authority: US (United States)
- Prior art keywords: data, interface, gesture, animating, motion
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/06—Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/64—Details of telephonic subscriber devices file transfer between terminals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/06—Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]; Services to user groups; One-way selective calling services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W76/00—Connection management
- H04W76/10—Connection setup
Definitions
- This disclosure relates generally to communications, and more particularly, to data transfer between devices.
- When an individual performs an action in a real-world, physical environment, the individual experiences various physical phenomena that indicate that the task is being performed or has been completed. For example, if an individual pours objects from a first container into a second container, the individual can observe the objects reacting to the forces of friction and gravity. If the objects have different shapes and masses, then the individual would observe different reactions to the forces.
- A user can make an intuitive, physical gesture with a first device, which can be detected by one or more onboard motion sensors.
- The detected motion triggers an animation having a “physics metaphor,” where an object appears to react to forces in a real-world, physical environment.
- The first device detects the presence of a second device, and a communication link is established allowing a transfer of data represented by the object to the second device. During the transfer, the first device can animate the object to simulate the object leaving the first device, and the second device can animate the object to simulate the object entering the second device.
- In response to an intuitive gesture made on a touch sensitive surface of a first device, or in response to physical movement of the device, an object can be transferred or broadcast to other devices or a network resource based on a direction, velocity, or speed of the gesture.
- Users can transfer files and other data between devices using intuitive gestures combined with animation based on physics metaphors. Users can transfer files to a network using intuitive physical gestures. Users can broadcast files and other data to other devices using intuitive interface or physical gestures.
- FIGS. 1A-1C illustrate an exemplary intuitive, gesture-based data transfer between two devices using animation based on a physics metaphor.
- FIG. 2 illustrates initiation of an exemplary communications session with a device in response to an interface gesture.
- FIG. 3 illustrates initiation of an exemplary data broadcast from a device to multiple devices in response to a physical gesture.
- FIG. 4 illustrates an exemplary data transfer between two devices in response to intuitive, physical gestures.
- FIG. 5 illustrates an exemplary physical gesture for initiating a communications session with a network.
- FIG. 6 is a flow diagram of an exemplary process for using intuitive, physical gestures to initiate a communications session between devices.
- FIG. 7 is a flow diagram of an exemplary process for using intuitive, interface gestures to initiate a communications session between devices.
- FIG. 8 is a block diagram of an exemplary network operating environment for a device implementing the features and operations described in reference to FIGS. 1-7 .
- FIG. 9 is a block diagram illustrating an exemplary device architecture of a device implementing the features and operations described in reference to FIGS. 1-7 .
- FIGS. 1A-1C illustrate an exemplary intuitive, gesture-based communication between two devices using animation based on a physics metaphor.
- Devices 110, 120 are shown in close proximity and having a relative orientation.
- Devices 110 , 120 can be any electronic device capable of displaying information and communicating with other devices, including but not limited to: personal computers, handheld devices, electronic tablets, Personal Digital Assistants (PDAs), cellular telephones, network appliances, cameras, smart phones, network base stations, media players, navigation devices, email devices, game consoles, automotive informatics systems (e.g., a dashboard, entertainment system) and any combination of these devices.
- In the example shown, device 110 is a handheld device and device 120 is an electronic tablet.
- Devices 110 , 120 can include respective interfaces 112 , 122 for displaying graphical objects, such as icons representing files, folders or other content.
- Interfaces 112, 122 can be touch sensitive surfaces that are responsive to touch and touch gesture input.
- Interface 112 is shown displaying a collection of graphical objects 114 a - 114 f (e.g., file icons) representing files stored on device 110 .
- The user can select one or more files for transfer to one or more devices by placing the files into a transfer state.
- In this example, the user has selected four files for transfer by touching their respective objects 114a-114d for a predetermined amount of time and/or using a predetermined amount of pressure during the touch.
- The user can also select a group of files for transfer by drawing a circle around the icons with a finger or stylus, then using a touch, gesture, or other input to select the group of files for transfer.
- Alternatively, the user can drag and drop individual files onto a container object (e.g., a “suitcase” icon) displayed on interface 112, and then use a touch, gesture, or other input to select the container of files for transfer.
- Other means for selecting individual files or groups of files for transfer are also possible, including but not limited to selecting files through menus or other conventional user interface elements.
- The selected objects 114a-114d can be detached from interface 112 and allowed to freely “float” on interface 112.
- The boundaries of interface 112 can be configured to behave like “bumpers” during device motion, such that floating objects 114a-114d bounce off the boundaries of interface 112 while objects 114e and 114f remain fixed to interface 112.
- FIG. 1B illustrates device 110 in motion relative to device 120 .
- Device 110 can be equipped with one or more motion sensors (not illustrated) that detect when device 110 is moved.
- Motion sensors can include but are not limited to accelerometers, gyroscopes and magnetometers.
- In FIG. 1B, the user is holding device 110 directly over interface 122 and has made a physical gesture with device 110.
- A physical gesture can be any gesture that moves a device or changes the orientation of a device.
- Here, the user has rotated device 110 above interface 122 in a manner similar to tipping a glass of water. This angular motion can be detected by one or more onboard motion sensors.
- Detached objects 114a-114d can be animated to simulate the effect of gravity by “sliding” toward the lowermost portion of interface 112 as device 110 is rotated.
- The animation of the objects creates the appearance that the objects have mass and are reacting to forces of a real-world, physical environment.
- Selected objects 114a-114d, being detached from interface 112, can slide until they touch boundaries 116a or 116c of interface 112.
- Objects 114e and 114f, being fixed to interface 112, can remain in their original positions on interface 112.
- FIG. 1C illustrates devices 110 , 120 executing an intuitive, gesture-based file transfer.
- In FIG. 1C, the user has rotated device 110 relative to interface 122 such that boundary 116d of interface 112 is substantially parallel with interface 122.
- A graphics engine onboard device 110 animates selected objects 114a-114d to simulate the movement of objects 114a-114d under the forces of gravity and friction.
- For example, selected objects 114a-114d can be animated to slide toward an intersecting corner of boundaries 116a, 116d of interface 112.
- Device 110 can interpret the rotation of device 110 (e.g., a “pouring” action) as an indication of the user's intent to transfer the files represented by selected objects 114 a - 114 d.
- Device 110 then determines whether device 120 is present and available to receive the files.
- Device 110 can use onboard short-range communication technology, such as Bluetooth or Radio Frequency Identification (RFID), to detect the presence of device 120.
- In this example, device 110 has files in a transfer state and detects the presence of device 120. If device 120 is within a predetermined range of device 110, then device 110 can attempt to establish a communications link 130 with device 120. After a link is established and authenticated, device 110 can request that device 120 accept a file transfer. Upon an acknowledgement of acceptance from device 120, device 110 can transfer the files represented by objects 114a-114d to device 120 using known communication protocols.
- Icons representative of the transferred data can appear on interface 122 of device 120.
- For example, icon 114c can appear on interface 122 and be animated by a graphics engine on device 120 to change in size or appearance (e.g., grow, fill, materialize) as the data represented by object 114c is received by device 120.
- Device 120 can animate objects 114a-114d on interface 122 so that they appear to react to gravity, friction or drag, momenta, torques, accelerations, centripetal forces, or any other force found in a real-world, physical environment.
- For example, transferred files can appear to “drop” onto device 120 at a point directly below device 110 and then spread out onto interface 122 to simulate sand or liquid being poured onto a surface having friction or a viscous drag.
- The rate at which each object moves on interface 122 can be based on the size or “mass” of the file represented by the object. Larger files that have more “mass” can have their objects animated to move more slowly in interface 122, and smaller files that have less “mass” can have their objects animated to move more quickly in interface 122.
- Object 114c can be detached from interface 122 so that it appears to “float” on interface 122.
- The user can accept the data represented by icon 114c by providing an interface or physical gesture of device 120 or by other input means. Upon detection of the input, object 114c can be fixed to interface 122 to visually indicate to the user the acceptance of the data.
- The order of data transfer can be determined by the arrangement of objects 114a-114d in interface 112.
- For example, object 114c, which is closest to a virtual opening 117 in interface 112, can have its corresponding data transferred first.
- Objects corresponding to larger files can be animated to move slowly to virtual opening 117, and smaller icons can be animated to move more quickly to virtual opening 117, enabling smaller files to be transferred first rather than being bottlenecked behind a larger file that can take a long time to transfer, as in the sketch below.
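- A minimal sketch of this ordering, under assumed names (TransferItem and transferOrder are illustrative, not from the patent): each object approaches virtual opening 117 at a speed inversely proportional to its file size, so its arrival time grows with distance times size, and the transfer queue is sorted accordingly.

```swift
// Hypothetical ordering of the transfer queue: speed ∝ 1 / size,
// so arrival time at the virtual opening ∝ distance * size.
struct TransferItem {
    let fileName: String
    let sizeBytes: Double          // simulated "mass"
    let distanceToOpening: Double  // on-screen distance to opening 117
}

func transferOrder(_ items: [TransferItem]) -> [TransferItem] {
    items.sorted { a, b in
        a.distanceToOpening * a.sizeBytes < b.distanceToOpening * b.sizeBytes
    }
}
```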
- Data transfer can be represented by animating objects 114a-114d to simulate a variety of real-world physics.
- For example, object 114c can be animated on interface 112 to appear distorted around virtual opening 117 to simulate water going down a drain, sand flowing through an hourglass, or a genie being pulled into a bottle (a.k.a. “the genie effect”).
- Alternatively, the animation can simulate object 114c dissolving like a tablet in water or dematerializing.
- Other animations are possible to convey to the user that data are being emptied from device 110 onto interface 122 .
- Data transfer can also be represented or accompanied by audible feedback, such as the sound of liquid pouring, a tablet fizzing, gas passing through a valve, a sci-fi teleporter, or another sound that audibly represents the transfer of material from one point to another.
- The speed of animation or the pitch of sounds associated with a data transfer can be determined from the speed of the data transfer.
- For example, data transfers using a high-bandwidth communications link 130 can be animated as “pouring” out of device 110 more quickly than a data transfer occurring over a lower-bandwidth connection.
- The speed of data transfer can also be at least partly determined by the orientation of device 110.
- The data transfer rate, and the speed of associated animations, can change based on the orientation or distance of device 110 relative to interface 122. For example, if device 110 is oriented as shown in FIG. 1B, the data transfer rate over communication link 130 can be slower than the data transfer rate if device 110 were oriented as shown in FIG. 1C. In this example, if device 110 is oriented to a substantially upright position (e.g., an orientation opposite to the orientation shown in FIG. 1C), the data transfer will stop; a sketch of such tilt-based throttling follows.
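- As an illustrative sketch (the linear mapping and names are assumptions, not specified by the patent), the transfer rate can be scaled between a full-rate “pouring” orientation and a stopped upright orientation:

```swift
import Foundation

/// Hypothetical tilt-based throttle. `tiltRadians` is the angle between
/// the device's "pouring" orientation (FIG. 1C) and its current
/// orientation: 0 = fully pouring (full rate), .pi = upright (stopped).
func transferRate(maxBytesPerSec: Double, tiltRadians: Double) -> Double {
    let clamped = min(max(tiltRadians, 0), .pi)
    return maxBytesPerSec * (1 - clamped / .pi)
}
```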
- In FIGS. 1A-1C, selected objects 114a-114d are represented as substantially solid objects, but other representations of the data corresponding to the icons can also be used.
- For example, objects 114a-114d can be animated to “melt” into a simulated liquid that collects at boundary 116c of interface 112.
- Multiple selected icons can then be represented as stratified layers of liquid that can be “poured” out of device 110.
- The volume of a given stratum can be indicative of the amount of data it represents.
- A liquefaction and stratification metaphor can also be used to determine the order in which data are transferred.
- For example, the first file selected can remain as the bottommost stratum as device 110 is rotated, such that the first selected file “flows” into the bottommost position of interface 112 in FIG. 1C and becomes the first file to flow out of device 110.
- As data is transferred, the thickness of each stratum on interface 112 can shrink to represent the shrinking amount of data that remains to be transferred.
- FIG. 2 illustrates initiation of a communications session with a device in response to an interface gesture.
- Devices 210, 220, 230 can be proximate to each other, such as in the same room.
- Each of devices 210, 220, 230 can be a device, for example, like devices 110 or 120 described above with reference to FIGS. 1A-1C.
- Devices 210, 220, 230 can be equipped with short-range communication systems (e.g., Bluetooth) that allow each device to scan the room and sense the presence of other devices.
- Each of the devices can include motion sensors, which allow the devices to maintain a local reference coordinate frame.
- Each of the devices can also include a positioning system (e.g., a GPS receiver).
- In FIG. 2, the user has drawn a graphical object 240 (e.g., a note) on interface 250 of device 210.
- The user can input a request to transmit data (e.g., copy data) represented by graphical object 240 to device 220 using touch gesture input to interface 250 (hereinafter also referred to as an “interface gesture”).
- For example, the user can touch graphical object 240 to select it, and then make a “swipe” or “flick” gesture on interface 250 with one or more fingers in the direction of device 220.
- Device 210 senses the interface gesture input interacting with graphical object 240 and interprets the gesture as a request to transmit data represented by graphical object 240 to another device.
- Before receiving the data transfer request, device 210 can scan the room for the presence of other devices. In this example, devices 220 and 230 are detected. If communication has not been established, device 210 can establish communication with devices 220, 230. In the simplest case, the user of device 210 can manually select one or more devices for data transfer from a list of devices that were detected in the scan (e.g., devices 220, 230). Upon receiving the “swipe” or “flick” gesture requesting data transfer, the data can be transferred to the selected device(s).
- Alternatively, device 210 can request position data from devices 220 and 230.
- Devices 220, 230 can send their position vectors in an inertial reference coordinate frame shared by devices 210, 220, 230.
- For example, devices 220, 230 can send their respective position vectors in the well-known Earth Centered Earth Fixed (ECEF) Cartesian coordinate frame.
- The position vectors can be obtained from positioning systems onboard devices 220, 230.
- Device 210 can compute line-of-sight (LOS) vectors from device 210 to each target device 220, 230 in ECEF coordinates.
- The LOS vectors can then be transformed into a display coordinate frame for device 210 using coordinate transformations.
- Device 210 can perform the following coordinate transformations for each LOS vector:
- $\vec{L}_{ECEF} = \vec{R}_{T\_ECEF} - \vec{R}_{S\_ECEF}$,  [1]
- $\vec{L}_{Display} = T^{Display}_{Device} \cdot T^{Device}_{ECEF} \cdot \vec{L}_{ECEF}$,  [2]
- where $\vec{L}_{ECEF}$ is the LOS vector from device 210 to device 220 or 230 in ECEF coordinates,
- $\vec{R}_{S\_ECEF}$, $\vec{R}_{T\_ECEF}$ are the position vectors of device 210 (source) and device 220 or 230 (target), respectively, in ECEF coordinates, and
- $\vec{L}_{Display}$ is the LOS vector from device 210 to device 220 or 230 in the display coordinates of device 210.
- The display coordinate frame of device 210 is a two-dimensional Cartesian coordinate frame in which the display of device 210 is defined in FIG. 2 as an x-y plane.
- The LOS vectors $\vec{L}_{220}$, $\vec{L}_{230}$ to devices 220, 230, respectively, are shown in the x-y plane.
- A vector $\vec{G}$ representing the direction of the interface gesture made towards device 220 is also shown in the x-y plane, in display coordinates.
- The vector $\vec{G}$ can be determined in the x-y plane by an onboard touch model based on raw touch sensor data (e.g., capacitive touch data).
- A dot product can be taken between the $\vec{G}$ vector and each of the LOS vectors $\vec{L}_{220}$, $\vec{L}_{230}$ in the x-y plane.
- The LOS vector that forms the smallest angle $\theta$ with the $\vec{G}$ vector (in this case $\theta_1$) determines the device to receive the data transfer, where the angle for each LOS vector $\vec{L}$ is given by
- $\theta = \cos^{-1}\left( \dfrac{\vec{G} \cdot \vec{L}}{\|\vec{G}\|\,\|\vec{L}\|} \right)$.  [3]
- The above technique can be used when position errors are small and there is sufficient angular separation between the communicating devices to ensure an accurate computation of $\theta$.
- Other techniques for determining the target device can also be used.
- Alternatively, the user can physically point device 210 at device 220 or device 230 to indicate which device will receive the data transfer.
- In this case, the LOS vectors can be transformed into device coordinates (without transforming into display coordinates), and equation [3] can be applied by replacing the gesture vector $\vec{G}$ with the device axis that is pointing in the direction of the target device, which in this example is the x-axis shown in FIG. 2.
- The LOS vector that forms the smallest angle $\theta$ with the $\vec{x}$ axis determines the device to receive the data transfer.
- Thus, a user can use equations [1] through [3] to indicate a target device for data transfer using either an interface gesture in the direction of the desired target device 220, 230 or a physical gesture made by physically pointing device 210 at the desired target device 220, 230; a code sketch of this selection follows.
- Multiple target devices can also be selected for a broadcast-style data transfer using equations [1] through [3], as described with reference to FIG. 3.
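- A minimal sketch of equations [1] through [3] under assumed names (Vec3, PeerDevice, and selectTarget are illustrative, and the toDisplay callback stands in for the rotation chain $T^{Display}_{Device} \cdot T^{Device}_{ECEF}$ of equation [2]):

```swift
import Foundation

struct Vec3 {
    var x, y, z: Double
    static func - (a: Vec3, b: Vec3) -> Vec3 {
        Vec3(x: a.x - b.x, y: a.y - b.y, z: a.z - b.z)
    }
    func dot(_ o: Vec3) -> Double { x * o.x + y * o.y + z * o.z }
    var norm: Double { sqrt(dot(self)) }
}

struct PeerDevice {
    let name: String
    let positionECEF: Vec3  // R_T_ECEF, reported by the peer's positioning system
}

// Equation [3]: angle between two vectors via the dot product.
func angle(_ a: Vec3, _ b: Vec3) -> Double {
    acos(max(-1, min(1, a.dot(b) / (a.norm * b.norm))))
}

// Equations [1]-[3]: form each LOS vector in ECEF, rotate it into
// display coordinates, and pick the peer whose LOS vector makes the
// smallest angle with the gesture vector G.
func selectTarget(gesture g: Vec3,
                  sourceECEF: Vec3,
                  peers: [PeerDevice],
                  toDisplay: (Vec3) -> Vec3) -> PeerDevice? {
    peers.min(by: { a, b in
        angle(g, toDisplay(a.positionECEF - sourceECEF))
            < angle(g, toDisplay(b.positionECEF - sourceECEF))
    })
}
```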
- Graphical object 240 can be animated in response to the gesture to simulate a physics metaphor.
- For example, graphical object 240 can be animated to simulate the effects of momentum, friction, viscosity, or other aspects of Newtonian mechanics, such that graphical object 240 continues to move along its trajectory after the gesture ends. Simulated friction or viscosity can slow the movement of graphical object 240 as it travels along its trajectory.
- The edges of interface 250 may partly resist the motion of graphical object 240 when the two come into contact.
- For example, the user may have to flick graphical object 240 with a velocity sufficient to overcome a simulated repelling force at edge 253 of interface 250.
- Examples of repelling forces include, but are not limited to, gravity and friction as provided by a simulated speed bump or bubble wall: an object with sufficient speed rolls over the bump or breaks through the bubble wall, while an object with insufficient speed rolls or bounces back.
- A gesture imparting sufficient velocity or speed to graphical object 240 can indicate an intent to perform a data transfer to another device.
- A gesture imparting insufficient velocity can result in graphical object 240 rebounding off the edge of interface 250 with no transfer of data.
- This behavior can help device 210 distinguish between gestures intended to reposition graphical object 240 within interface 250 and gestures intended to communicate the data corresponding to graphical object 240 to another device.
- The speed of the gesture can determine the speed of graphical object 240: faster gestures result in higher velocities than slower gestures, as in the sketch below.
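- A compact sketch of this threshold behavior; the names and the escape threshold value are assumptions for illustration, not values from the patent:

```swift
enum FlickOutcome {
    case transfer    // object escapes the edge: initiate the data transfer
    case reposition  // object rebounds: treat as an in-screen move
}

/// Classify a flick by whether its speed overcomes the simulated
/// repelling force at the interface edge.
func classifyFlick(speedPointsPerSec: Double,
                   escapeThreshold: Double = 900) -> FlickOutcome {
    speedPointsPerSec >= escapeThreshold ? .transfer : .reposition
}
```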
- The target devices can initiate an animation that simulates the receipt of data using a physics metaphor.
- For example, when device 220 starts to receive data from device 210, device 220 can display animated graphical objects on interface 270 representing the data entering device 220.
- The graphical objects can be detached from interface 270 so that the objects “float.”
- The user can provide an interface gesture or physical gesture to indicate acceptance of the data.
- Upon the user's acceptance of the data through a gesture or by other means, the floating objects can become fixed to interface 270 to visually indicate acceptance of the data to the user.
- FIG. 3 illustrates initiation of a data broadcast from a device to multiple devices in response to a physical gesture.
- Devices 310 , 320 , 325 , 330 are located in proximity to each other.
- Devices 310 , 320 , 325 , 330 can be, for example, devices similar to devices 110 or 120 of FIGS. 1A-1C .
- Device 330 can be a computer enabled display device, such as an electronic tablet, computer monitor, projection screen, electronic whiteboard, teleconferencing screen, television, or other type of device that can display information.
- In FIG. 3, the user has selected graphical object 340 (a file icon) to indicate an intention to perform a data transfer action.
- Device 310 is also shown in a rotational or sweeping motion due to the user performing a clockwise (or counterclockwise) rotational or sweeping gesture that emulates a toss of a Frisbee®.
- Motion sensors onboard device 310 sense this physical gesture and interpret it as indicating the user's intent to broadcast the data represented by graphical object 340 to devices 320, 325, 330.
- In response, device 310 establishes communications link 350 (e.g., a bidirectional Bluetooth link) with devices 320, 325, 330 and transmits the data corresponding to graphical object 340 to devices 320, 325, 330.
- Devices 320, 325, 330 can display graphical object 340 on their respective displays 360.
- The graphical object can be detached from the interface or otherwise modified to indicate that the data has not yet been accepted by the user of the device.
- The users of devices 320, 325, 330 can provide gesture input or other input to accept the data.
- Upon acceptance, icon 340 can be fixed to the interface or otherwise modified to indicate that the data has been accepted onto the device.
- FIG. 4 illustrates a data transfer between two devices in response to intuitive, physical gestures.
- For example, device 410 can be a handheld personal digital assistant and device 420 can be an electronic tablet.
- Other devices are also possible.
- Device 420 can include display 430 that can display graphical objects 432 , 434 , and 436 (e.g., file icons) representing electronic files or other electronic data stored in device 420 .
- In FIG. 4, the user has selected object 436 to indicate an intent to perform one or more actions upon the data corresponding to icon 436.
- In this example, the user intends to request that data corresponding to icon 436 be transferred from device 420 to device 410.
- The user indicates an intent to transfer data by placing device 410 in position and orientation 440a relative to device 420, and then moving device 410 across display 430 to position and orientation 440b.
- The gesture just described can be a metaphor for the user holding and using device 410 as a scraper or vacuum to “scrape” or “vacuum” data or files off interface 430 and onto device 410.
- Device 410 detects the orientation and motion from location 440 a to location 440 b , and interprets the orientation and motion as a physical gesture indicating the user's intent to receive data from device 420 .
- The orientation can be detected by monitoring one or more angles between axes fixed to the device and a local-level, instantaneous coordinate frame determined, for example, by a gravitational acceleration vector computed from the output of an onboard accelerometer and a North-pointing vector computed from the output of an onboard magnetometer; a sketch follows.
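- A hedged sketch of the gravity-based part of this monitoring; the axis conventions and names are assumptions, and a production implementation would also fuse the magnetometer's North vector for heading:

```swift
import Foundation

struct Orientation {
    let pitch: Double  // rotation about the device x-axis, in radians
    let roll: Double   // rotation about the device y-axis, in radians
}

/// `g` is the quasi-static accelerometer output in device axes
/// (x right, y up, z out of the screen); standard tilt formulas.
func orientation(fromGravity g: (x: Double, y: Double, z: Double)) -> Orientation {
    let pitch = atan2(g.y, sqrt(g.x * g.x + g.z * g.z))
    let roll = atan2(-g.x, g.z)
    return Orientation(pitch: pitch, roll: roll)
}
```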
- The presence of device 420 is detected, and if communication is not already established, device 410 can establish a wireless communications link 450 with device 420.
- Device 410 can then request that device 420 transmit any selected data, such as the data corresponding to selected icon 436.
- The data can be selected by a user of device 420 as described in reference to FIGS. 1A-1C and 2.
- Device 420 transmits the data over link 450 to device 410 .
- Graphical object 436 can appear on interface 452 of device 410 to visually indicate the receipt of the selected data on device 410 .
- Graphical object 436 can initially be detached from interface 452 until the user of device 410 provides a gesture input or other input means to accept the data. Upon acceptance, graphical object 436 can become fixed to interface 452 .
- Other visual or audio feedback can also be provided to indicate user acceptance of the data.
- FIG. 5 illustrates an example physical gesture for initiating a communications session with a network.
- The physical gesture is used to indicate that the user wishes to initiate a communication session with a network resource, such as a network server.
- For example, the user can lift a handheld device skyward in a gesture that symbolizes uplifting a torch.
- In response, the device initiates the communication session with the network resource.
- In FIG. 5, device 510 includes interface 512, which displays graphical objects 514a-514f (e.g., file icons).
- Device 510 can be, for example, one of the example devices described above with reference to FIGS. 1A-1C .
- A user selects one or more objects displayed on device 510, for example, objects 514b and 514d.
- The user then moves device 510 from a first position and orientation 520a to a second position and orientation 520b.
- Device 510 uses internal position and orientation sensors to detect this motion and determine that the motion is a gesture indicative of an intent to upload data represented by objects 514 b and 514 d to a remote server (not shown).
- For example, an onboard accelerometer can monitor for a large acceleration opposing gravitational acceleration to determine that a gesture was made indicating a request to upload data to a network resource.
- The user can first put the device in a transfer state using touch or other input so that the acceleration can be interpreted as a gesture and not attributed to another source of acceleration, such as an acceleration generated when a user of the device is on an elevator; a sketch of this gated detection follows.
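- An illustrative sketch of this arm-then-detect flow; the threshold, sample window, and names are hypothetical:

```swift
struct LiftDetector {
    var armed = false               // set when the user places files in a transfer state
    private var sustainedSamples = 0
    let thresholdMetersPerSec2 = 3.0
    let requiredSamples = 10        // e.g., 100 ms of samples at 100 Hz

    /// `verticalAccel` is acceleration along the local "up" axis with
    /// gravity removed; positive means the device is accelerating skyward.
    mutating func ingest(verticalAccel: Double) -> Bool {
        guard armed else { return false }
        sustainedSamples = verticalAccel > thresholdMetersPerSec2 ? sustainedSamples + 1 : 0
        if sustainedSamples >= requiredSamples {
            armed = false           // fire once, then require re-arming
            return true             // interpret as an upload gesture
        }
        return false
    }
}
```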
- Wireless communications link 530 can be, for example, a cellular, WiFi, WiMax, satellite, or other wireless communications link to network 540 , which can be a cellular telephone data network, a private, commercial, or public WiFi or WiMax access network, a satellite network or other wireless communications network.
- Device 510 then transmits the data corresponding to selected objects 514b and 514d to the network resource through network 540.
- In the examples above, user gestures are described in terms of physical gestures and interface gestures. Other types of user gestures can also be used.
- For example, a user can initiate transmission of a selected file by generally aligning the device with the target device and then blowing air across the display of the device.
- One or more microphones on the device can detect the sound of moving air and the direction of airflow. The direction of airflow can be used to infer the intent of the user to identify a target device.
- As another example, a sending device can be held over a receiving device as shown in FIG. 1C, and a touch sensitive surface (e.g., interface 112) on the sending device can be tapped to cause items to be transferred to the receiving device over a wireless communication link.
- Each tap can cause one item to transfer.
- This gesture can be analogized to tapping a Ketchup™ bottle to get the Ketchup to flow.
- FIG. 6 is a flow diagram of an example process for using intuitive, physical gestures to initiate a communications session between devices.
- Process 600 can be performed by one or more devices, for example, one or more of the devices described above with reference to FIGS. 1-5 . Therefore, for convenience, process 600 is described with reference to a device that performs process 600 .
- The device presents an object on an interface of the device (605).
- The object can be, for example, an icon or other representation of content.
- The interface can be a touch sensitive surface that is responsive to gesture inputs.
- The device determines whether the device is in motion (610). Examples of device motion can include changes in device position or orientation, such as tilting, shaking, rotating, spinning, shifting, or combinations of these or other motions. If the device is not in motion, then the device continues to present the object on the interface. If the device is in motion, then the device animates the object to simulate real-world physical behavior (615).
- For example, the device can animate a graphical representation of the content object (e.g., an icon) to make the object appear to slide, ricochet, vibrate, bounce, or perform other reactions to forces based on Newtonian mechanics corresponding to the detected motion.
- The device detects one or more other devices (620). The detection can be accomplished using short-range communication technology such as Bluetooth scanning. The device then determines whether the motion is indicative of a peer user gesture (625). A “peer user” gesture is a gesture that suggests a user's intent to transfer data from a first device to a second device (one to one). The data can be stored by the device or accessible by the device (e.g., stored on another device in communication with the device). If the motion indicates a peer user gesture, the device transmits the data represented by the object to the second device (630).
- Otherwise, the device can determine whether the motion is indicative of a broadcast user gesture (640).
- A “broadcast user gesture” is a gesture that suggests a user's intent to cause a device to transfer data to multiple recipient devices simultaneously (one to many). If so, the device broadcasts the data represented by the object to the multiple recipient devices. If not, the device continues to present the object on the user interface (605). A compact sketch of this classification follows.
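- A compact sketch of the routing in process 600 under assumed type names (MotionGesture, handleMotion, and the string-keyed send callback are illustrative):

```swift
enum MotionGesture {
    case peer       // e.g., a "pouring" motion toward one device
    case broadcast  // e.g., a sweeping "Frisbee toss" motion
    case none
}

func handleMotion(_ gesture: MotionGesture,
                  data: [UInt8],
                  nearbyDevices: [String],
                  send: (String, [UInt8]) -> Void) {
    switch gesture {
    case .peer:
        // Step 630: transmit to the single target peer, if one was detected.
        if let target = nearbyDevices.first { send(target, data) }
    case .broadcast:
        // Broadcast branch: transmit to every detected device (one to many).
        nearbyDevices.forEach { send($0, data) }
    case .none:
        break  // Continue presenting the object (605).
    }
}
```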
- FIG. 7 is a flow diagram of an example process for using intuitive, interface gestures to initiate a communications session between devices.
- Process 700 can be performed by one or more devices, for example, one or more of the devices described above with reference to FIGS. 1-5. Therefore, for convenience, process 700 is described with reference to a device that performs the process.
- The device presents an object on the device's interface (710).
- The device determines whether a user is manipulating the object on the interface (720), for example, using one or more interface gestures such as tapping, clicking, dragging, flicking, pinching, stretching, encircling, rubber banding, or other actions that can be performed to manipulate objects such as icons displayed on a user interface. If the user is not manipulating the content object on the user interface, then the device continues to present the object.
- If the user is manipulating the object, the device animates the object to simulate real-world physical behavior (730). For example, as the user drags the content object, a simulated momentum can be imparted upon the object, such that the object will initially appear to resist the motion in accordance with Newtonian mechanics. Similarly, the object can continue moving after it has been released, according to Newtonian mechanics.
- The device can also simulate the effects of friction upon the motion of the object, e.g., to dampen and eventually halt the object's movement.
- The object can be animated according to a simulated mass that can be dependent upon the size of the data represented by the content object. For example, the icon of a large data file can be animated to respond more slowly to user manipulation and changes in its simulated momentum, to simulate heaviness.
- The device detects the presence of other devices (740).
- Properties of the user's manipulation can be combined with information about the device's position and orientation to determine the intent of the user's manipulation of the object. For example, the direction in which the object is swiped or flicked across an interface by the user's finger can be combined with the device's detected orientation to determine the target device for receiving data from among the possible target devices detected by the device in step 740.
- The device determines whether the user's manipulation of the object is indicative of a peer user gesture (750). If so, the device transmits the data represented by the object to the second device (760). Otherwise, the device determines whether the manipulation is indicative of a broadcast user gesture (770); if so, the device broadcasts the data represented by the object so that it can be received by multiple other devices (780). Otherwise, the device continues to present the object on the user interface (710). A sketch of the simulated-mass animation step follows.
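- A minimal sketch of the simulated-mass animation (step 730); the one-dimensional simplification, constants, and names are illustrative assumptions:

```swift
struct PhysicsObject {
    var position: Double = 0       // one-dimensional for brevity
    var velocity: Double = 0
    let dataSizeBytes: Double
    var mass: Double { max(1, dataSizeBytes / 1_000_000) }  // ~1 unit per MB
}

/// Advance the object by one frame: F = m*a with a velocity-proportional
/// friction term, so larger files respond more sluggishly and motion
/// damps out after release (appliedForce = 0).
func step(_ obj: inout PhysicsObject, appliedForce: Double,
          dt: Double, friction: Double = 2.0) {
    let acceleration = (appliedForce - friction * obj.velocity) / obj.mass
    obj.velocity += acceleration * dt
    obj.position += obj.velocity * dt
}
```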
- FIG. 8 is a block diagram of an example network operating environment for a device implementing the features and operations described in reference to FIGS. 1-7.
- Devices 802a and 802b can communicate over one or more wired or wireless networks 810.
- For example, wireless network 812 (e.g., a cellular network) can communicate with a wide area network (WAN) 814, such as the Internet, by use of gateway 816.
- Likewise, access device 818, such as an 802.11g wireless access device, can provide communication access to wide area network 814.
- In some implementations, both voice and data communications can be established over wireless network 812 and access device 818.
- For example, device 802a can place and receive phone calls (e.g., using VoIP protocols), send and receive e-mail messages (e.g., using POP3 protocol), and retrieve electronic documents and/or streams, such as web pages, photographs, and videos, over wireless network 812, gateway 816, and wide area network 814 (e.g., using TCP/IP or UDP protocols).
- Likewise, device 802b can place and receive phone calls, send and receive e-mail messages, and retrieve electronic documents over access device 818 and wide area network 814.
- In some implementations, devices 802a or 802b can be physically connected to access device 818 using one or more cables, and access device 818 can be a personal computer. In this configuration, device 802a or 802b can be referred to as a “tethered” device.
- Devices 802 a and 802 b can also establish communications by other means.
- For example, wireless device 802a can communicate with other wireless devices, e.g., other devices 802a or 802b, cell phones, etc., over wireless network 812.
- Likewise, devices 802a and 802b can establish peer-to-peer communications 820, e.g., a personal area network, by use of one or more communication subsystems, such as a Bluetooth™ communication device.
- Other communication protocols and topologies can also be implemented.
- Devices 802a or 802b can communicate with one or more services over one or more wired and/or wireless networks 810. These services can include, for example, location services 830, messaging services 840, media services 850, syncing services 860, and social networking services 870.
- Location services 830 can provide location-based services to devices 802a and 802b.
- Messaging services 840 can provide email, text message, and other communication services.
- Media services 850 can provide online stores for downloading content to devices 802 a , 802 b , such as music and electronic books.
- Syncing services 860 can provide network based syncing services for syncing content stored on user devices.
- Social networking service 870 can provide online communities where users can share content.
- Device 802 a or 802 b can also access other data and content over one or more wired and/or wireless networks 810 .
- For example, devices 802a or 802b can access content publishers, such as news sites, RSS feeds, web sites, blogs, social networking sites, developer networks, etc.
- Such access can be provided by invocation of a web browsing function or application (e.g., a browser) in response to a user touching, for example, a Web object.
- FIG. 9 is a block diagram illustrating an exemplary device architecture of a device implementing the features and operations described in reference to FIGS. 1-8 .
- Device 900 can include memory interface 902 , one or more data processors, image processors or central processing units 904 , and peripherals interface 906 .
- Memory interface 902 , one or more processors 904 or peripherals interface 906 can be separate components or can be integrated in one or more integrated circuits.
- The various components can be coupled by one or more communication buses or signal lines.
- Sensors, devices, and subsystems can be coupled to peripherals interface 906 to facilitate multiple functionalities.
- For example, motion sensor 910, light sensor 912, and proximity sensor 914 can be coupled to peripherals interface 906 to facilitate orientation, lighting, and proximity functions of the mobile device.
- Light sensor 912 can be utilized to facilitate adjusting the brightness of touch screen 946.
- Motion sensor 910 (e.g., an accelerometer or gyroscope) can be utilized to detect movement and orientation of the device.
- Accordingly, display objects or media can be presented according to a detected orientation, e.g., portrait or landscape.
- Other sensors can also be connected to peripherals interface 906, such as a temperature sensor, a biometric sensor, or other sensing device, to facilitate related functionalities.
- Location processor 915 (e.g., a GPS receiver) can be connected to peripherals interface 906 to provide geopositioning data.
- Electronic magnetometer 916 (e.g., an integrated circuit chip) can also be connected to peripherals interface 906 to provide data that can be used to determine the direction of magnetic North.
- Thus, electronic magnetometer 916 can be used as an electronic compass.
- Camera subsystem 920 and an optical sensor 922, e.g., a charged coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.
- Communication functions can be facilitated through one or more communication subsystems 924 .
- Communication subsystem(s) 924 can include one or more wireless communication subsystems.
- Wireless communication subsystems can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters.
- A wired communication system can include a port device, e.g., a Universal Serial Bus (USB) port or some other wired port connection, that can be used to establish a wired connection to other computing devices, such as other communication devices, network access devices, a personal computer, a printer, a display screen, or other processing devices capable of receiving or transmitting data.
- For example, a mobile device can include communication subsystems 924 designed to operate over a GSM network, a GPRS network, an EDGE network, a WiFi or WiMax network, and a Bluetooth network.
- In particular, device 900 may include wireless communication subsystems designed to operate over a global system for mobile communications (GSM) network, a GPRS network, an enhanced data GSM environment (EDGE) network, 802.x communication networks (e.g., WiFi, WiMax, or 3G networks), code division multiple access (CDMA) networks, and a Bluetooth™ network.
- Communication subsystems 924 may include hosting protocols such that the mobile device 900 may be configured as a base station for other wireless devices.
- The communication subsystems can allow the device to synchronize with a host device using one or more protocols, such as, for example, the TCP/IP protocol, HTTP protocol, UDP protocol, and any other known protocol.
- Audio subsystem 926 can be coupled to a speaker 928 and one or more microphones 930 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
- I/O subsystem 940 can include touch screen controller 942 and/or other input controller(s) 944 .
- Touch-screen controller 942 can be coupled to a touch screen 946 or pad.
- Touch screen 946 and touch screen controller 942 can, for example, detect contact and movement or break thereof using any of a number of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 946 .
- Other input controller(s) 944 can be coupled to other input/control devices 948 , such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus.
- The one or more buttons can include an up/down button for volume control of speaker 928 and/or microphone 930.
- A pressing of the button for a first duration may disengage a lock of touch screen 946, and a pressing of the button for a second duration that is longer than the first duration may turn power to device 900 on or off.
- The user may be able to customize a functionality of one or more of the buttons.
- Touch screen 946 can also be used to implement virtual or soft buttons and/or a keyboard.
- In some implementations, device 900 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files.
- Device 900 can include the functionality of an MP3 player, such as an iPod™.
- Device 900 may, therefore, include a pin connector that is compatible with the iPod.
- Other input/output and control devices can be used.
- Memory interface 902 can be coupled to memory 950 .
- Memory 950 can include high-speed random access memory or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, or flash memory (e.g., NAND, NOR).
- Memory 950 can store operating system 952 , such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks.
- Operating system 952 may include instructions for handling basic system services and for performing hardware dependent tasks.
- For example, operating system 952 can include a kernel (e.g., a UNIX kernel).
- Memory 950 may also store communication instructions 954 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. Communication instructions 954 can also be used to select an operational mode or communication medium for use by the device, based on a geographic location (obtained by the GPS/Navigation instructions 968 ) of the device.
- Memory 950 may include graphical user interface instructions 956 to facilitate graphic user interface processing; sensor processing instructions 958 to facilitate sensor-related processing and functions; phone instructions 960 to facilitate phone-related processes and functions; electronic messaging instructions 962 to facilitate electronic-messaging related processes and functions; web browsing instructions 964 to facilitate web browsing-related processes and functions; media processing instructions 966 to facilitate media processing-related processes and functions; GPS/Navigation instructions 968 to facilitate GPS and navigation-related processes and instructions; camera instructions 970 to facilitate camera-related processes and functions; touch model 972 for interpreting touch and gesture input from raw touch input data to facilitate the processes and features described with reference to FIGS. 1-8 ; and a motion model 974 to interpret device motions from raw motion sensor data to facilitate the processes and features of FIGS. 1-7 .
- Memory 950 may also store other software instructions 976 for facilitating other processes, features, and applications.
- Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. Memory 950 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
- The features described can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or combinations of them.
- The features can be implemented in a computer program product tangibly embodied in an information carrier, e.g., in a machine-readable storage device or in a propagated signal, for execution by a programmable processor; and method steps can be performed by a programmable processor executing a program of instructions to perform functions of the described implementations by operating on input data and generating output.
- The program instructions can be encoded on a propagated signal that is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a programmable processor.
- The described features can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device.
- A computer program is a set of instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result.
- A computer program can be written in any form of programming language (e.g., Objective-C, Java), including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors or cores, of any kind of computer.
- Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both.
- The essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data.
- Generally, a computer will also include, or be operatively coupled to communicate with, one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
- Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
- semiconductor memory devices such as EPROM, EEPROM, and flash memory devices
- magnetic disks such as internal hard disks and removable disks
- magneto-optical disks and CD-ROM and DVD-ROM disks.
- the processor and the memory can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
- ASICs application-specific integrated circuits
- the features can be implemented on a computer having a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the user and a keyboard and a pointing device such as a mouse or a trackball by which the user can provide input to the computer.
- a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the user and a keyboard and a pointing device such as a mouse or a trackball by which the user can provide input to the computer.
- the features can be implemented in a computer system that includes a back-end component, such as a data server, that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them.
- a back-end component such as a data server
- a middleware component such as an application server or an Internet server
- a front-end component such as a client computer having a graphical user interface or an Internet browser, or any combination of them.
- the components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks forming the Internet.
- the computer system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a network.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- An API can define on or more parameters that are passed between a calling application and other software code (e.g., an operating system, library routine, function) that provides a service, that provides data, or that performs an operation or a computation.
- software code e.g., an operating system, library routine, function
- the API can be implemented as one or more calls in program code that send or receive one or more parameters through a parameter list or other structure based on a call convention defined in an API specification document.
- a parameter can be a constant, a key, a data structure, an object, an object class, a variable, a data type, a pointer, an array, a list, or another call.
- API calls and parameters can be implemented in any programming language.
- the programming language can define the vocabulary and calling convention that a programmer will employ to access functions supporting the API.
- an API call can report to an application the capabilities of a device running the application, such as input capability, output capability, processing capability, power capability, communications capability, etc.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A user can make an intuitive, physical gesture with a first device, which can be detected by one or more onboard motion sensors. The detected motion triggers an animation of a displayed object having a "physics metaphor," where the object appears to react to forces in a real world, physical environment. The first device detects the presence of a second device, and a communication link is established allowing a transfer of the data represented by the object to the second device. During the transfer, the first device can animate the object to simulate the object leaving the first device, and the second device can animate the object to simulate the object entering the second device. In some implementations, in response to an intuitive gesture made on a touch sensitive surface of a first device, or made by physically moving the device, an object can be transferred or broadcast to other devices or a network resource based on a direction, velocity or speed of the gesture.
Description
- This disclosure relates generally to communications, and more particularly, to data transfer between devices.
- When an individual performs an action in a real world, physical environment, the individual experiences various physical phenomena that indicate that the task is being performed or has been completed. For example, if an individual pours objects from a first container into a second container, the individual can observe the objects reacting to the forces of friction and gravity. If the objects have different shapes and masses, then the individual will observe different reactions to those forces.
- Conventional personal computers include operating systems that often provide a virtual “desktop” metaphor where users can manipulate and organize various objects. This metaphor is easily understood by users because it is intuitive, and like the “pouring” act described above, relates to their real world, physical environment. Modern computing devices, such as smart phones, often provide a large variety of applications. Some of these applications, however, provide interfaces that lack an equivalent of the “desktop” metaphor and as a result are more difficult to comprehend by the average user.
- A user can make an intuitive, physical gesture with a first device, which can be detected by one or more onboard motion sensors. The detected motion triggers an animation of a displayed object having a "physics metaphor," where the object appears to react to forces in a real world, physical environment. The first device detects the presence of a second device, and a communication link is established allowing a transfer of the data represented by the object to the second device. During the transfer, the first device can animate the object to simulate the object leaving the first device, and the second device can animate the object to simulate the object entering the second device. In some implementations, in response to an intuitive gesture made on a touch sensitive surface of a first device, or made by physically moving the device, an object can be transferred or broadcast to other devices or a network resource based on a direction, velocity or speed of the gesture.
- Particular embodiments of the subject matter described in this specification can be implemented to realize one or more of the following advantages. Users can transfer files and other data between devices using intuitive gestures combined with animation based on physics metaphors. Users can transfer files to a network using intuitive physical gestures. Users can broadcast files and other data to other devices using intuitive interface or physical gestures.
- The details of one or more implementations of user interfaces for mobile device communication are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the disclosed implementations will become apparent from the description, the drawings, and the claims.
- FIGS. 1A-1C illustrate an exemplary intuitive, gesture-based data transfer between two devices using animation based on a physics metaphor.
- FIG. 2 illustrates initiation of an exemplary communications session with a device in response to an interface gesture.
- FIG. 3 illustrates initiation of an exemplary data broadcast from a device to multiple devices in response to a physical gesture.
- FIG. 4 illustrates an exemplary data transfer between two devices in response to intuitive, physical gestures.
- FIG. 5 illustrates an exemplary physical gesture for initiating a communications session with a network.
- FIG. 6 is a flow diagram of an exemplary process for using intuitive, physical gestures to initiate a communications session between devices.
- FIG. 7 is a flow diagram of an exemplary process for using intuitive, interface gestures to initiate a communications session between devices.
- FIG. 8 is a block diagram of an exemplary network operating environment for a device implementing the features and operations described in reference to FIGS. 1-7.
- FIG. 9 is a block diagram illustrating an exemplary device architecture of a device implementing the features and operations described in reference to FIGS. 1-7.
- Like reference symbols in the various drawings indicate like elements.
- FIGS. 1A-1C illustrate an exemplary intuitive, gesture-based communication between two devices using animation based on a physics metaphor. Referring to FIG. 1A, devices 110 and 120 are shown in proximity to each other. For illustrative purposes, device 110 is a handheld device and device 120 is an electronic tablet. Devices 110 and 120 include respective interfaces 112 and 122.
- Interface 112 is shown displaying a collection of graphical objects 114 a-114 f (e.g., file icons) representing files stored on device 110. In some implementations, the user can select one or more files for transfer to one or more devices by placing the files into a transfer state. In the example shown, the user has selected four files for transfer by touching their respective objects 114 a-114 d for a predetermined amount of time and/or using a predetermined amount of pressure during the touch. The user can also select a group of files for transfer by drawing a circle around the icons with a finger or stylus, then using a touch, gesture or other input to select the group of files for transfer. In some implementations, the user can drag and drop individual files onto a container object (e.g., a "suitcase" icon) displayed on interface 112, and then use a touch, gesture or other input to select the container of files for transfer. Other means for selecting individual files or groups of files for transfer are also possible, including but not limited to selecting files through menus or other conventional user interface elements.
- In some implementations, the selected objects 114 a-114 d can be detached from interface 112 and allowed to freely "float" on interface 112. The boundaries of interface 112 can be configured to behave like "bumpers" during device motion, such that floating objects 114 a-114 d bounce off the boundaries of interface 112 while the unselected objects 114 e and 114 f remain fixed to interface 112.
- FIG. 1B illustrates device 110 in motion relative to device 120. In some implementations, device 110 can be equipped with one or more motion sensors (not illustrated) that detect when device 110 is moved. Motion sensors can include but are not limited to accelerometers, gyroscopes and magnetometers. In the example shown, the user is holding device 110 directly over interface 122 and has made a physical gesture with device 110. A physical gesture can be any gesture that moves a device or changes the orientation of a device. Here, the user has rotated device 110 above interface 122 in a manner similar to tipping a glass of water. This angular motion can be detected by one or more onboard motion sensors.
- As shown in FIG. 1B, detached objects 114 a-114 d can be animated to simulate the effect of gravity by "sliding" toward the lowermost portion of interface 112 as device 110 is rotated. The animation of the objects creates the appearance that the objects have mass and are reacting to forces of a real world, physical environment. Selected objects 114 a-114 d, being detached from interface 112, can slide until they touch boundaries 116 c and 116 d of interface 112. Objects 114 e and 114 f, being fixed to interface 112, can remain in their original positions on interface 112.
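To make the gravity animation concrete, the following is a minimal sketch in Python of how detached icons could slide under a projected gravity vector and bounce off interface "bumpers." The field names, tuning constants, and the motion model supplying the gravity projection are assumptions for illustration; none of these identifiers come from the patent.

```python
GRAVITY_GAIN = 600.0  # pixels/s^2 per g; assumed tuning constant
DAMPING = 0.90        # fraction of velocity kept per step; simulates friction

def step_icon(icon, gravity_xy, bounds, dt):
    """Advance one icon by dt seconds under simulated gravity.

    icon: dict with 'x', 'y', 'vx', 'vy', 'detached'
    gravity_xy: (gx, gy) projection of gravity onto the display plane,
        in g units, e.g., derived from accelerometer output
    bounds: (xmin, ymin, xmax, ymax) interface "bumper" boundaries
    """
    if not icon['detached']:
        return  # fixed icons keep their original positions

    gx, gy = gravity_xy
    icon['vx'] = (icon['vx'] + gx * GRAVITY_GAIN * dt) * DAMPING
    icon['vy'] = (icon['vy'] + gy * GRAVITY_GAIN * dt) * DAMPING
    icon['x'] += icon['vx'] * dt
    icon['y'] += icon['vy'] * dt

    xmin, ymin, xmax, ymax = bounds
    # Bumper behavior: clamp to the boundary and bounce with energy loss.
    if icon['x'] < xmin or icon['x'] > xmax:
        icon['x'] = min(max(icon['x'], xmin), xmax)
        icon['vx'] *= -0.5
    if icon['y'] < ymin or icon['y'] > ymax:
        icon['y'] = min(max(icon['y'], ymin), ymax)
        icon['vy'] *= -0.5
```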
- FIG. 1C illustrates devices 110 and 120 after the user has further rotated device 110 such that boundary 116 d of interface 112 is substantially parallel with interface 122. In response to the new orientation of device 110, a graphics engine onboard device 110 animates selected objects 114 a-114 d to simulate the movement of objects 114 a-114 d under the force of gravity and friction. For example, selected objects 114 a-114 d can be animated to slide toward an intersecting corner of boundaries 116 c and 116 d of interface 112. Device 110 can interpret the rotation of device 110 (e.g., a "pouring" action) as an indication of the user's intent to transfer the files represented by selected objects 114 a-114 d.
- Upon determining that the user of device 110 intends to transfer the files represented by selected objects 114 a-114 d, device 110 determines if device 120 is present and available to receive the files. In some implementations, device 110 can use onboard short-range communication technology, such as Bluetooth or Radio Frequency Identification (RFID), to detect the presence of device 120. In the example shown, device 110 has files in a transfer state and detects the presence of device 120. If device 120 is within a predetermined range of device 110, then device 110 can attempt to establish a communications link 130 with device 120. After a link is established and authenticated, device 110 can request that device 120 accept a file transfer. Upon an acknowledgement of acceptance from device 120, device 110 can transfer the files represented by objects 114 a-114 d to device 120 using known communication protocols.
- As the data transfers from device 110 to device 120, icons representative of the transferred data can appear on interface 122 of device 120. For example, icon 114 c can appear on interface 122 and be animated by a graphics engine on device 120 to change in size or appearance (e.g., grow, fill, materialize) as the data represented by object 114 c is received by device 120. As the files represented by the selected objects 114 a-114 d are transferred, device 120 can animate the objects 114 a-114 d on interface 122 so as to appear to react to gravity, friction or drag, momentum, torque, acceleration, centripetal force or any other force found in a real-world, physical environment. For example, transferred files can appear to "drop" onto device 120 at a point directly below device 110 and then spread out onto interface 122 to simulate sand or liquid being poured onto a surface having friction or a viscous drag. The rate at which each object moves on interface 122 can be based on the size or "mass" of the file represented by the object. Larger files that have more "mass" can have their objects animated to move more slowly in interface 122, and smaller files that have less "mass" can have their objects animated to move more quickly in interface 122.
- In some implementations, the object 114 c can be detached from interface 122 so that it appears to "float" on interface 122. The user can accept the data represented by icon 114 c by providing an interface or physical gesture on device 120 or by other input means. Upon detection of the input, object 114 c can be fixed to the interface 122 to visually indicate to the user the acceptance of the data.
- The order of data transfer can be determined by the arrangement of objects 114 a-114 d in interface 112. For example, object 114 c, which is closest to a virtual opening 117 in interface 112, can have its corresponding data transferred first because of its close proximity to virtual opening 117. Objects corresponding to larger files can be animated to move slowly to virtual opening 117 and smaller icons can be animated to move more quickly to virtual opening 117, thus enabling a smaller file to be transferred rather than being bottlenecked by a larger file that can take a long time to transfer.
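As one illustration of that ordering rule, here is a small Python sketch that sorts objects by estimated arrival time at the virtual opening. The field names and the inverse-"mass" speed rule are assumptions for illustration, not taken from the patent.

```python
def transfer_order(objects, opening_xy):
    """Sort objects by estimated arrival time at the virtual opening.

    Each object is a dict with 'x', 'y' (interface position) and
    'size_bytes' (size of the data it represents). Larger files move
    more slowly, so a nearby small file can finish transferring before
    a distant large one starts.
    """
    ox, oy = opening_xy

    def arrival_time(obj):
        distance = ((obj['x'] - ox) ** 2 + (obj['y'] - oy) ** 2) ** 0.5
        speed = 1.0 / max(obj['size_bytes'], 1)  # inverse "mass"
        return distance / speed

    return sorted(objects, key=arrival_time)
```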
- In some implementations, data transfer can be represented by animating objects 114 a-114 d to simulate a variety of real-world physics. For example, as file 119 represented by object 114 c is being transferred, object 114 c can be animated on interface 112 to appear distorted around virtual opening 117 to simulate water going down a drain, sand flowing through an hourglass, or a genie being pulled into a bottle (a.k.a. "the genie effect"). In other examples, the animation can simulate object 114 c dissolving like a tablet in water or dematerializing. Other animations are possible to convey to the user that data are being emptied from device 110 onto interface 122. In some implementations, data transfer can be represented or accompanied by audible feedback, such as the sound of liquid pouring, a tablet fizzing, gas passing through a valve, a sci-fi teleporter, or another sound that audibly represents the transfer of a material from one point to another.
- The speed of animation or the pitch of sounds associated with data transfer can be determined from the speed of the data transfer. For example, data transfers using a high bandwidth communications link 130 can be animated as "pouring" out of device 110 more quickly than a data transfer occurring over a lower bandwidth connection. In some implementations, the speed of data transfer can be at least partly determined by the orientation of device 110. In some implementations, the data transfer rate, and the speed of associated animations, can change based on the orientation or distance of device 110 relative to interface 122. For example, if device 110 is oriented as shown in FIG. 1B, the data transfer rate over communication link 130 can be slower than the data transfer rate if device 110 were oriented as shown in FIG. 1C. In this example, if device 110 is oriented to a substantially upright position (e.g., an orientation opposite to the orientation shown in FIG. 1C), the data transfer will stop.
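A minimal sketch of this "pouring" rate control follows, assuming a tilt angle supplied by an onboard motion model and a linear ramp; both are assumptions for illustration rather than details from the patent.

```python
def transfer_rate(tilt_deg, max_rate_bps, start_deg=30.0):
    """Map device tilt to a data transfer rate.

    tilt_deg: 0 = substantially upright (transfer stops),
              180 = fully inverted ("pour" at full rate).
    start_deg: tilt at which data begins to flow; assumed tuning value.
    """
    if tilt_deg <= start_deg:
        return 0.0  # substantially upright: stop the transfer
    fraction = (tilt_deg - start_deg) / (180.0 - start_deg)
    return max_rate_bps * min(fraction, 1.0)
```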
- In the example of FIGS. 1A-1C, selected objects 114 a-114 d are represented as substantially solid objects, but other representations of the data corresponding to the icons can also be used. For example, in FIG. 1A, as the user selects objects 114 a-114 d, objects 114 a-114 d can be animated to "melt" into a simulated liquid that collects at boundary 116 c of interface 112. Multiple selected icons can then be represented as stratified layers of liquid that can be "poured" out of device 110. In some examples, the volume of a given stratum can be indicative of the amount of data it represents. In some examples, a liquefaction and stratification metaphor can be used to determine the order in which data can be transferred. For example, the first file selected can remain as the bottommost stratum as device 110 is rotated, such that the first selected file "flows" into the bottommost position of interface 122 in FIG. 1C to become the first file to flow out of device 110. In some examples, as the data represented by a stratum is transferred, the thickness of the stratum on interface 112 can shrink to represent the shrinking amount of data that remains to be transferred.
- FIG. 2 illustrates initiation of a communications session with a device in response to an interface gesture. Devices 210, 220 and 230 are located in proximity to each other and can be, for example, devices similar to devices 110 and 120 of FIGS. 1A-1C.
- In the example shown, the user has drawn a graphical object 240 (e.g., a note) on interface 250 of device 210. The user can input a request to transmit data (e.g., copy data) represented by graphical object 240 to device 220 using touch gesture input to interface 250 (hereinafter also referred to as an "interface gesture"). For example, the user can touch graphical object 240 to select it, and then make a "swipe" or "flick" gesture on interface 250 with one or more fingers in the direction of device 220. Device 210 senses the interface gesture input interacting with graphical object 240 and interprets the gesture as a request to transmit data represented by graphical object 240 to another device.
- Before receiving the data transfer request, device 210 can scan the room for the presence of other devices. In this example, devices 220 and 230 are detected. If communication has not been established, device 210 can establish communication with devices 220, 230. In the simplest case, the user of device 210 can manually select one or more devices for data transfer from a list of devices that were detected in the scan (e.g., devices 220, 230). Upon receiving the "swipe" or "flick" gesture requesting data transfer, the data can be transferred to the selected device(s).
- In some implementations, device 210 can request position data from devices 220 and 230. For example, in response to the request, devices 220 and 230 can send their respective position vectors in an inertial reference coordinate frame shared by the devices, such as the well-known Earth Centered Earth Fixed (ECEF) Cartesian coordinate frame. The position vectors can be obtained from positioning systems onboard devices 220, 230. Using the position vectors and inertial measurements from its own onboard motion sensors, device 210 can compute line of sight (LOS) vectors from device 210 to each target device 220, 230 in ECEF coordinates. The LOS vectors can then be transformed into a display coordinate frame for device 210 using coordinate transformations. For example, device 210 can perform the following coordinate transformations for each LOS vector:

$$\vec{L}_{ECEF} = \vec{R}_{T\_ECEF} - \vec{R}_{S\_ECEF} \qquad [1]$$

$$\vec{L}_{Display} = T^{Display}_{Device}\,T^{Device}_{ECEF}\,\vec{L}_{ECEF} \qquad [2]$$

- In equation [1], $\vec{L}_{ECEF}$ is the LOS vector from device 210 to device 220 or 230 in ECEF coordinates, and $\vec{R}_{S\_ECEF}$, $\vec{R}_{T\_ECEF}$ are the position vectors of device 210 and device 220 or 230, respectively, in ECEF coordinates. In equation [2], $\vec{L}_{Display}$ is the LOS vector from device 210 to device 220 or 230 in display coordinates of device 210, $T^{Display}_{Device}$ is a transformation matrix from device coordinates of device 210 to display coordinates of device 210, and $T^{Device}_{ECEF}$ is a transformation matrix from ECEF coordinates to device coordinates of device 210. In this example, the display coordinates of device 210 form a two-dimensional Cartesian coordinate frame in which the display of device 210 is defined in FIG. 2 as an x-y plane. The LOS vectors $\vec{L}_{220}$ and $\vec{L}_{230}$ of devices 220 and 230, respectively, lie in the x-y plane. Additionally, a vector $\vec{G}$, representing the direction of the interface gesture made towards device 220 in display coordinates, is shown in the x-y plane. The vector $\vec{G}$ can be determined in the x-y plane by an onboard touch model based on raw touch sensor data (e.g., capacitive touch data). To determine the target device (in this example, device 220), a dot product can be taken between the $\vec{G}$ vector and each of the LOS vectors $\vec{L}_{220}$, $\vec{L}_{230}$ in the x-y plane. The LOS vector that makes the smallest angle θ with the $\vec{G}$ vector (in this case θ1) determines the device to receive the data transfer, which is given by

$$\theta = \min_{i}\left[\arccos\!\left(\frac{\vec{G}\cdot\vec{L}_{i}}{\lVert\vec{G}\rVert\,\lVert\vec{L}_{i}\rVert}\right)\right] \qquad [3]$$
- In some implementations, either the user can physically point
device 210 at device 220 ordevice 230 to indicate which device will receive the data transfer. In this case, the LOS vectors can be transformed into device coordinates (without transforming into display coordinates) and equation [3] can be applied by replacing the gesture vector {right arrow over (G)} with the device axis that is pointing in the direction of the target device, which in this example is the x-axis shown inFIG. 2 . The LOS vector that provides the smallest angle θ with the {right arrow over (x)} vector can determine the device to receive the data transfer. Accordingly, a user can use equations [1] through [3] to indicate a target device for data transfer using either an interface gesture in the direction of the desiredtarget device 220, 230 or a physical gesture by physically pointingdevice 210 at the desiredtarget device 220, 230. In some implementations, multiple target devices can be selected for a broadcast style data transfer using equations [1] through [3] as described with reference toFIG. 3 . - In some implementations,
- In some implementations, graphical object 240 can be animated in response to the gesture to simulate a physics metaphor. For example, graphical object 240 can be animated to simulate the effects of momentum, friction, viscosity, or other aspects of Newtonian mechanics, such that graphical object 240 can continue to move along its trajectory beyond where the gesture ended. Simulated friction or viscosity can slow the movement of graphical object 240 as it travels along its trajectory.
- In some implementations, the edges of interface 250 may partly resist the motion of graphical object 240 when the two come into contact. For example, the user may have to flick graphical object 240 with a velocity sufficient to overcome a simulated repelling force at edge 253 of interface 250. Some examples of repelling forces include but are not limited to the gravity and friction provided by a speed bump or the wall of a bubble, where an object either overcomes the repelling force by having sufficient speed to roll over the bump or break through the bubble wall, or has insufficient speed and rolls or bounces back. A gesture imparting sufficient velocity or speed to graphical object 240 can indicate an intent to perform a data transfer to another device. A gesture imparting insufficient velocity can result in graphical object 240 rebounding off the edge of interface 250 with no transfer of data. In some examples, this behavior can help device 210 distinguish between gestures intended to reposition graphical object 240 within interface 250 and gestures intended to communicate the data corresponding to graphical object 240 to another device. The speed of the gesture can determine the speed of graphical object 240; faster gestures result in higher velocities than slower gestures.
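The repelling-force test can be reduced to a simple speed threshold at the interface edge. A minimal Python sketch follows; the threshold constant and the labels are assumptions for illustration.

```python
EDGE_ESCAPE_SPEED = 900.0  # pixels/s needed to "break through" the edge

def classify_flick(speed_px_s, crosses_edge):
    """Classify a gesture on a floating object.

    Returns 'transfer' if the object reaches the edge fast enough to
    escape, 'rebound' if it hits the edge too slowly and bounces back,
    and 'reposition' if it never reaches the edge at all.
    """
    if not crosses_edge:
        return 'reposition'
    return 'transfer' if speed_px_s >= EDGE_ESCAPE_SPEED else 'rebound'
```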
- In some implementations, the target devices can initiate an animation that simulates the receipt of data using a physics metaphor. For example, when device 220 starts to receive data from device 210, device 220 can display animated graphical objects on interface 270 representing the data entering device 220. The graphical objects can be detached from interface 270 so that the objects "float." The user can provide an interface gesture or physical gesture to indicate acceptance of the data. Upon the user's acceptance of the data through a gesture or by other means, the floating objects can become fixed to the interface 270 to visually indicate acceptance of the data to the user.
- FIG. 3 illustrates initiation of a data broadcast from a device to multiple devices in response to a physical gesture. Devices 310, 320, 325, 330 are located in proximity to each other. Devices 310, 320, 325, 330 can be, for example, devices similar to devices 110 and 120 of FIGS. 1A-1C. Device 330 can be a computer enabled display device, such as an electronic tablet, computer monitor, projection screen, electronic whiteboard, teleconferencing screen, television, or other type of device that can display information.
- In the example shown, the user has selected graphical object 340 (a file icon) to indicate an intention to perform a data transfer action. Device 310 is also shown in a rotational or sweeping motion due to the user performing a clockwise (or counterclockwise) rotational or sweeping gesture that emulates the toss of a Frisbee®. Motion sensors onboard device 310 sense this physical gesture and interpret it to indicate the user's intent to broadcast the data represented by graphical object 340 to devices 320, 325, 330.
- If communication has not already been established, device 310 establishes communications link 350 (e.g., a bidirectional Bluetooth link) with devices 320, 325, 330 and transmits the data corresponding to graphical object 340 to devices 320, 325, 330. Upon receipt of the transmitted data, devices 320, 325, 330 can display graphical object 340 on their respective displays 360. The graphical object can be detached on the interface or otherwise modified to indicate that the data has not yet been accepted by the user of the device. The users of devices 320, 325, 330 can provide gesture input or other input means to accept the data. Upon acceptance by the user, icon 340 can be fixed to the interface or otherwise modified to indicate that the data has been accepted onto the device.
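One way to recognize the Frisbee-style broadcast gesture is to integrate the gyroscope's rotation rate about the display normal and require a sustained sweep. The following Python sketch is a hypothetical recognizer; the sample format and both thresholds are assumptions, not values from the patent.

```python
SWEEP_MIN_RATE = 2.0   # rad/s; minimum sustained rotation rate (assumed)
SWEEP_MIN_ANGLE = 1.2  # rad; total swept angle that counts as a "toss"

def is_broadcast_gesture(yaw_rates, dt):
    """yaw_rates: gyroscope samples (rad/s) about the display normal,
    taken at a fixed interval dt. Returns True for a sustained sweep."""
    swept = 0.0
    for rate in yaw_rates:
        if abs(rate) >= SWEEP_MIN_RATE:
            swept += abs(rate) * dt
            if swept >= SWEEP_MIN_ANGLE:
                return True
        else:
            swept = 0.0  # the sweep must be continuous
    return False
```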
- FIG. 4 illustrates a data transfer between two devices in response to intuitive, physical gestures. For illustrative purposes, device 410 can be a handheld personal digital assistant and device 420 can be an electronic tablet. Other devices are also possible.
- Device 420 can include display 430 that can display graphical objects representing data stored on device 420. The user has selected object 436 to indicate an intent to perform one or more actions upon the data corresponding to icon 436. In the example shown, the user intends to request that the data corresponding to icon 436 be transferred from device 420 to device 410. The user indicates an intent to transfer data by placing device 410 in position and orientation 440 a relative to device 420, and then moving device 410 across display 430 to position and orientation 440 b. In some implementations, the gesture just described can be a metaphor for the user holding and using device 410 as a scraper or vacuum to "scrape" or "vacuum" data or files off display 430 and onto device 410.
- Device 410 detects the orientation and motion from location 440 a to location 440 b, and interprets the orientation and motion as a physical gesture indicating the user's intent to receive data from device 420. For example, the orientation can be detected by monitoring one or more angles between axes fixed to the device and a local-level, instantaneous coordinate frame determined by, for example, a gravitational acceleration vector computed from the output of an onboard accelerometer and a vector directed North computed from the output of an onboard magnetometer. The presence of device 420 is detected, and if communication is not already established, device 410 can establish a wireless communications link 450 with device 420. Upon establishment of link 450, device 410 can request that device 420 transmit any selected data, such as the data corresponding to selected icon 436. The data can be selected by a user of device 420 as described in reference to FIGS. 1A-1C and 2. Device 420 transmits the data over link 450 to device 410. Graphical object 436 can appear on interface 452 of device 410 to visually indicate the receipt of the selected data on device 410. Graphical object 436 can initially be detached from interface 452 until the user of device 410 provides a gesture input or other input means to accept the data. Upon acceptance, graphical object 436 can become fixed to interface 452. Other visual or audio feedback can also be provided to indicate user acceptance of the data.
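The orientation monitoring described above is essentially the standard tilt-compensated attitude computation from a gravity vector and a magnetic North vector. A minimal Python sketch follows; the axis conventions and the function name are assumptions for illustration.

```python
import math

def orientation(accel, mag):
    """Estimate device attitude from accelerometer and magnetometer.

    accel, mag: (x, y, z) samples in device axes. Returns (roll, pitch,
    heading) in radians relative to a local-level coordinate frame.
    """
    ax, ay, az = accel
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))

    # Tilt-compensate the magnetometer, then take the horizontal heading.
    mx, my, mz = mag
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    heading = math.atan2(-yh, xh)
    return roll, pitch, heading
```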
- FIG. 5 illustrates an example physical gesture for initiating a communications session with a network. The physical gesture is used to indicate that the user wishes to initiate a communication session with a network resource, such as a network server. For example, the user can lift a handheld device skyward in a gesture that symbolizes uplifting a torch. In response, the device initiates the communication session with the network resource.
- In the example shown, device 510 includes interface 512 that displays graphical objects 514 a-514 f (e.g., file icons). Device 510 can be, for example, one of the example devices described above with reference to FIGS. 1A-1C. A user selects one or more objects displayed on device 510, for example, objects 514 b and 514 d. The user moves device 510 from a first position and orientation 520 a to a second position and orientation 520 b. Device 510 uses internal position and orientation sensors to detect this motion and determine that the motion is a gesture indicative of an intent to upload the data represented by objects 514 b and 514 d.
- Device 510 then establishes wireless communications link 530 to network 540. Wireless communications link 530 can be, for example, a cellular, WiFi, WiMax, satellite, or other wireless communications link to network 540, which can be a cellular telephone data network; a private, commercial, or public WiFi or WiMax access network; a satellite network; or other wireless communications network. Once wireless communications link 530 is established, device 510 transmits the data corresponding to selected objects 514 b and 514 d to network 540. -
- In some implementations, a sending device can be held over a receiving device as shown in
FIG. 1C, and a touch sensitive surface (e.g., interface 112) on the sending device can be tapped to cause items to be transferred to the receiving device over a wireless communication link. In this example implementation, each tap can cause one item to transfer. This gesture can be analogized to tapping a Ketchup™ bottle to get the Ketchup™ to flow.
- FIG. 6 is a flow diagram of an example process for using intuitive, physical gestures to initiate a communications session between devices. Process 600 can be performed by one or more devices, for example, one or more of the devices described above with reference to FIGS. 1-5. Therefore, for convenience, process 600 is described with reference to a device that performs process 600.
- The device presents an object on an interface of the device (605). The object can be, for example, an icon or other representation of content. The interface can be a touch sensitive surface that is responsive to gesture inputs. The device then determines whether the device is in motion (610). Examples of device motion can include changes in device position or orientation, such as tilting, shaking, rotating, spinning, shifting, or combinations of these or other motions. If the device is not in motion, then the device continues to present the object on the interface. If the device is in motion, then the device animates the object to simulate real-world physical behavior (615). For example, the device can animate a graphical representation of the content object (e.g., an icon) to make the object appear to slide, ricochet, vibrate, bounce, or perform other reactions to forces based on Newtonian mechanics corresponding to the detected motion.
- The device detects one or more other devices (620). The detection can be accomplished using short-range communication technology such as Bluetooth scanning. The device then determines whether the motion is indicative of a peer user gesture (625). A "peer user" gesture is a gesture that suggests a user's intent to transfer data from a first device to a second device (one to one). The data can be stored by the device or accessible by the device (e.g., stored on another device in communication with the device). If so, the device transmits the data represented by the object to a second device (630).
- If the device determines that the detected motion is not indicative of a peer user gesture, then the device can determine whether the motion is indicative of a broadcast user gesture (640). A "broadcast user gesture" is a gesture that suggests a user's intent to cause a device to transfer data to multiple recipient devices simultaneously (one to many). If so, the device broadcasts the data represented by the object to the multiple recipient devices. If not, the device continues to present the object on the user interface (605).
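Process 600 reduces to a small decision routine. The following Python sketch is a hypothetical skeleton of that flow; the device methods (present, animate_physics, scan_for_devices, classify_gesture, transfer, broadcast) are placeholder names, not an actual API.

```python
def process_600(device, obj, motion):
    if motion is None:
        device.present(obj)                  # 605: keep showing the object
        return
    device.animate_physics(obj, motion)      # 615: physics-metaphor animation
    peers = device.scan_for_devices()        # 620: e.g., Bluetooth scan
    kind = device.classify_gesture(motion)   # 625/640: motion-model output
    if kind == 'peer' and peers:
        device.transfer(obj.data, peers[0])  # 630: one-to-one transfer
    elif kind == 'broadcast' and peers:
        device.broadcast(obj.data, peers)    # one-to-many broadcast
    else:
        device.present(obj)                  # fall back to presenting (605)
```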
- FIG. 7 is a flow diagram of an example process for using intuitive, interface gestures to initiate a communications session between devices. Process 700 can be performed by one or more devices, for example, one or more of the devices described above with reference to FIGS. 1-5. Therefore, for convenience, process 700 is described with reference to a device that performs process 700.
- The device presents an object on the device's interface (710). The device determines whether a user is manipulating the object on the interface (720), for example, using one or more interface gestures such as tapping, clicking, dragging, flicking, pinching, stretching, encircling, rubber banding, or other actions that can be performed to manipulate objects such as icons displayed on a user interface. If the user is not manipulating the content object on the user interface, then the device continues to present the object.
- If the user is manipulating the object, then the device animates the object to simulate real-world physical behavior (730). For example, as the user drags the content object, a simulated momentum can be imparted upon the object, such that the object will initially appear to resist the motion in accordance with Newtonian mechanics. Similarly, the object can continue moving after it has been released, according to Newtonian mechanics. The device can also simulate the effects of friction upon the motion of the object, e.g., to dampen and eventually halt the object's movement. In some implementations, the object can be animated according to a simulated mass that can be dependent upon the size of the data represented by the content object. For example, the icon of a large data file can be animated to respond more slowly to user manipulation and to changes in its simulated momentum, to simulate heaviness.
- The device then detects the presence of other devices (740). In some implementations, properties of the user's manipulation can be combined with information about the device's position and orientation to determine the intent of the user's manipulation of the object. For example, the direction in which the object is swiped or flicked across an interface by the user's finger can be combined with the device's detected orientation to determine the target device for receiving data from several possible target devices detected in step 740.
- The device then determines whether the user's manipulation of the object is indicative of a peer user gesture (750). If so, the device transmits the data represented by the object to the second device (760). Otherwise, the device determines whether the manipulation is indicative of a broadcast user gesture (770); if so, the device broadcasts the data represented by the object so that it can be received by the multiple other devices (780). Otherwise, the device continues to present the object on the user interface (710).
- FIG. 8 is a block diagram of an example network operating environment for a device implementing the features and operations described in reference to FIGS. 1-7. Devices 802 a and 802 b can communicate over one or more wired and/or wireless networks 810 in data communication. For example, wireless network 812, e.g., a cellular network, can communicate with a wide area network (WAN) 814, such as the Internet, by use of gateway 816. Likewise, access device 818, such as an 802.11g wireless access device, can provide communication access to wide area network 814. In some implementations, both voice and data communications can be established over wireless network 812 and access device 818. For example, device 802 a can place and receive phone calls (e.g., using VoIP protocols), send and receive e-mail messages (e.g., using POP3 protocol), and retrieve electronic documents and/or streams, such as web pages, photographs, and videos, over wireless network 812, gateway 816, and wide area network 814 (e.g., using TCP/IP or UDP protocols). Likewise, in some implementations, device 802 b can place and receive phone calls, send and receive e-mail messages, and retrieve electronic documents over access device 818 and wide area network 814. In some implementations, device 802 a or 802 b can be physically connected to access device 818 using one or more cables, and access device 818 can be a personal computer. In this configuration, device 802 a or 802 b can be referred to as a "tethered" device.
- Devices 802 a and 802 b can also establish communications by other means. For example, wireless device 802 a can communicate with other wireless devices, e.g., other devices 802 a or 802 b, cell phones, etc., over wireless network 812. Likewise, devices 802 a and 802 b can establish peer-to-peer communications 820, e.g., a personal area network, by use of one or more communication subsystems, such as a Bluetooth™ communication device. Other communication protocols and topologies can also be implemented.
- Devices 802 a and 802 b can communicate with one or more services over the one or more wired and/or wireless networks 810. These services can include, for example, location services 830, input processing service 840, and animation engine 850. Location services 830 can provide location-based services to devices 802 a and 802 b. Media services 850 can provide online stores for downloading content to devices 802 a and 802 b. Syncing services 860 can provide network-based syncing services for syncing content stored on user devices. Social networking service 870 can provide online communities where users can share content.
- Device 802 a or 802 b can also access other data and content over the one or more wired and/or wireless networks 810. For example, content publishers, such as news sites, RSS feeds, web sites, blogs, social networking sites, developer networks, etc., can be accessed by device 802 a or 802 b.
- FIG. 9 is a block diagram illustrating an exemplary device architecture of a device implementing the features and operations described in reference to FIGS. 1-8. Device 900 can include memory interface 902, one or more data processors, image processors or central processing units 904, and peripherals interface 906. Memory interface 902, one or more processors 904 or peripherals interface 906 can be separate components or can be integrated in one or more integrated circuits. The various components can be coupled by one or more communication buses or signal lines.
- Sensors, devices, and subsystems can be coupled to peripherals interface 906 to facilitate multiple functionalities. For example, motion sensor 910, light sensor 912, and proximity sensor 914 can be coupled to peripherals interface 906 to facilitate orientation, lighting, and proximity functions of the mobile device. For example, in some implementations, light sensor 912 can be utilized to facilitate adjusting the brightness of touch screen 946. In some implementations, motion sensor 910 (e.g., an accelerometer or gyroscope) can be utilized to detect movement and orientation of device 900. Accordingly, display objects or media can be presented according to a detected orientation, e.g., portrait or landscape.
- Other sensors can also be connected to peripherals interface 906, such as a temperature sensor, a biometric sensor, or other sensing device, to facilitate related functionalities.
- Location processor 915 (e.g., a GPS receiver) can be connected to peripherals interface 906 to provide geopositioning. Electronic magnetometer 916 (e.g., an integrated circuit chip) can also be connected to peripherals interface 906 to provide data that can be used to determine the direction of magnetic North. Thus, electronic magnetometer 916 can be used as an electronic compass.
- Camera subsystem 920 and an optical sensor 922, e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.
- Communication functions can be facilitated through one or more communication subsystems 924. Communication subsystem(s) 924 can include one or more wireless communication subsystems. Wireless communication subsystems can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. Wired communication systems can include a port device, e.g., a Universal Serial Bus (USB) port or some other wired port connection that can be used to establish a wired connection to other computing devices, such as other communication devices, network access devices, a personal computer, a printer, a display screen, or other processing devices capable of receiving or transmitting data. The specific design and implementation of the communication subsystem 924 can depend on the communication network(s) or medium(s) over which device 900 is intended to operate. For example, device 900 may include wireless communication subsystems designed to operate over a global system for mobile communications (GSM) network, a GPRS network, an enhanced data GSM environment (EDGE) network, 802.x communication networks (e.g., WiFi, WiMax, or 3G networks), code division multiple access (CDMA) networks, and a Bluetooth™ network. Communication subsystems 924 may include hosting protocols such that device 900 may be configured as a base station for other wireless devices. As another example, the communication subsystems can allow the device to synchronize with a host device using one or more protocols, such as, for example, the TCP/IP protocol, HTTP protocol, UDP protocol, and any other known protocol.
- Audio subsystem 926 can be coupled to a speaker 928 and one or more microphones 930 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
- I/O subsystem 940 can include touch screen controller 942 and/or other input controller(s) 944. Touch-screen controller 942 can be coupled to a touch screen 946 or pad. Touch screen 946 and touch screen controller 942 can, for example, detect contact and movement, or a break thereof, using any of a number of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 946.
- Other input controller(s) 944 can be coupled to other input/control devices 948, such as one or more buttons, rocker switches, a thumb-wheel, an infrared port, a USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of speaker 928 and/or microphone 930.
- In one implementation, a pressing of the button for a first duration may disengage a lock of the touch screen 946; and a pressing of the button for a second duration that is longer than the first duration may turn power to device 900 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 946 can also be used to implement virtual or soft buttons and/or a keyboard.
- In some implementations, device 900 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, device 900 can include the functionality of an MP3 player, such as an iPod™. Device 900 may, therefore, include a pin connector that is compatible with the iPod™. Other input/output and control devices can be used.
- Memory interface 902 can be coupled to memory 950. Memory 950 can include high-speed random access memory or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, or flash memory (e.g., NAND, NOR). Memory 950 can store operating system 952, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. Operating system 952 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, operating system 952 can include a kernel (e.g., a UNIX kernel).
- Memory 950 may also store communication instructions 954 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. Communication instructions 954 can also be used to select an operational mode or communication medium for use by the device, based on a geographic location (obtained by the GPS/Navigation instructions 968) of the device. Memory 950 may include graphical user interface instructions 956 to facilitate graphic user interface processing; sensor processing instructions 958 to facilitate sensor-related processing and functions; phone instructions 960 to facilitate phone-related processes and functions; electronic messaging instructions 962 to facilitate electronic-messaging related processes and functions; web browsing instructions 964 to facilitate web browsing-related processes and functions; media processing instructions 966 to facilitate media processing-related processes and functions; GPS/Navigation instructions 968 to facilitate GPS and navigation-related processes and functions; camera instructions 970 to facilitate camera-related processes and functions; touch model 972 for interpreting touch and gesture input from raw touch input data to facilitate the processes and features described with reference to FIGS. 1-8; and motion model 974 to interpret device motions from raw motion sensor data to facilitate the processes and features of FIGS. 1-7. The memory 950 may also store other software instructions 976 for facilitating other processes, features and applications.
- Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules.
Memory 950 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits. - The features described can be implemented in digital electronic circuitry, in computer hardware, firmware, software, or in combinations of them. The features can be implemented in a computer program product tangibly embodied in an information carrier, e.g., in a machine-readable storage device or in a propagated signal, for execution by a programmable processor; and method steps can be performed by a programmable processor executing a program of instructions to perform functions of the described implementations by operating on input data and generating output. Alternatively or in addition, the program instructions can be encoded on a propagated signal that is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus for execution by a programmable processor.
- The described features can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. A computer program is a set of instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result. A computer program can be written in any form of programming language (e.g., Objective-C, Java), including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors or cores, of any kind of computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data. Generally, a computer will also include, or be operatively coupled to communicate with, one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
- To provide for interaction with a user, the features can be implemented on a computer having a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the user and a keyboard and a pointing device such as a mouse or a trackball by which the user can provide input to the computer.
- The features can be implemented in a computer system that includes a back-end component, such as a data server, that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them. The components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks forming the Internet.
- The computer system can include clients and servers. A client and server are generally remote from each other and typically interact through a network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- One or more features or steps of the disclosed embodiments can be implemented using an Application Programming Interface (API). An API can define on or more parameters that are passed between a calling application and other software code (e.g., an operating system, library routine, function) that provides a service, that provides data, or that performs an operation or a computation.
- The API can be implemented as one or more calls in program code that send or receive one or more parameters through a parameter list or other structure based on a call convention defined in an API specification document. A parameter can be a constant, a key, a data structure, an object, an object class, a variable, a data type, a pointer, an array, a list, or another call. API calls and parameters can be implemented in any programming language. The programming language can define the vocabulary and calling convention that a programmer will employ to access functions supporting the API.
- In some implementations, an API call can report to an application the capabilities of a device running the application, such as input capability, output capability, processing capability, power capability, communications capability, etc.
- A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. For example, elements of one or more implementations may be combined, deleted, modified, or supplemented to form further implementations. As another example, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other implementations are within the scope of the following claims.
Claims (27)
1. A computer-implemented method, comprising:
presenting an object on an interface of a first device, the object representing data stored or accessible by the first device;
detecting motion based on data from sensors onboard the first device;
receiving input selecting the object;
responsive to the input and the detected motion, animating the object on the interface using a physics metaphor, where the animation dynamically changes in response to the detected motion;
detecting a presence of a second device located in proximity to the first device;
determining that the detected motion results from a physical gesture made by a user of the first device, the physical gesture indicating a request to transfer the data to the second device; and
responsive to the determining and to the detected presence of the second device, initiating data transfer to the second device.
2. The method of claim 1 , where presenting an object on an interface further comprises:
presenting an object on a touch sensitive surface.
3. The method of claim 2 , where receiving user input further comprises:
receiving touch input selecting the object through the touch sensitive surface;
determining that the touch input has exceeded a predetermined time or pressure; and
animating the object using the physics metaphor so that the object appears to be detached from the interface and freely moving on the interface in response to motion of the device.
4. The method of claim 3 , where the first or second device is an electronic tablet.
5. The method of claim 1 , where animating the object on the interface further comprises:
animating the object during data transfer to the second device, where the animating is based on the size of the data represented by the object.
6. The method of claim 5 , where an order or speed of data transfer is based on the location of the animated object in the interface or the size of the data being transferred.
7. A computer-implemented method, comprising:
receiving on a first device a request to receive data from a second device proximate to the first device and in communication with the first device;
detecting receipt of data from the second device;
presenting an object on an interface of the first device, the object representing the data received on the first device; and
animating the object on the interface using a physics metaphor.
8. The method of claim 7 , where presenting an object on an interface of the first device further comprises:
presenting an object on a touch sensitive surface of the first device.
9. The method of claim 8 , further comprising:
receiving touch input selecting the object through the touch sensitive surface;
determining that the touch input has exceeded a predetermined time or pressure; and
fixing the object to the interface so that the object cannot move freely in the interface in response to motion of the device.
10. The method of claim 7 , where the first or second device is an electronic tablet.
11. The method of claim 7 , where animating the object on the interface further comprises:
animating the object during data transfer to the first device, where the animating is based on the size of the data represented by the object.
12. The method of claim 11 , where an order or speed of data transfer is based on the location of the animated object in the interface or the size of the data being transferred.
13. A computer-implemented method, comprising:
receiving gesture input selecting an object on a touch sensitive surface of a first device, the object representing data to be transferred to at least one other device;
determining a direction of the gesture on the touch sensitive surface;
receiving position information from one or more devices proximate to the first device;
selecting a target device for receiving data, where the target device is determined based on the position information and the direction of the gesture; and
initiating a transfer of the data to the selected target device.
14. The method of claim 13 , where selecting a target device further comprises:
determining line of sight vectors from the position information;
transforming the line of sight vectors from an inertial coordinate frame to a display coordinate frame associated with the touch sensitive surface;
defining a gesture vector representing the direction of the gesture in the display coordinate frame;
determining an angular separation between the gesture vector and each line of sight vector in the display coordinate frame; and
selecting a target device having a line of sight vector with the smallest angular separation from the gesture vector in the display coordinate frame.
15. A computer-implemented method comprising:
receiving physical gesture input indicating an intent to broadcast data stored or accessible by a device;
determining two or more target devices for receiving the data from the device, where the target devices are located proximate to the device and in communication with the device; and
broadcasting the data to the two or more target devices.
16. The method of claim 15 , where the physical gesture is a clockwise or counterclockwise rotational or sweeping gesture made in the general direction of the target devices by a hand of a user holding the device.
17. A computer-implemented method, comprising:
receiving physical gesture input indicating an intent to send data to, or receive data from, a network resource; and
responsive to the physical gesture, sending data to, or receiving data from, the network resource.
18. A computer-implemented method, comprising:
receiving input through a first interface of a first device, the input requesting data from a second device located proximate to the first device and in communication with the first device, the second device having a second interface displaying an object representing the data requested by the first device;
detecting an orientation and motion of the first device using sensor data output from at least one motion sensor onboard the first device, where the orientation and motion indicate a request to transfer the data from the second device to the first device; and
responsive to the detecting, initiating a transfer of the data from the second device to the first device, where the initiating of the data transfer includes animating the object in the second interface using a physics metaphor, where the object appears to be scraped or vacuumed out of the second interface.
19. The method of claim 18 , where the first or second device is an electronic tablet.
20. A system comprising:
a motion sensor;
a processor;
a computer-readable medium storing instructions which, when executed by the processor, cause the processor to perform operations comprising:
presenting an object on an interface of the system, the object representing data stored or accessible by the system;
detecting motion based on data from the motion sensor;
receiving input selecting the object;
responsive to the input and the detected motion, animating the object on the interface using a physics metaphor, where the animation dynamically changes in response to the detected motion;
detecting a presence of a device located in proximity to the system;
determining that the detected motion results from a physical gesture made by a user of the system, the physical gesture indicating a request to transfer the data to the device; and
responsive to the determining and to the detected presence of the device, initiating data transfer to the device.
21. The system of claim 20 , where receiving user input further comprises:
receiving touch input selecting the object through the interface;
determining that the touch input has exceeded a predetermined time or pressure; and
animating the object using the physics metaphor so that the object appears to be detached from the interface and freely moving on the interface in response to motion of the system.
22. The system of claim 20 , where animating the object on the interface further comprises:
animating the object during data transfer to the device, where the animating is based on the size of the data represented by the object.
23. The system of claim 22 , where an order or speed of data transfer is based on the location of the animated object in the interface or the size of the data being transferred.
24. A system comprising:
a processor;
a computer-readable medium storing instructions which, when executed by the processor, cause the processor to perform operations comprising:
receiving a request to receive data from a device proximate to the system and in communication with the system;
detecting receipt of data from the device;
presenting an object on an interface of the system, the object representing the data received on the system; and
animating the object on the interface using a physics metaphor.
25. The system of claim 24 , further comprising:
receiving touch input selecting the object through the interface;
determining that the touch input has exceeded a predetermined time or pressure; and
fixing the object to the interface so that the object cannot move freely in the interface in response to motion of the system.
26. The system of claim 24 , where animating the object on the interface further comprises:
animating the object during data transfer to the system, where the animating is based on the size of the data represented by the object.
27. The system of claim 26 , where an order or speed of data transfer is based on the location of the animated object in the interface or the size of the data being transferred.
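The sketches below are editorial illustrations of selected claims, not the patented implementation; all function names, thresholds, and data in them are assumptions. First, the control flow recited in claim 1 (present an object, detect motion, animate with a physics metaphor, classify the gesture, initiate transfer), in Python:

```python
import math

FLICK_THRESHOLD = 12.0  # m/s^2; an assumed tuning constant, not from the claims

def read_motion_sensors():
    """Stub standing in for the onboard motion sensors of claim 1."""
    return (14.0, 1.0, 0.5)

def animate_with_physics(obj: str, accel) -> None:
    """Physics metaphor: the selected object moves in response to motion."""
    print(f"{obj} slides under acceleration {accel}")

def is_transfer_gesture(accel) -> bool:
    """Treat a sharp flick (large acceleration magnitude) as a physical
    gesture requesting transfer of the data the object represents."""
    return math.sqrt(sum(a * a for a in accel)) > FLICK_THRESHOLD

def step(selected_obj: str, second_device_present: bool) -> None:
    accel = read_motion_sensors()              # detect motion
    animate_with_physics(selected_obj, accel)  # animation tracks the motion
    if second_device_present and is_transfer_gesture(accel):
        print(f"initiating transfer of {selected_obj}")  # initiate transfer

step("photo-42", second_device_present=True)
```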
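Claim 3's time-or-pressure test can be pictured the same way; the thresholds below are invented, since the claim says only "predetermined":

```python
HOLD_SECONDS = 0.8        # assumed predetermined time
PRESSURE_THRESHOLD = 0.6  # assumed predetermined pressure (normalized 0-1)

def detach_on_touch(duration_s: float, pressure: float) -> bool:
    """Once the touch exceeds the predetermined time or pressure, the
    object is animated as detached and free to move with device motion."""
    return duration_s > HOLD_SECONDS or pressure > PRESSURE_THRESHOLD

print(detach_on_touch(1.2, 0.3))  # True: held long enough
print(detach_on_touch(0.2, 0.9))  # True: pressed hard enough
```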
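The target selection of claims 13 and 14 reduces to an angular comparison once the line-of-sight vectors have been transformed into the display coordinate frame; a sketch with hypothetical device data:

```python
import math

def angle_between(v1, v2) -> float:
    """Angular separation of two 2-D vectors, in radians."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norms = math.hypot(*v1) * math.hypot(*v2)
    return math.acos(max(-1.0, min(1.0, dot / norms)))

def select_target(gesture_vector, candidates):
    """Pick the device whose line-of-sight vector (already in display
    coordinates) has the smallest angular separation from the gesture."""
    return min(candidates, key=lambda c: angle_between(gesture_vector, c[1]))

# (device id, line-of-sight vector in the display coordinate frame)
devices = [("tablet-A", (1.0, 0.1)), ("phone-B", (0.0, 1.0))]
target, _ = select_target((0.9, 0.2), devices)
print(target)  # tablet-A: closest in angle to the swipe direction
```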
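Finally, the rotational sweep of claim 16 could be detected by integrating the gyroscope yaw rate; the quarter-turn trigger below is an assumption:

```python
import math

SWEEP_THRESHOLD = math.pi / 2  # assumed: a quarter turn triggers broadcast

def detect_sweep(yaw_rates, dt: float):
    """Integrate yaw-rate samples (rad/s); a large net rotation in either
    direction reads as a counterclockwise or clockwise sweep gesture."""
    total = sum(yaw_rates) * dt
    if abs(total) > SWEEP_THRESHOLD:
        return "counterclockwise" if total > 0 else "clockwise"
    return None

# 0.5 s of samples at 100 Hz, rotating steadily one way:
print(detect_sweep([4.0] * 50, dt=0.01))  # counterclockwise (2.0 rad net)
```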
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/652,719 US20110163944A1 (en) | 2010-01-05 | 2010-01-05 | Intuitive, gesture-based communications with physics metaphors |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/652,719 US20110163944A1 (en) | 2010-01-05 | 2010-01-05 | Intuitive, gesture-based communications with physics metaphors |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110163944A1 true US20110163944A1 (en) | 2011-07-07 |
Family
ID=44224422
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/652,719 Abandoned US20110163944A1 (en) | 2010-01-05 | 2010-01-05 | Intuitive, gesture-based communications with physics metaphors |
Country Status (1)
Country | Link |
---|---|
US (1) | US20110163944A1 (en) |
Cited By (160)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070269782A1 (en) * | 2006-01-17 | 2007-11-22 | Puente Melinda K | Instructional game program and method |
US20090136016A1 (en) * | 2007-11-08 | 2009-05-28 | Meelik Gornoi | Transferring a communication event |
US20090265225A1 (en) * | 2006-01-17 | 2009-10-22 | Melinda Kathryn Puente | Transfer methods, systems and devices for managing a compliance instruction lifecycle |
US20100146422A1 (en) * | 2008-12-08 | 2010-06-10 | Samsung Electronics Co., Ltd. | Display apparatus and displaying method thereof |
US20110172918A1 (en) * | 2010-01-13 | 2011-07-14 | Qualcomm Incorporated | Motion state detection for mobile device |
US20110181528A1 (en) * | 2010-01-26 | 2011-07-28 | Jay Christopher Capela | Device, Method, and Graphical User Interface for Resizing Objects |
US20110185316A1 (en) * | 2010-01-26 | 2011-07-28 | Elizabeth Gloria Guarino Reid | Device, Method, and Graphical User Interface for Managing User Interface Content and User Interface Elements |
US20110193788A1 (en) * | 2010-02-10 | 2011-08-11 | Apple Inc. | Graphical objects that respond to touch or motion input |
US20110216076A1 (en) * | 2010-03-02 | 2011-09-08 | Samsung Electronics Co., Ltd. | Apparatus and method for providing animation effect in portable terminal |
US20110239114A1 (en) * | 2010-03-24 | 2011-09-29 | David Robbins Falkenburg | Apparatus and Method for Unified Experience Across Different Devices |
US20120078788A1 (en) * | 2010-09-28 | 2012-03-29 | Ebay Inc. | Transactions by flicking |
US20120102400A1 (en) * | 2010-10-22 | 2012-04-26 | Microsoft Corporation | Touch Gesture Notification Dismissal Techniques |
US20120159340A1 (en) * | 2010-12-16 | 2012-06-21 | Bae Jisoo | Mobile terminal and displaying method thereof |
US20120172681A1 (en) * | 2010-12-30 | 2012-07-05 | Stmicroelectronics R&D (Beijing) Co. Ltd | Subject monitor |
US20120206388A1 (en) * | 2011-02-10 | 2012-08-16 | Konica Minolta Business Technologies, Inc. | Image forming apparatus and terminal device each having touch panel |
US20120216153A1 (en) * | 2011-02-22 | 2012-08-23 | Acer Incorporated | Handheld devices, electronic devices, and data transmission methods and computer program products thereof |
US8253684B1 (en) * | 2010-11-02 | 2012-08-28 | Google Inc. | Position and orientation determination for a mobile computing device |
US20120218305A1 (en) * | 2011-02-24 | 2012-08-30 | Google Inc. | Systems and Methods for Manipulating User Annotations in Electronic Books |
US20120272162A1 (en) * | 2010-08-13 | 2012-10-25 | Net Power And Light, Inc. | Methods and systems for virtual experiences |
US20120278727A1 (en) * | 2011-04-29 | 2012-11-01 | Avaya Inc. | Method and apparatus for allowing drag-and-drop operations across the shared borders of adjacent touch screen-equipped devices |
US20120304063A1 (en) * | 2011-05-27 | 2012-11-29 | Cyberlink Corp. | Systems and Methods for Improving Object Detection |
US20130019193A1 (en) * | 2011-07-11 | 2013-01-17 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling content using graphical object |
US20130027341A1 (en) * | 2010-04-16 | 2013-01-31 | Mastandrea Nicholas J | Wearable motion sensing computing interface |
US20130027316A1 (en) * | 2011-07-26 | 2013-01-31 | Motorola Mobility, Inc. | User interface and method for managing a user interface state between a lock state and an unlock state |
US20130050277A1 (en) * | 2011-08-31 | 2013-02-28 | Hon Hai Precision Industry Co., Ltd. | Data transmitting media, data transmitting device, and data receiving device |
US20130052954A1 (en) * | 2011-08-23 | 2013-02-28 | Qualcomm Innovation Center, Inc. | Data transfer between mobile computing devices |
WO2013027077A1 (en) * | 2011-08-24 | 2013-02-28 | Sony Ericsson Mobile Communications Ab | Short-range radio frequency wireless communication data transfer methods and related devices |
US20130083005A1 (en) * | 2011-09-30 | 2013-04-04 | Nokia Corporation | Method and Apparatus for Accessing a Virtual Object |
WO2013049406A1 (en) * | 2011-10-01 | 2013-04-04 | Oracle International Corporation | Moving an object about a display frame by combining classical mechanics of motion |
US20130110974A1 (en) * | 2011-10-31 | 2013-05-02 | Nokia Corporation | Method and apparatus for controlled selection and copying of files to a target device |
US20130125018A1 (en) * | 2010-08-24 | 2013-05-16 | Lg Electronics Inc. | Method for controlling content-sharing, and portable terminal and content-sharing system using same |
US8464184B1 (en) * | 2010-11-30 | 2013-06-11 | Symantec Corporation | Systems and methods for gesture-based distribution of files |
US20130155072A1 (en) * | 2011-12-16 | 2013-06-20 | Fih (Hong Kong) Limited | Electronic device and method for managing files using the electronic device |
US20130167090A1 (en) * | 2011-12-22 | 2013-06-27 | Kyocera Corporation | Device, method, and storage medium storing program |
CN103218155A (en) * | 2012-01-19 | 2013-07-24 | 宏达国际电子股份有限公司 | Operating system and method |
US20130201123A1 (en) * | 2012-02-06 | 2013-08-08 | Lg Electronics Inc. | Mobile terminal and electronic communication method using the same |
WO2013126435A1 (en) * | 2012-02-20 | 2013-08-29 | Microsoft Corporation | Transferring of communication event |
US20130227037A1 (en) * | 2012-02-27 | 2013-08-29 | Damon Kyle Wayans | Method and apparatus for implementing a business card application |
US8539386B2 (en) | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for selecting and moving objects |
US8539385B2 (en) | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for precise positioning of objects |
CN103440095A (en) * | 2013-06-17 | 2013-12-11 | 华为技术有限公司 | File transmission method and terminal |
EP2680113A1 (en) * | 2012-02-20 | 2014-01-01 | Huawei Technologies Co., Ltd. | File data transmission method and device |
US20140013239A1 (en) * | 2011-01-24 | 2014-01-09 | Lg Electronics Inc. | Data sharing between smart devices |
WO2014015221A1 (en) * | 2012-07-19 | 2014-01-23 | Motorola Mobility Llc | Sending and receiving information |
US20140040762A1 (en) * | 2012-08-01 | 2014-02-06 | Google Inc. | Sharing a digital object |
EP2709286A1 (en) * | 2012-09-14 | 2014-03-19 | Samsung Electronics Co., Ltd | Apparatus and Method For Providing Data Transmission/Reception in a Terminal Using Near Field Communication |
US20140092002A1 (en) * | 2012-09-28 | 2014-04-03 | Apple Inc. | Movement Based Image Transformation |
CN103713822A (en) * | 2013-12-27 | 2014-04-09 | 联想(北京)有限公司 | Information processing method and first electronic equipment |
WO2014067843A1 (en) * | 2012-10-31 | 2014-05-08 | Intel Mobile Communications GmbH | Selecting devices for data transactions |
US8726198B2 (en) | 2012-01-23 | 2014-05-13 | Blackberry Limited | Electronic device and method of controlling a display |
EP2735961A1 (en) * | 2012-11-26 | 2014-05-28 | Canon Kabushiki Kaisha | Information processing apparatus which cooperate with another apparatus, and information processing system in which a plurality of information processing apparatus cooperate |
US20140168098A1 (en) * | 2012-12-17 | 2014-06-19 | Nokia Corporation | Apparatus and associated methods |
US8766928B2 (en) | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8782139B2 (en) | 2012-01-16 | 2014-07-15 | International Business Machines Corporation | Transferring applications and session state to a secondary device |
US8780069B2 (en) | 2009-09-25 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
EP2754010A2 (en) * | 2011-09-08 | 2014-07-16 | Samsung Electronics Co., Ltd. | Apparatus and content playback method thereof |
US8799826B2 (en) | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for moving a calendar entry in a calendar application |
US20140223313A1 (en) * | 2013-02-07 | 2014-08-07 | Dizmo Ag | System for organizing and displaying information on a display device |
TWI450102B (en) * | 2011-10-14 | 2014-08-21 | Acer Inc | Data synchronization method and data synchronization system |
US20140253417A1 (en) * | 2013-03-11 | 2014-09-11 | International Business Machines Corporation | Colony desktop hive display: creating an extended desktop display from multiple mobile devices using near-field or other networking |
US20140282728A1 (en) * | 2012-01-26 | 2014-09-18 | Panasonic Corporation | Mobile terminal, television broadcast receiver, and device linkage method |
EP2785083A1 (en) * | 2013-03-28 | 2014-10-01 | NEC Corporation | Improved wireless communication of private data between two terminals |
US20140292720A1 (en) * | 2011-12-15 | 2014-10-02 | Uc Mobile Limited | Method, device, and system of cross-device data transfer |
US8854361B1 (en) | 2013-03-13 | 2014-10-07 | Cambridgesoft Corporation | Visually augmenting a graphical rendering of a chemical structure representation or biological sequence representation with multi-dimensional information |
KR20140120122A (en) * | 2013-04-02 | 2014-10-13 | 엘지전자 주식회사 | Multi screen device and method for controlling the same |
US8863016B2 (en) | 2009-09-22 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
EP2790390A1 (en) * | 2012-03-07 | 2014-10-15 | Huawei Device Co., Ltd. | Data transmission method and device |
EP2797280A1 (en) * | 2013-04-24 | 2014-10-29 | BlackBerry Limited | Device, system and method for generating display data |
WO2014176156A1 (en) * | 2013-04-22 | 2014-10-30 | Google Inc. | Moving content between devices using gestures |
US20140344764A1 (en) * | 2013-05-17 | 2014-11-20 | Barnesandnoble.Com Llc | Shake-based functions on a computing device |
US8914453B2 (en) | 2012-07-18 | 2014-12-16 | Blackberry Limited | Method and apparatus for motion based ping during chat mode |
CN104216506A (en) * | 2013-05-30 | 2014-12-17 | 华为技术有限公司 | Data interaction method and device based on gesture operation |
US20150006669A1 (en) * | 2013-07-01 | 2015-01-01 | Google Inc. | Systems and methods for directing information flow |
JP2015001971A (en) * | 2013-06-18 | 2015-01-05 | 船井電機株式会社 | Server device, content distribution control device, and content distribution system |
US20150022504A1 (en) * | 2013-07-17 | 2015-01-22 | Samsung Electronics Co., Ltd. | Method and device for transmitting/receiving data between wireless terminal and electronic pen |
US20150026723A1 (en) * | 2010-12-10 | 2015-01-22 | Rogers Communications Inc. | Method and device for controlling a video receiver |
US20150033121A1 (en) * | 2013-07-26 | 2015-01-29 | Disney Enterprises, Inc. | Motion based filtering of content elements |
US8972879B2 (en) | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
WO2015030795A1 (en) * | 2013-08-30 | 2015-03-05 | Hewlett Packard Development Company, L.P. | Touch input association |
US20150074253A1 (en) * | 2013-09-09 | 2015-03-12 | Samsung Electronics Co., Ltd. | Computing system with detection mechanism and method of operation thereof |
US9015641B2 (en) | 2011-01-06 | 2015-04-21 | Blackberry Limited | Electronic device and method of providing visual notification of a received communication |
EP2866415A1 (en) * | 2013-10-24 | 2015-04-29 | NEC Corporation | Instant sharing of contents broadcasted over a local network |
US20150127590A1 (en) * | 2013-11-04 | 2015-05-07 | Google Inc. | Systems and methods for layered training in machine-learning architectures |
US9030409B2 (en) | 2013-01-11 | 2015-05-12 | Lg Electronics Inc. | Device for transmitting and receiving data using earphone and method for controlling the same |
US9031493B2 (en) | 2011-11-18 | 2015-05-12 | Google Inc. | Custom narration of electronic books |
US9031977B2 (en) | 2010-05-03 | 2015-05-12 | Perkinelmer Informatics, Inc. | Systems, methods, and apparatus for processing documents to identify structures |
US20150153928A1 (en) * | 2013-12-04 | 2015-06-04 | Autodesk, Inc. | Techniques for interacting with handheld devices |
CN104754128A (en) * | 2015-03-23 | 2015-07-01 | 联想(北京)有限公司 | Information processing method and electronic equipment |
EP2891952A1 (en) * | 2014-01-03 | 2015-07-08 | Harman International Industries, Inc. | Seamless content transfer |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US20150229697A1 (en) * | 2014-02-12 | 2015-08-13 | Dell Products, Lp | System and Method for Transferring Data using a Directional Touch Gesture |
US20150253946A1 (en) * | 2014-03-10 | 2015-09-10 | Verizon Patent And Licensing Inc. | Method and apparatus for transferring files based on user interaction |
US20150261492A1 (en) * | 2014-03-13 | 2015-09-17 | Sony Corporation | Information processing apparatus, information processing method, and information processing system |
US9141404B2 (en) | 2011-10-24 | 2015-09-22 | Google Inc. | Extensible framework for ereader tools |
US20150268843A1 (en) * | 2014-03-24 | 2015-09-24 | Beijing Lenovo Software Ltd. | Information processing method and first electronic device |
US20150269797A1 (en) * | 2014-03-18 | 2015-09-24 | Google Inc. | Proximity-initiated physical mobile device gestures |
US9152632B2 (en) | 2008-08-27 | 2015-10-06 | Perkinelmer Informatics, Inc. | Information management system |
US9172979B2 (en) | 2010-08-12 | 2015-10-27 | Net Power And Light, Inc. | Experience or “sentio” codecs, and methods and systems for improving QoE and encoding based on QoE experiences |
GB2525902A (en) * | 2014-05-08 | 2015-11-11 | Ibm | Mobile device data transfer using location information |
US20150346979A1 (en) * | 2010-01-08 | 2015-12-03 | Sony Corporation | Information processing device and program |
US20150355722A1 (en) * | 2014-04-03 | 2015-12-10 | Futureplay Inc. | Method, Device, System And Non-Transitory Computer-Readable Recording Medium For Providing User Interface |
US9213413B2 (en) | 2013-12-31 | 2015-12-15 | Google Inc. | Device interaction with spatially aware gestures |
US9213421B2 (en) | 2011-02-28 | 2015-12-15 | Blackberry Limited | Electronic device and method of displaying information in response to detecting a gesture |
US20150378594A1 (en) * | 2011-09-10 | 2015-12-31 | Microsoft Technology Licensing, Llc | Progressively Indicating New Content in an Application-Selectable User Interface |
US20160034029A1 (en) * | 2011-12-14 | 2016-02-04 | Kenton M. Lyons | Gaze activated content transfer system |
US20160077714A1 (en) * | 2011-12-05 | 2016-03-17 | Houzz, Inc. | Animated Tags |
US20160119464A1 (en) * | 2014-10-24 | 2016-04-28 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US20160140933A1 (en) * | 2014-04-04 | 2016-05-19 | Empire Technology Development Llc | Relative positioning of devices |
US20160180813A1 (en) * | 2013-07-25 | 2016-06-23 | Wei Zhou | Method and device for displaying objects |
EP2701036A3 (en) * | 2012-08-23 | 2016-07-27 | Samsung Electronics Co., Ltd. | Method of establishing communication link and display devices thereof |
US20160216862A1 (en) * | 2012-04-25 | 2016-07-28 | Amazon Technologies, Inc. | Using gestures to deliver content to predefined destinations |
US9411507B2 (en) | 2012-10-02 | 2016-08-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Synchronized audio feedback for non-visual touch interface system and method |
CN105849710A (en) * | 2013-12-23 | 2016-08-10 | 英特尔公司 | Method for using magnetometer together with gesture to send content to wireless display |
US9423878B2 (en) | 2011-01-06 | 2016-08-23 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9430127B2 (en) | 2013-05-08 | 2016-08-30 | Cambridgesoft Corporation | Systems and methods for providing feedback cues for touch screen interface interaction with chemical and biological structure drawing applications |
US20160261903A1 (en) * | 2015-03-04 | 2016-09-08 | Comcast Cable Communications, Llc | Adaptive remote control |
WO2016144255A1 (en) * | 2015-03-06 | 2016-09-15 | Collaboration Platform Services Pte. Ltd. | Multi-user information sharing system |
US9465440B2 (en) | 2011-01-06 | 2016-10-11 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9471145B2 (en) | 2011-01-06 | 2016-10-18 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9477311B2 (en) | 2011-01-06 | 2016-10-25 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US20160343350A1 (en) * | 2015-05-19 | 2016-11-24 | Microsoft Technology Licensing, Llc | Gesture for task transfer |
US9507495B2 (en) | 2013-04-03 | 2016-11-29 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US20160349982A1 (en) * | 2015-05-26 | 2016-12-01 | Beijing Lenovo Software Ltd. | Information processing method and electronic device |
CN106178546A (en) * | 2016-09-25 | 2016-12-07 | 依云智酷(北京)科技有限公司 | A kind of intelligent toy projecting touch-control |
US9535583B2 (en) | 2012-12-13 | 2017-01-03 | Perkinelmer Informatics, Inc. | Draw-ahead feature for chemical structure drawing applications |
US9557817B2 (en) | 2010-08-13 | 2017-01-31 | Wickr Inc. | Recognizing gesture inputs using distributed processing of sensor data from multiple sensors |
US9639163B2 (en) | 2009-09-14 | 2017-05-02 | Microsoft Technology Licensing, Llc | Content transfer involving a gesture |
US9686346B2 (en) | 2013-04-24 | 2017-06-20 | Blackberry Limited | Device and method for generating data for generating or modifying a display object |
US9690404B2 (en) | 2012-04-17 | 2017-06-27 | Samsung Electronics Co., Ltd. | Method and electronic device for transmitting content |
US9690476B2 (en) | 2013-03-14 | 2017-06-27 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US20170220307A1 (en) * | 2016-02-02 | 2017-08-03 | Samsung Electronics Co., Ltd. | Multi-screen mobile device and operation |
US9751294B2 (en) | 2013-05-09 | 2017-09-05 | Perkinelmer Informatics, Inc. | Systems and methods for translating three dimensional graphic molecular models to computer aided design format |
US9766718B2 (en) | 2011-02-28 | 2017-09-19 | Blackberry Limited | Electronic device and method of displaying information in response to input |
US20170351404A1 (en) * | 2014-12-12 | 2017-12-07 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and apparatus for moving icon, an apparatus and non-volatile computer storage medium |
US20170372678A1 (en) * | 2015-12-15 | 2017-12-28 | Tencent Technology (Shenzhen) Company Limited | Floating window processing method and apparatus |
US9886089B2 (en) | 2013-05-21 | 2018-02-06 | Samsung Electronics Co., Ltd | Method and apparatus for controlling vibration |
US20180046264A1 (en) * | 2016-08-11 | 2018-02-15 | National Taiwan Normal University | Method for transmitting a virtual object between electronic devices |
US20180077547A1 (en) * | 2016-09-15 | 2018-03-15 | Qualcomm Incorporated | Wireless directional sharing based on antenna sectors |
US20180121073A1 (en) * | 2016-10-27 | 2018-05-03 | International Business Machines Corporation | Gesture based smart download |
US9977876B2 (en) | 2012-02-24 | 2018-05-22 | Perkinelmer Informatics, Inc. | Systems, methods, and apparatus for drawing chemical structures using touch and gestures |
US20180152830A1 (en) * | 2016-11-25 | 2018-05-31 | Fujitsu Limited | Information reception terminal and information distribution system |
US20180284849A1 (en) * | 2009-12-22 | 2018-10-04 | Nokia Technologies Oy | Output control using gesture input |
US10097591B2 (en) | 2012-01-26 | 2018-10-09 | Blackberry Limited | Methods and devices to determine a preferred electronic device |
US10163245B2 (en) | 2016-03-25 | 2018-12-25 | Microsoft Technology Licensing, Llc | Multi-mode animation system |
US10254927B2 (en) | 2009-09-25 | 2019-04-09 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US20190171700A1 (en) * | 2017-12-04 | 2019-06-06 | Microsoft Technology Licensing, Llc | Intelligent object movement |
US10345913B2 (en) * | 2014-03-28 | 2019-07-09 | Samsung Electronics Co., Ltd. | Method of interacting with multiple devices |
US10412131B2 (en) | 2013-03-13 | 2019-09-10 | Perkinelmer Informatics, Inc. | Systems and methods for gesture-based sharing of data between separate electronic devices |
US10572545B2 (en) | 2017-03-03 | 2020-02-25 | Perkinelmer Informatics, Inc | Systems and methods for searching and indexing documents comprising chemical information |
US10742729B2 (en) | 2016-07-22 | 2020-08-11 | Tinker Pte. Ltd. | Proximity network for interacting with nearby devices |
WO2021023208A1 (en) * | 2019-08-08 | 2021-02-11 | 华为技术有限公司 | Data sharing method, graphical user interface, related device, and system |
US10969944B2 (en) | 2010-12-23 | 2021-04-06 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US11010972B2 (en) | 2015-12-11 | 2021-05-18 | Google Llc | Context sensitive user interface activation in an augmented and/or virtual reality environment |
CN114115524A (en) * | 2021-10-22 | 2022-03-01 | 青岛海尔科技有限公司 | Interaction method, storage medium and electronic device of smart water cup |
CN114138141A (en) * | 2021-10-29 | 2022-03-04 | 维沃移动通信有限公司 | Display method and device and electronic equipment |
US20220191668A1 (en) * | 2019-09-02 | 2022-06-16 | Huawei Technologies Co., Ltd. | Short-Distance Information Transmission Method and Electronic Device |
WO2022240572A1 (en) * | 2021-05-14 | 2022-11-17 | Microsoft Technology Licensing, Llc | Tilt-responsive techniques for sharing content |
US11556264B1 (en) * | 2021-07-26 | 2023-01-17 | Bank Of America Corporation | Offline data transfer between devices using gestures |
US12131094B2 (en) | 2020-11-25 | 2024-10-29 | Boe Technology Group Co., Ltd. | Screen projection interaction method, screen projection system and terminal device |
Patent Citations (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5063600A (en) * | 1990-05-14 | 1991-11-05 | Norwood Donald D | Hybrid information management system for handwriting and text |
US5548705A (en) * | 1992-04-15 | 1996-08-20 | Xerox Corporation | Wiping metaphor as a user interface for operating on graphical objects on an interactive graphical display |
US5809278A (en) * | 1993-12-28 | 1998-09-15 | Kabushiki Kaisha Toshiba | Circuit for controlling access to a common memory based on priority |
US5429322A (en) * | 1994-04-22 | 1995-07-04 | Hughes Missile Systems Company | Advanced homing guidance system and method |
US5757360A (en) * | 1995-05-03 | 1998-05-26 | Mitsubishi Electric Information Technology Center America, Inc. | Hand held computer control device |
US6111580A (en) * | 1995-09-13 | 2000-08-29 | Kabushiki Kaisha Toshiba | Apparatus and method for controlling an electronic device with user action |
US20080218514A1 (en) * | 1996-08-02 | 2008-09-11 | Sensable Technologies, Inc. | Method and apparatus for generating and interfacing with a haptic virtual reality environment |
US6288705B1 (en) * | 1997-08-23 | 2001-09-11 | Immersion Corporation | Interface device and method for providing indexed cursor control with force feedback |
US6681031B2 (en) * | 1998-08-10 | 2004-01-20 | Cybernet Systems Corporation | Gesture-controlled interfaces for self-service machines and other applications |
US20050017947A1 (en) * | 2000-01-19 | 2005-01-27 | Shahoian Erik J. | Haptic input devices |
US6636246B1 (en) * | 2000-03-17 | 2003-10-21 | Vizible.Com Inc. | Three dimensional spatial user interface |
US20050030255A1 (en) * | 2003-08-07 | 2005-02-10 | Fuji Xerox Co., Ltd. | Peer to peer gesture based modular presentation system |
US7532196B2 (en) * | 2003-10-30 | 2009-05-12 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
US20070198926A1 (en) * | 2004-02-23 | 2007-08-23 | Jazzmutant | Devices and methods of controlling manipulation of virtual objects on a multi-contact tactile screen |
US20050219211A1 (en) * | 2004-03-31 | 2005-10-06 | Kotzin Michael D | Method and apparatus for content management and control |
US7460123B1 (en) * | 2004-05-06 | 2008-12-02 | The Mathworks, Inc. | Dynamic control of graphic representations of data |
US20070146347A1 (en) * | 2005-04-22 | 2007-06-28 | Outland Research, Llc | Flick-gesture interface for handheld computing devices |
US20060256074A1 (en) * | 2005-05-13 | 2006-11-16 | Robert Bosch Gmbh | Sensor-initiated exchange of information between devices |
US20060294247A1 (en) * | 2005-06-24 | 2006-12-28 | Microsoft Corporation | Extending digital artifacts through an interactive surface |
US20070157089A1 (en) * | 2005-12-30 | 2007-07-05 | Van Os Marcel | Portable Electronic Device with Interface Reconfiguration Mode |
US20070192692A1 (en) * | 2006-02-10 | 2007-08-16 | Microsoft Corporation | Method for confirming touch input |
US20080309634A1 (en) * | 2007-01-05 | 2008-12-18 | Apple Inc. | Multi-touch skins spanning three dimensions |
US20080291212A1 (en) * | 2007-05-23 | 2008-11-27 | Dean Robert Gary Anderson As Trustee Of D/L Anderson Family Trust | Software for creating engraved images |
US20090066646A1 (en) * | 2007-09-06 | 2009-03-12 | Samsung Electronics Co., Ltd. | Pointing apparatus, pointer control apparatus, pointing method, and pointer control method |
US20100156812A1 (en) * | 2008-12-22 | 2010-06-24 | Verizon Data Services Llc | Gesture-based delivery from mobile device |
US20110078571A1 (en) * | 2009-09-29 | 2011-03-31 | Monstrous Company | Providing visual responses to musically synchronized touch input |
Cited By (278)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090265225A1 (en) * | 2006-01-17 | 2009-10-22 | Melinda Kathryn Puente | Transfer methods, systems and devices for managing a compliance instruction lifecycle |
US20070269782A1 (en) * | 2006-01-17 | 2007-11-22 | Puente Melinda K | Instructional game program and method |
US20090136016A1 (en) * | 2007-11-08 | 2009-05-28 | Meelik Gornoi | Transferring a communication event |
US9575980B2 (en) | 2008-08-27 | 2017-02-21 | Perkinelmer Informatics, Inc. | Information management system |
US9152632B2 (en) | 2008-08-27 | 2015-10-06 | Perkinelmer Informatics, Inc. | Information management system |
US20100146422A1 (en) * | 2008-12-08 | 2010-06-10 | Samsung Electronics Co., Ltd. | Display apparatus and displaying method thereof |
US9639163B2 (en) | 2009-09-14 | 2017-05-02 | Microsoft Technology Licensing, Llc | Content transfer involving a gesture |
US8863016B2 (en) | 2009-09-22 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11972104B2 (en) | 2009-09-22 | 2024-04-30 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11334229B2 (en) | 2009-09-22 | 2022-05-17 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10788965B2 (en) | 2009-09-22 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10564826B2 (en) | 2009-09-22 | 2020-02-18 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10282070B2 (en) | 2009-09-22 | 2019-05-07 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11366576B2 (en) | 2009-09-25 | 2022-06-21 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US9310907B2 (en) | 2009-09-25 | 2016-04-12 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11947782B2 (en) | 2009-09-25 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8799826B2 (en) | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for moving a calendar entry in a calendar application |
US10254927B2 (en) | 2009-09-25 | 2019-04-09 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8766928B2 (en) | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10928993B2 (en) | 2009-09-25 | 2021-02-23 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8780069B2 (en) | 2009-09-25 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20180284849A1 (en) * | 2009-12-22 | 2018-10-04 | Nokia Technologies Oy | Output control using gesture input |
US20150346979A1 (en) * | 2010-01-08 | 2015-12-03 | Sony Corporation | Information processing device and program |
US10296189B2 (en) * | 2010-01-08 | 2019-05-21 | Sony Corporation | Information processing device and program |
US20110172918A1 (en) * | 2010-01-13 | 2011-07-14 | Qualcomm Incorporated | Motion state detection for mobile device |
US8612884B2 (en) | 2010-01-26 | 2013-12-17 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US20110185316A1 (en) * | 2010-01-26 | 2011-07-28 | Elizabeth Gloria Guarino Reid | Device, Method, and Graphical User Interface for Managing User Interface Content and User Interface Elements |
US8539386B2 (en) | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for selecting and moving objects |
US20110181528A1 (en) * | 2010-01-26 | 2011-07-28 | Jay Christopher Capela | Device, Method, and Graphical User Interface for Resizing Objects |
US8677268B2 (en) | 2010-01-26 | 2014-03-18 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US8683363B2 (en) * | 2010-01-26 | 2014-03-25 | Apple Inc. | Device, method, and graphical user interface for managing user interface content and user interface elements |
US8539385B2 (en) | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for precise positioning of objects |
US20110193788A1 (en) * | 2010-02-10 | 2011-08-11 | Apple Inc. | Graphical objects that respond to touch or motion input |
US8839150B2 (en) * | 2010-02-10 | 2014-09-16 | Apple Inc. | Graphical objects that respond to touch or motion input |
US20110216076A1 (en) * | 2010-03-02 | 2011-09-08 | Samsung Electronics Co., Ltd. | Apparatus and method for providing animation effect in portable terminal |
US20110239114A1 (en) * | 2010-03-24 | 2011-09-29 | David Robbins Falkenburg | Apparatus and Method for Unified Experience Across Different Devices |
US9110505B2 (en) * | 2010-04-16 | 2015-08-18 | Innovative Devices Inc. | Wearable motion sensing computing interface |
US20130027341A1 (en) * | 2010-04-16 | 2013-01-31 | Mastandrea Nicholas J | Wearable motion sensing computing interface |
US9031977B2 (en) | 2010-05-03 | 2015-05-12 | Perkinelmer Informatics, Inc. | Systems, methods, and apparatus for processing documents to identify structures |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US8972879B2 (en) | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
US9626098B2 (en) | 2010-07-30 | 2017-04-18 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9172979B2 (en) | 2010-08-12 | 2015-10-27 | Net Power And Light, Inc. | Experience or “sentio” codecs, and methods and systems for improving QoE and encoding based on QoE experiences |
US9557817B2 (en) | 2010-08-13 | 2017-01-31 | Wickr Inc. | Recognizing gesture inputs using distributed processing of sensor data from multiple sensors |
US20120272162A1 (en) * | 2010-08-13 | 2012-10-25 | Net Power And Light, Inc. | Methods and systems for virtual experiences |
US9535561B2 (en) * | 2010-08-24 | 2017-01-03 | Lg Electronics Inc. | Method for controlling content-sharing, and portable terminal and content-sharing system using same |
US20130125018A1 (en) * | 2010-08-24 | 2013-05-16 | Lg Electronics Inc. | Method for controlling content-sharing, and portable terminal and content-sharing system using same |
US20120078788A1 (en) * | 2010-09-28 | 2012-03-29 | Ebay Inc. | Transactions by flicking |
US10740807B2 (en) | 2010-09-28 | 2020-08-11 | Paypal, Inc. | Systems and methods for transmission of representational image-based offers based on a tactile input |
US20120102400A1 (en) * | 2010-10-22 | 2012-04-26 | Microsoft Corporation | Touch Gesture Notification Dismissal Techniques |
US8648799B1 (en) * | 2010-11-02 | 2014-02-11 | Google Inc. | Position and orientation determination for a mobile computing device |
US8253684B1 (en) * | 2010-11-02 | 2012-08-28 | Google Inc. | Position and orientation determination for a mobile computing device |
US8464184B1 (en) * | 2010-11-30 | 2013-06-11 | Symantec Corporation | Systems and methods for gesture-based distribution of files |
US20150026723A1 (en) * | 2010-12-10 | 2015-01-22 | Rogers Communications Inc. | Method and device for controlling a video receiver |
US20120159340A1 (en) * | 2010-12-16 | 2012-06-21 | Bae Jisoo | Mobile terminal and displaying method thereof |
US10969944B2 (en) | 2010-12-23 | 2021-04-06 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US11126333B2 (en) | 2010-12-23 | 2021-09-21 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US20120172681A1 (en) * | 2010-12-30 | 2012-07-05 | Stmicroelectronics R&D (Beijing) Co. Ltd | Subject monitor |
US10649538B2 (en) | 2011-01-06 | 2020-05-12 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9471145B2 (en) | 2011-01-06 | 2016-10-18 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US10481788B2 (en) | 2011-01-06 | 2019-11-19 | Blackberry Limited | Electronic device and method of providing visual notification of a received communication |
US9684378B2 (en) | 2011-01-06 | 2017-06-20 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US11698723B2 (en) | 2011-01-06 | 2023-07-11 | Blackberry Limited | Electronic device and method of providing visual notification of a received communication |
US10884618B2 (en) | 2011-01-06 | 2021-01-05 | Blackberry Limited | Electronic device and method of providing visual notification of a received communication |
US9465440B2 (en) | 2011-01-06 | 2016-10-11 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US10191556B2 (en) | 2011-01-06 | 2019-01-29 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9015641B2 (en) | 2011-01-06 | 2015-04-21 | Blackberry Limited | Electronic device and method of providing visual notification of a received communication |
US9766802B2 (en) | 2011-01-06 | 2017-09-19 | Blackberry Limited | Electronic device and method of providing visual notification of a received communication |
US9477311B2 (en) | 2011-01-06 | 2016-10-25 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US11379115B2 (en) | 2011-01-06 | 2022-07-05 | Blackberry Limited | Electronic device and method of providing visual notification of a received communication |
US9423878B2 (en) | 2011-01-06 | 2016-08-23 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US20140013239A1 (en) * | 2011-01-24 | 2014-01-09 | Lg Electronics Inc. | Data sharing between smart devices |
US20120206388A1 (en) * | 2011-02-10 | 2012-08-16 | Konica Minolta Business Technologies, Inc. | Image forming apparatus and terminal device each having touch panel |
US9733793B2 (en) * | 2011-02-10 | 2017-08-15 | Konica Minolta, Inc. | Image forming apparatus and terminal device each having touch panel |
US20120216153A1 (en) * | 2011-02-22 | 2012-08-23 | Acer Incorporated | Handheld devices, electronic devices, and data transmission methods and computer program products thereof |
US9645986B2 (en) | 2011-02-24 | 2017-05-09 | Google Inc. | Method, medium, and system for creating an electronic book with an umbrella policy |
US8520025B2 (en) * | 2011-02-24 | 2013-08-27 | Google Inc. | Systems and methods for manipulating user annotations in electronic books |
US8543941B2 (en) | 2011-02-24 | 2013-09-24 | Google Inc. | Electronic book contextual menu systems and methods |
US9501461B2 (en) | 2011-02-24 | 2016-11-22 | Google Inc. | Systems and methods for manipulating user annotations in electronic books |
US9063641B2 (en) | 2011-02-24 | 2015-06-23 | Google Inc. | Systems and methods for remote collaborative studying using electronic books |
US10067922B2 (en) | 2011-02-24 | 2018-09-04 | Google Llc | Automated study guide generation for electronic books |
US20120218305A1 (en) * | 2011-02-24 | 2012-08-30 | Google Inc. | Systems and Methods for Manipulating User Annotations in Electronic Books |
US9213421B2 (en) | 2011-02-28 | 2015-12-15 | Blackberry Limited | Electronic device and method of displaying information in response to detecting a gesture |
US9766718B2 (en) | 2011-02-28 | 2017-09-19 | Blackberry Limited | Electronic device and method of displaying information in response to input |
US20120278727A1 (en) * | 2011-04-29 | 2012-11-01 | Avaya Inc. | Method and apparatus for allowing drag-and-drop operations across the shared borders of adjacent touch screen-equipped devices |
US9367224B2 (en) * | 2011-04-29 | 2016-06-14 | Avaya Inc. | Method and apparatus for allowing drag-and-drop operations across the shared borders of adjacent touch screen-equipped devices |
US20120304063A1 (en) * | 2011-05-27 | 2012-11-29 | Cyberlink Corp. | Systems and Methods for Improving Object Detection |
US8769409B2 (en) * | 2011-05-27 | 2014-07-01 | Cyberlink Corp. | Systems and methods for improving object detection |
US9727225B2 (en) * | 2011-07-11 | 2017-08-08 | Samsung Electronics Co., Ltd | Method and apparatus for controlling content using graphical object |
US20130019193A1 (en) * | 2011-07-11 | 2013-01-17 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling content using graphical object |
US20130027316A1 (en) * | 2011-07-26 | 2013-01-31 | Motorola Mobility, Inc. | User interface and method for managing a user interface state between a lock state and an unlock state |
US20130052954A1 (en) * | 2011-08-23 | 2013-02-28 | Qualcomm Innovation Center, Inc. | Data transfer between mobile computing devices |
WO2013027077A1 (en) * | 2011-08-24 | 2013-02-28 | Sony Ericsson Mobile Communications Ab | Short-range radio frequency wireless communication data transfer methods and related devices |
US20130050277A1 (en) * | 2011-08-31 | 2013-02-28 | Hon Hai Precision Industry Co., Ltd. | Data transmitting media, data transmitting device, and data receiving device |
EP2754010A2 (en) * | 2011-09-08 | 2014-07-16 | Samsung Electronics Co., Ltd. | Apparatus and content playback method thereof |
EP2754010A4 (en) * | 2011-09-08 | 2015-04-08 | Samsung Electronics Co Ltd | Apparatus and content playback method thereof |
US20150378594A1 (en) * | 2011-09-10 | 2015-12-31 | Microsoft Technology Licensing, Llc | Progressively Indicating New Content in an Application-Selectable User Interface |
US10254955B2 (en) * | 2011-09-10 | 2019-04-09 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US9930128B2 (en) * | 2011-09-30 | 2018-03-27 | Nokia Technologies Oy | Method and apparatus for accessing a virtual object |
US20130083005A1 (en) * | 2011-09-30 | 2013-04-04 | Nokia Corporation | Method and Apparatus for Accessing a Virtual Object |
WO2013045763A1 (en) * | 2011-09-30 | 2013-04-04 | Nokia Corporation | Method and apparatus for accessing a virtual object |
US9448633B2 (en) | 2011-10-01 | 2016-09-20 | Oracle International Corporation | Moving a display object within a display frame using a discrete gesture |
US9501150B2 (en) | 2011-10-01 | 2016-11-22 | Oracle International Corporation | Moving an object about a display frame by combining classical mechanics of motion |
WO2013049406A1 (en) * | 2011-10-01 | 2013-04-04 | Oracle International Corporation | Moving an object about a display frame by combining classical mechanics of motion |
TWI450102B (en) * | 2011-10-14 | 2014-08-21 | Acer Inc | Data synchronization method and data synchronization system |
US9141404B2 (en) | 2011-10-24 | 2015-09-22 | Google Inc. | Extensible framework for ereader tools |
US9678634B2 (en) | 2011-10-24 | 2017-06-13 | Google Inc. | Extensible framework for ereader tools |
WO2013064733A2 (en) | 2011-10-31 | 2013-05-10 | Nokia Corporation | Method and apparatus for controlled selection and copying of files to a target device |
US20130110974A1 (en) * | 2011-10-31 | 2013-05-02 | Nokia Corporation | Method and apparatus for controlled selection and copying of files to a target device |
EP2774349A4 (en) * | 2011-10-31 | 2015-05-06 | Nokia Corp | METHOD AND APPARATUS FOR CONTROLLED SELECTION AND COPYING OF FILES TO A TARGET DEVICE |
US9031493B2 (en) | 2011-11-18 | 2015-05-12 | Google Inc. | Custom narration of electronic books |
US20160077714A1 (en) * | 2011-12-05 | 2016-03-17 | Houzz, Inc. | Animated Tags |
US10664892B2 (en) | 2011-12-05 | 2020-05-26 | Houzz, Inc. | Page content display with conditional scroll gesture snapping |
US10657573B2 (en) * | 2011-12-05 | 2020-05-19 | Houzz, Inc. | Network site tag based display of images |
US20160034029A1 (en) * | 2011-12-14 | 2016-02-04 | Kenton M. Lyons | Gaze activated content transfer system |
US9766700B2 (en) * | 2011-12-14 | 2017-09-19 | Intel Corporation | Gaze activated content transfer system |
US20140292720A1 (en) * | 2011-12-15 | 2014-10-02 | Uc Mobile Limited | Method, device, and system of cross-device data transfer |
US9430047B2 (en) * | 2011-12-15 | 2016-08-30 | Uc Mobile Limited | Method, device, and system of cross-device data transfer |
US20130155072A1 (en) * | 2011-12-16 | 2013-06-20 | Fih (Hong Kong) Limited | Electronic device and method for managing files using the electronic device |
US20130167090A1 (en) * | 2011-12-22 | 2013-06-27 | Kyocera Corporation | Device, method, and storage medium storing program |
US8782139B2 (en) | 2012-01-16 | 2014-07-15 | International Business Machines Corporation | Transferring applications and session state to a secondary device |
US8938518B2 (en) | 2012-01-16 | 2015-01-20 | International Business Machines Corporation | Transferring applications and session state to a secondary device |
CN103218155A (en) * | 2012-01-19 | 2013-07-24 | HTC Corporation | Operating system and method
US20130187862A1 (en) * | 2012-01-19 | 2013-07-25 | Cheng-Shiun Jan | Systems and methods for operation activation |
US9619038B2 (en) | 2012-01-23 | 2017-04-11 | Blackberry Limited | Electronic device and method of displaying a cover image and an application image from a low power condition |
US8726198B2 (en) | 2012-01-23 | 2014-05-13 | Blackberry Limited | Electronic device and method of controlling a display |
US9226015B2 (en) * | 2012-01-26 | 2015-12-29 | Panasonic Intellectual Property Management Co., Ltd. | Mobile terminal, television broadcast receiver, and device linkage method |
US20170017458A1 (en) * | 2012-01-26 | 2017-01-19 | Panasonic Intellectual Property Management Co., Ltd. | Mobile terminal, television broadcast receiver, and device linkage method |
US10097591B2 (en) | 2012-01-26 | 2018-10-09 | Blackberry Limited | Methods and devices to determine a preferred electronic device |
US9491501B2 (en) * | 2012-01-26 | 2016-11-08 | Panasonic Intellectual Property Management Co., Ltd. | Mobile terminal, television broadcast receiver, and device linkage method |
US20140282728A1 (en) * | 2012-01-26 | 2014-09-18 | Panasonic Corporation | Mobile terminal, television broadcast receiver, and device linkage method |
CN103297608A (en) * | 2012-02-06 | 2013-09-11 | LG Electronics Inc. | Mobile terminal and electronic communication method using the same
US20130201123A1 (en) * | 2012-02-06 | 2013-08-08 | Lg Electronics Inc. | Mobile terminal and electronic communication method using the same |
EP2680113A1 (en) * | 2012-02-20 | 2014-01-01 | Huawei Technologies Co., Ltd. | File data transmission method and device |
US8850364B2 (en) * | 2012-02-20 | 2014-09-30 | Huawei Technologies Co., Ltd. | Method and device for sending file data |
WO2013126435A1 (en) * | 2012-02-20 | 2013-08-29 | Microsoft Corporation | Transferring of communication event |
EP2680113A4 (en) * | 2012-02-20 | 2014-04-02 | Huawei Tech Co Ltd | File data transmission method and device |
US9977876B2 (en) | 2012-02-24 | 2018-05-22 | Perkinelmer Informatics, Inc. | Systems, methods, and apparatus for drawing chemical structures using touch and gestures |
US20130227037A1 (en) * | 2012-02-27 | 2013-08-29 | Damon Kyle Wayans | Method and apparatus for implementing a business card application |
US20170078334A1 (en) * | 2012-02-27 | 2017-03-16 | Damon Kyle Wayans | Method and Apparatus for Implementing A Business Card Application |
WO2013130597A2 (en) * | 2012-02-27 | 2013-09-06 | Wayans Damon Kyle | Method and apparatus for implementing a business card application |
WO2013130597A3 (en) * | 2012-02-27 | 2015-01-22 | Wayans Damon Kyle | Method and apparatus for implementing a business card application |
US9537901B2 (en) * | 2012-02-27 | 2017-01-03 | Damon Kyle Wayans | Method and apparatus for implementing a business card application |
JP2015510717A (en) * | 2012-03-07 | 2015-04-09 | Huawei Device Co., Ltd. | Data transmission method and apparatus
EP2790390A4 (en) * | 2012-03-07 | 2014-11-19 | Huawei Device Co Ltd | Data transmission method and device |
EP2790390A1 (en) * | 2012-03-07 | 2014-10-15 | Huawei Device Co., Ltd. | Data transmission method and device |
US9690404B2 (en) | 2012-04-17 | 2017-06-27 | Samsung Electronics Co., Ltd. | Method and electronic device for transmitting content |
US20160216862A1 (en) * | 2012-04-25 | 2016-07-28 | Amazon Technologies, Inc. | Using gestures to deliver content to predefined destinations |
US9507512B1 (en) * | 2012-04-25 | 2016-11-29 | Amazon Technologies, Inc. | Using gestures to deliver content to predefined destinations |
US10871893B2 (en) * | 2012-04-25 | 2020-12-22 | Amazon Technologies, Inc. | Using gestures to deliver content to predefined destinations |
US8914453B2 (en) | 2012-07-18 | 2014-12-16 | Blackberry Limited | Method and apparatus for motion based ping during chat mode |
US20140022183A1 (en) * | 2012-07-19 | 2014-01-23 | General Instrument Corporation | Sending and receiving information |
WO2014015221A1 (en) * | 2012-07-19 | 2014-01-23 | Motorola Mobility Llc | Sending and receiving information |
US20140040762A1 (en) * | 2012-08-01 | 2014-02-06 | Google Inc. | Sharing a digital object |
US9990055B2 (en) * | 2012-08-23 | 2018-06-05 | Samsung Electronics Co., Ltd. | Method of establishing communication link and display devices thereof |
EP2701036A3 (en) * | 2012-08-23 | 2016-07-27 | Samsung Electronics Co., Ltd. | Method of establishing communication link and display devices thereof |
US9600695B2 (en) | 2012-09-14 | 2017-03-21 | Samsung Electronics Co., Ltd. | Apparatus and method for providing data transmission/reception in a terminal using near field communication
EP2709286A1 (en) * | 2012-09-14 | 2014-03-19 | Samsung Electronics Co., Ltd | Apparatus and Method For Providing Data Transmission/Reception in a Terminal Using Near Field Communication |
US9354721B2 (en) * | 2012-09-28 | 2016-05-31 | Apple Inc. | Movement based image transformation |
US20140092002A1 (en) * | 2012-09-28 | 2014-04-03 | Apple Inc. | Movement Based Image Transformation |
US9411507B2 (en) | 2012-10-02 | 2016-08-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Synchronized audio feedback for non-visual touch interface system and method |
WO2014067843A1 (en) * | 2012-10-31 | 2014-05-08 | Intel Mobile Communications GmbH | Selecting devices for data transactions |
JP2014123349A (en) * | 2012-11-26 | 2014-07-03 | Canon Inc | Information processing system |
US20140145988A1 (en) * | 2012-11-26 | 2014-05-29 | Canon Kabushiki Kaisha | Information processing apparatus which cooperates with other apparatus, and information processing system in which a plurality of information processing apparatuses cooperates |
US9269331B2 (en) * | 2012-11-26 | 2016-02-23 | Canon Kabushiki Kaisha | Information processing apparatus which cooperates with other apparatus, and information processing system in which a plurality of information processing apparatuses cooperates |
EP2735961A1 (en) * | 2012-11-26 | 2014-05-28 | Canon Kabushiki Kaisha | Information processing apparatus which cooperates with another apparatus, and information processing system in which a plurality of information processing apparatuses cooperate
CN103838541A (en) * | 2012-11-26 | 2014-06-04 | Canon Kabushiki Kaisha | Information processing apparatus and information processing system
US9535583B2 (en) | 2012-12-13 | 2017-01-03 | Perkinelmer Informatics, Inc. | Draw-ahead feature for chemical structure drawing applications |
US20140168098A1 (en) * | 2012-12-17 | 2014-06-19 | Nokia Corporation | Apparatus and associated methods |
US9030409B2 (en) | 2013-01-11 | 2015-05-12 | Lg Electronics Inc. | Device for transmitting and receiving data using earphone and method for controlling the same |
US11675609B2 (en) | 2013-02-07 | 2023-06-13 | Dizmo Ag | System for organizing and displaying information on a display device |
US20140223313A1 (en) * | 2013-02-07 | 2014-08-07 | Dizmo Ag | System for organizing and displaying information on a display device |
US9645718B2 (en) * | 2013-02-07 | 2017-05-09 | Dizmo Ag | System for organizing and displaying information on a display device |
US20140253417A1 (en) * | 2013-03-11 | 2014-09-11 | International Business Machines Corporation | Colony desktop hive display: creating an extended desktop display from multiple mobile devices using near-field or other networking |
US9858031B2 (en) * | 2013-03-11 | 2018-01-02 | International Business Machines Corporation | Colony desktop hive display: creating an extended desktop display from multiple mobile devices using near-field or other networking |
US10412131B2 (en) | 2013-03-13 | 2019-09-10 | Perkinelmer Informatics, Inc. | Systems and methods for gesture-based sharing of data between separate electronic devices |
US8854361B1 (en) | 2013-03-13 | 2014-10-07 | Cambridgesoft Corporation | Visually augmenting a graphical rendering of a chemical structure representation or biological sequence representation with multi-dimensional information |
US11164660B2 (en) | 2013-03-13 | 2021-11-02 | Perkinelmer Informatics, Inc. | Visually augmenting a graphical rendering of a chemical structure representation or biological sequence representation with multi-dimensional information |
US9690476B2 (en) | 2013-03-14 | 2017-06-27 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
EP2785083A1 (en) * | 2013-03-28 | 2014-10-01 | NEC Corporation | Improved wireless communication of private data between two terminals |
KR102112005B1 (en) * | 2013-04-02 | 2020-05-18 | LG Electronics Inc. | Multi screen device and method for controlling the same
KR20140120122A (en) * | 2013-04-02 | 2014-10-13 | LG Electronics Inc. | Multi screen device and method for controlling the same
US9507495B2 (en) | 2013-04-03 | 2016-11-29 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
WO2014176156A1 (en) * | 2013-04-22 | 2014-10-30 | Google Inc. | Moving content between devices using gestures |
US9686346B2 (en) | 2013-04-24 | 2017-06-20 | Blackberry Limited | Device and method for generating data for generating or modifying a display object |
EP2797280A1 (en) * | 2013-04-24 | 2014-10-29 | BlackBerry Limited | Device, system and method for generating display data |
US9430127B2 (en) | 2013-05-08 | 2016-08-30 | Cambridgesoft Corporation | Systems and methods for providing feedback cues for touch screen interface interaction with chemical and biological structure drawing applications |
US9751294B2 (en) | 2013-05-09 | 2017-09-05 | Perkinelmer Informatics, Inc. | Systems and methods for translating three dimensional graphic molecular models to computer aided design format |
US9310890B2 (en) * | 2013-05-17 | 2016-04-12 | Barnes & Noble College Booksellers, Llc | Shake-based functions on a computing device |
US20140344764A1 (en) * | 2013-05-17 | 2014-11-20 | Barnesandnoble.Com Llc | Shake-based functions on a computing device |
US9886089B2 (en) | 2013-05-21 | 2018-02-06 | Samsung Electronics Co., Ltd | Method and apparatus for controlling vibration |
CN104216506A (en) * | 2013-05-30 | 2014-12-17 | Huawei Technologies Co., Ltd. | Data interaction method and device based on gesture operation
CN103440095A (en) * | 2013-06-17 | 2013-12-11 | Huawei Technologies Co., Ltd. | File transmission method and terminal
JP2015001971A (en) * | 2013-06-18 | 2015-01-05 | Funai Electric Co., Ltd. | Server device, content distribution control device, and content distribution system
US20150006669A1 (en) * | 2013-07-01 | 2015-01-01 | Google Inc. | Systems and methods for directing information flow |
US10379631B2 (en) * | 2013-07-17 | 2019-08-13 | Samsung Electronics Co., Ltd. | Method and device for transmitting/receiving data between wireless terminal and electronic pen |
US20150022504A1 (en) * | 2013-07-17 | 2015-01-22 | Samsung Electronics Co., Ltd. | Method and device for transmitting/receiving data between wireless terminal and electronic pen |
US20160180813A1 (en) * | 2013-07-25 | 2016-06-23 | Wei Zhou | Method and device for displaying objects |
US20150033121A1 (en) * | 2013-07-26 | 2015-01-29 | Disney Enterprises, Inc. | Motion based filtering of content elements |
TWI588736B (en) * | 2013-08-30 | 2017-06-21 | 惠普發展公司有限責任合夥企業 | Projective computing system, method to modify touch input in the same, and related non-transitory computer readable storage device |
WO2015030795A1 (en) * | 2013-08-30 | 2015-03-05 | Hewlett Packard Development Company, L.P. | Touch input association |
US10168897B2 (en) | 2013-08-30 | 2019-01-01 | Hewlett-Packard Development Company, L.P. | Touch input association |
US20150074253A1 (en) * | 2013-09-09 | 2015-03-12 | Samsung Electronics Co., Ltd. | Computing system with detection mechanism and method of operation thereof |
US9716991B2 (en) * | 2013-09-09 | 2017-07-25 | Samsung Electronics Co., Ltd. | Computing system with detection mechanism and method of operation thereof |
EP2866415A1 (en) * | 2013-10-24 | 2015-04-29 | NEC Corporation | Instant sharing of contents broadcasted over a local network |
US9286574B2 (en) * | 2013-11-04 | 2016-03-15 | Google Inc. | Systems and methods for layered training in machine-learning architectures |
US20150127590A1 (en) * | 2013-11-04 | 2015-05-07 | Google Inc. | Systems and methods for layered training in machine-learning architectures |
US20150153928A1 (en) * | 2013-12-04 | 2015-06-04 | Autodesk, Inc. | Techniques for interacting with handheld devices |
US11704016B2 (en) * | 2013-12-04 | 2023-07-18 | Autodesk, Inc. | Techniques for interacting with handheld devices |
US9965040B2 (en) | 2013-12-23 | 2018-05-08 | Intel Corporation | Method for using magnetometer together with gesture to send content to wireless display |
EP3087494A4 (en) * | 2013-12-23 | 2017-07-05 | Intel Corporation | Method for using magnetometer together with gesture to send content to wireless display |
CN105849710A (en) * | 2013-12-23 | 2016-08-10 | Intel Corporation | Method for using magnetometer together with gesture to send content to wireless display
US20150185823A1 (en) * | 2013-12-27 | 2015-07-02 | Lenovo (Beijing) Limited | Information processing method and first electronic device |
US10234933B2 (en) * | 2013-12-27 | 2019-03-19 | Beijing Lenovo Software Ltd. | Information processing method and first electronic device |
CN103713822A (en) * | 2013-12-27 | 2014-04-09 | Lenovo (Beijing) Co., Ltd. | Information processing method and first electronic device
US9213413B2 (en) | 2013-12-31 | 2015-12-15 | Google Inc. | Device interaction with spatially aware gestures |
US9671873B2 (en) | 2013-12-31 | 2017-06-06 | Google Inc. | Device interaction with spatially aware gestures |
US10254847B2 (en) | 2013-12-31 | 2019-04-09 | Google Llc | Device interaction with spatially aware gestures |
EP2891952A1 (en) * | 2014-01-03 | 2015-07-08 | Harman International Industries, Inc. | Seamless content transfer |
US10116728B2 (en) * | 2014-02-12 | 2018-10-30 | Dell Products, Lp | System and method for transferring data using a directional touch gesture |
US20150229697A1 (en) * | 2014-02-12 | 2015-08-13 | Dell Products, Lp | System and Method for Transferring Data using a Directional Touch Gesture |
US20150253946A1 (en) * | 2014-03-10 | 2015-09-10 | Verizon Patent And Licensing Inc. | Method and apparatus for transferring files based on user interaction |
US20150261492A1 (en) * | 2014-03-13 | 2015-09-17 | Sony Corporation | Information processing apparatus, information processing method, and information processing system |
US20150269797A1 (en) * | 2014-03-18 | 2015-09-24 | Google Inc. | Proximity-initiated physical mobile device gestures |
US9721411B2 (en) * | 2014-03-18 | 2017-08-01 | Google Inc. | Proximity-initiated physical mobile device gestures |
US20150268843A1 (en) * | 2014-03-24 | 2015-09-24 | Beijing Lenovo Software Ltd. | Information processing method and first electronic device |
US10216392B2 (en) * | 2014-03-24 | 2019-02-26 | Beijing Lenovo Software Ltd. | Information processing method and first electronic device for detecting second electronic device |
US10345913B2 (en) * | 2014-03-28 | 2019-07-09 | Samsung Electronics Co., Ltd. | Method of interacting with multiple devices |
US10175767B2 (en) | 2014-04-03 | 2019-01-08 | Futureplay Inc. | Method, device, system and non-transitory computer-readable recording medium for providing user interface |
US20150355722A1 (en) * | 2014-04-03 | 2015-12-10 | Futureplay Inc. | Method, Device, System And Non-Transitory Computer-Readable Recording Medium For Providing User Interface |
US20160140933A1 (en) * | 2014-04-04 | 2016-05-19 | Empire Technology Development Llc | Relative positioning of devices |
GB2525902A (en) * | 2014-05-08 | 2015-11-11 | Ibm | Mobile device data transfer using location information |
US20160119464A1 (en) * | 2014-10-24 | 2016-04-28 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9826078B2 (en) * | 2014-10-24 | 2017-11-21 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US20170351404A1 (en) * | 2014-12-12 | 2017-12-07 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and apparatus for moving an icon, and non-volatile computer storage medium
US20160261903A1 (en) * | 2015-03-04 | 2016-09-08 | Comcast Cable Communications, Llc | Adaptive remote control |
US11503360B2 (en) * | 2015-03-04 | 2022-11-15 | Comcast Cable Communications, Llc | Adaptive remote control |
WO2016144255A1 (en) * | 2015-03-06 | 2016-09-15 | Collaboration Platform Services Pte. Ltd. | Multi-user information sharing system |
CN104754128A (en) * | 2015-03-23 | 2015-07-01 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device
US20160343350A1 (en) * | 2015-05-19 | 2016-11-24 | Microsoft Technology Licensing, Llc | Gesture for task transfer |
US10102824B2 (en) * | 2015-05-19 | 2018-10-16 | Microsoft Technology Licensing, Llc | Gesture for task transfer |
US20160349982A1 (en) * | 2015-05-26 | 2016-12-01 | Beijing Lenovo Software Ltd. | Information processing method and electronic device |
US10162515B2 (en) * | 2015-05-26 | 2018-12-25 | Beijing Lenovo Software Ltd. | Method and electronic device for controlling display objects on a touch display based on a directional touch operation that both selects and executes a function
US11010972B2 (en) | 2015-12-11 | 2021-05-18 | Google Llc | Context sensitive user interface activation in an augmented and/or virtual reality environment |
US12243171B2 (en) | 2015-12-11 | 2025-03-04 | Google Llc | Context sensitive user interface activation in an augmented and/or virtual reality environment |
US10515610B2 (en) * | 2015-12-15 | 2019-12-24 | Tencent Technology (Shenzhen) Company Limited | Floating window processing method and apparatus |
US20170372678A1 (en) * | 2015-12-15 | 2017-12-28 | Tencent Technology (Shenzhen) Company Limited | Floating window processing method and apparatus |
US20170220307A1 (en) * | 2016-02-02 | 2017-08-03 | Samsung Electronics Co., Ltd. | Multi-screen mobile device and operation |
US10163245B2 (en) | 2016-03-25 | 2018-12-25 | Microsoft Technology Licensing, Llc | Multi-mode animation system |
US11265373B2 (en) | 2016-07-22 | 2022-03-01 | Neeraj Jhanji | Systems and methods to discover and notify devices that come in close proximity with each other |
US11019141B2 (en) | 2016-07-22 | 2021-05-25 | Neeraj Jhanji | Systems and methods to discover and notify devices that come in close proximity with each other |
US11115467B2 (en) | 2016-07-22 | 2021-09-07 | Neeraj Jhanji | Systems and methods to discover and notify devices that come in close proximity with each other |
US10742729B2 (en) | 2016-07-22 | 2020-08-11 | Tinker Pte. Ltd. | Proximity network for interacting with nearby devices |
US10951698B2 (en) | 2016-07-22 | 2021-03-16 | Neeraj Jhanji | Systems and methods to discover and notify devices that come in close proximity with each other |
US10791172B2 (en) | 2016-07-22 | 2020-09-29 | Tinker Pte. Ltd. | Systems and methods for interacting with nearby people and devices |
US20180046264A1 (en) * | 2016-08-11 | 2018-02-15 | National Taiwan Normal University | Method for transmitting a virtual object between electronic devices |
CN107728869A (en) * | 2016-08-11 | 2018-02-23 | 洪荣昭 | Method and electronic system for transmitting virtual object |
US10154388B2 (en) * | 2016-09-15 | 2018-12-11 | Qualcomm Incorporated | Wireless directional sharing based on antenna sectors |
US20180077547A1 (en) * | 2016-09-15 | 2018-03-15 | Qualcomm Incorporated | Wireless directional sharing based on antenna sectors |
CN106178546A (en) * | 2016-09-25 | 2016-12-07 | Yiyun Zhiku (Beijing) Technology Co., Ltd. | Intelligent projection touch-control toy
US11032698B2 (en) * | 2016-10-27 | 2021-06-08 | International Business Machines Corporation | Gesture based smart download |
US20180121073A1 (en) * | 2016-10-27 | 2018-05-03 | International Business Machines Corporation | Gesture based smart download |
US20180152830A1 (en) * | 2016-11-25 | 2018-05-31 | Fujitsu Limited | Information reception terminal and information distribution system |
US10572545B2 (en) | 2017-03-03 | 2020-02-25 | Perkinelmer Informatics, Inc. | Systems and methods for searching and indexing documents comprising chemical information
US20190171700A1 (en) * | 2017-12-04 | 2019-06-06 | Microsoft Technology Licensing, Llc | Intelligent object movement |
US10984179B2 (en) * | 2017-12-04 | 2021-04-20 | Microsoft Technology Licensing, Llc | Intelligent object movement |
WO2021023208A1 (en) * | 2019-08-08 | 2021-02-11 | 华为技术有限公司 | Data sharing method, graphical user interface, related device, and system |
US20220191668A1 (en) * | 2019-09-02 | 2022-06-16 | Huawei Technologies Co., Ltd. | Short-Distance Information Transmission Method and Electronic Device |
US12137399B2 (en) * | 2019-09-02 | 2024-11-05 | Huawei Technologies Co., Ltd. | Short-distance information transmission method and electronic device |
US12131094B2 (en) | 2020-11-25 | 2024-10-29 | Boe Technology Group Co., Ltd. | Screen projection interaction method, screen projection system and terminal device |
US11550404B2 (en) * | 2021-05-14 | 2023-01-10 | Microsoft Technology Licensing, Llc | Tilt-responsive techniques for sharing content |
US20220365606A1 (en) * | 2021-05-14 | 2022-11-17 | Microsoft Technology Licensing, Llc | Tilt-responsive techniques for sharing content |
WO2022240572A1 (en) * | 2021-05-14 | 2022-11-17 | Microsoft Technology Licensing, Llc | Tilt-responsive techniques for sharing content |
US11556264B1 (en) * | 2021-07-26 | 2023-01-17 | Bank Of America Corporation | Offline data transfer between devices using gestures |
US20230024454A1 (en) * | 2021-07-26 | 2023-01-26 | Bank Of America Corporation | Offline data transfer between devices using gestures |
CN114115524A (en) * | 2021-10-22 | 2022-03-01 | Qingdao Haier Technology Co., Ltd. | Interaction method for a smart water cup, storage medium, and electronic device
CN114138141A (en) * | 2021-10-29 | 2022-03-04 | Vivo Mobile Communication Co., Ltd. | Display method and apparatus, and electronic device
Similar Documents
Publication | Title
---|---
US20110163944A1 (en) | Intuitive, gesture-based communications with physics metaphors
US8839150B2 (en) | Graphical objects that respond to touch or motion input
US20180260029A1 (en) | Systems and Methods for Haptic Message Transmission
JP6559881B2 (en) | Device and method for processing touch input based on its strength
EP3042275B1 (en) | Tilting to scroll
US10521068B2 (en) | Portable device and screen displaying method thereof
JP5604594B2 (en) | Method, apparatus and computer program product for grouping content in augmented reality
JP5951781B2 (en) | Multidimensional interface
KR101296052B1 (en) | Animations
US10083617B2 (en) | Portable apparatus and screen displaying method thereof
JP5793426B2 (en) | System and method for interpreting physical interaction with a graphical user interface
US9798926B2 (en) | Dynamic vector map tiles
EP3258361A1 (en) | Mobile terminal using proximity sensor and method of controlling the mobile terminal
KR20130121687A (en) | Direction-conscious information sharing
KR20140095092A (en) | System and method for wirelessly sharing data amongst user devices
WO2011121171A1 (en) | Methods and apparatuses for providing an enhanced user interface
US20140298172A1 (en) | Electronic device and method of displaying playlist thereof
US20140365968A1 (en) | Graphical User Interface Elements
CN109844709B (en) | Method and computerized system for presenting information
CN111475069B (en) | Display method and electronic equipment
EP2566142A1 (en) | Terminal capable of controlling attribute of application based on motion and method thereof
AU2017100482B4 (en) | Devices, methods, and graphical user interfaces for providing haptic feedback
KR20150083266A (en) | Portable apparatus and method for sharing content thereof
CN116492666A (en) | Handle, handle operation method, device, terminal and storage medium
Legal Events
Date | Code | Title | Description |
---|---|---|---
 | AS | Assignment | Owner name: APPLE INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: BILBREY, BRETT; KING, NICHOLAS V.; BENJAMIN, TODD; Reel/Frame: 023840/0919. Effective date: 20100105
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION