US20160364002A1 - Systems and methods for determining emotions based on user gestures - Google Patents
- Publication number
- US20160364002A1 (application US14/735,060)
- Authority
- US
- United States
- Prior art keywords
- user
- emotional state
- data
- gesture
- profile
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G06N99/005—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- A. Technical Field
- The present invention relates to computer systems and, more particularly, to systems, devices, and methods of detecting emotions via gestures by users of computing systems.
- B. Background of the Invention
- As the value and use of information continues to increase, individuals and businesses seek additional ways to process and store information. One option available to users is information handling systems. An information handling system generally processes, compiles, stores, and/or communicates information or data for business, personal, or other purposes thereby allowing users to take advantage of the value of the information. Because technology and information handling needs and requirements vary between different users or applications, information handling systems may also vary regarding what information is handled, how the information is handled, how much information is processed, stored, or communicated, and how quickly and efficiently the information may be processed, stored, or communicated. The variations in information handling systems allow for information handling systems to be general or configured for a specific user or specific use, such as financial transaction processing, airline reservations, enterprise data storage, or global communications. In addition, information handling systems may include a variety of hardware and software components that may be configured to process, store, and communicate information and may include one or more computer systems, data storage systems, and networking systems.
- Some existing information handling systems are capable of performing various types of facial and voice analyses to aid in the detection of a user's emotional condition. Such mood analysis can be used to gain information about the impact of a product on a user of that product, for example, to determine user satisfaction. Other information handling systems apply mood analysis to a piece of software (e.g., gaming or educational software) in an attempt to detect user perception. Mood analysis may be used to detect whether a user perceived the software as performing too slowly. From this it may be inferred that the user is bored, frustrated, etc.
- User gestures have been studied in academic research in the area of authentication (e.g., to distinguish users from each other) but gestures have not been used to estimate emotional conditions of a user, mainly because the use of applications on mobile devices does not lend itself to existing approaches for mood analysis.
- What is needed are systems and methods that overcome the above-mentioned limitations and allow for the detection of the emotional state of a user interacting with a computing system, such that appropriate action can be initiated to achieve a desired outcome.
- Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that this is not intended to limit the scope of the invention to these particular embodiments.
- FIGURE (“FIG.”) 1 is an illustrative system for determining the emotional state of a user.
- FIG. 2 is a flowchart of an illustrative process for determining the emotional state of a user, according to various embodiments of the invention.
- FIG. 3 is a flowchart of an illustrative process for generating and refining user profiles according to various embodiments of the invention.
- FIG. 4 illustrates a process for generating a response based on detecting an emotional state of a user, according to various embodiments of the invention.
- FIG. 5 depicts a simplified block diagram of an information handling system comprising a system for determining the emotional state of a user, according to various embodiments of the present invention.
- In the following description, for purposes of explanation, specific details are set forth in order to provide an understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these details. Furthermore, one skilled in the art will recognize that embodiments of the present invention, described below, may be implemented in a variety of ways, such as a process, an apparatus, a system, a device, or a method on a tangible computer-readable medium.
- Components, or modules, shown in diagrams are illustrative of exemplary embodiments of the invention and are meant to avoid obscuring the invention. It shall also be understood that throughout this discussion that components may be described as separate functional units, which may comprise sub-units, but those skilled in the art will recognize that various components, or portions thereof, may be divided into separate components or may be integrated together, including integrated within a single system or component. It should be noted that functions or operations discussed herein may be implemented as components. Components may be implemented in software, hardware, or a combination thereof.
- Furthermore, connections between components or systems within the figures are not intended to be limited to direct connections. Rather, data between these components may be modified, re-formatted, or otherwise changed by intermediary components. Also, additional or fewer connections may be used. It shall also be noted that the terms “coupled,” “connected,” or “communicatively coupled” shall be understood to include direct connections, indirect connections through one or more intermediary devices, and wireless connections.
- Reference in the specification to “one embodiment,” “preferred embodiment,” “an embodiment,” or “embodiments” means that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment of the invention and may be in more than one embodiment. Also, the appearances of the above-noted phrases in various places in the specification are not necessarily all referring to the same embodiment or embodiments.
- The use of certain terms in various places in the specification is for illustration and should not be construed as limiting. A service, function, or resource is not limited to a single service, function, or resource; usage of these terms may refer to a grouping of related services, functions, or resources, which may be distributed or aggregated. Furthermore, the terms memory, database, information base, data store, tables, hardware, and the like may be used herein to refer to a system component or components into which information may be entered or otherwise recorded.
- Furthermore, it shall be noted that: (1) certain steps may optionally be performed; (2) steps may not be limited to the specific order set forth herein; (3) certain steps may be performed in different orders; and (4) certain steps may be done concurrently.
- In this document, the term “gesture” includes any interaction between a user and a computing device, such as swiping a finger, a facial expression, a pressure, a speed, a location, a time, a device orientation, and an intensity of an action or reaction, whether provided voluntarily or involuntarily.
- FIG. 1 is an illustrative system for determining the emotional state of a user. System 100 comprises one or more sensors 102, data processor 104, analysis module 108, and response module 112. Sensor 102 may be implemented in a pointing device, a keyboard, a touchscreen, or any other device that is coupled to or is part of a computing system. Sensor 102 is any sensor capable of detecting a gesture by users of the computing system. In embodiments, one or more sensors 102 are configured to detect environmental variables, such as room temperature, illumination, and the like.
- In embodiments, sensor 102 comprises an accelerometer to determine a position of the device or a relative speed of the device. Sensor 102 outputs a sensor signal, for example, as a differential analog signal that serves as input into data processor 104. Data processor 104 is any device capable of receiving and converting sensor data (e.g., an analog-to-digital converter). Analysis module 108 is coupled or integrated with data processor 104 to receive and process raw or pre-processed sensor data 106 and output processed data 110 to response generator 112. Response generator 112 generates one or more output signals 114 based on processed data 110. Output signals 114 may be provided to actuators and other external systems, including software applications.
- In operation, sensor 102 detects gesture information, including any change in intensity, speed, or acceleration of a gesture. In embodiments, the gesture is related to the body language or the physical condition of a user that is captured by sensor 102. As used herein, the term “gesture” comprises any combination of a swiping pattern across a touch display; pressure exerted on a touch display or any other part of an electronic device (e.g., finger pressure as measured by the relative finger size generated on a touch screen during the course of a swipe); a swiping speed; a spatial or temporal start or stop location; a verbal or facial expression; a movement of the device; and any other verbal and non-verbal communication, whether voluntarily or involuntarily performed by a user or a device associated with the user.
- In embodiments, detection may also comprise detecting a physiological condition, e.g., measuring the breathing rate of a user. The output signal of sensor 102 may be raw or pre-processed sensor data that may be input to data processor 104.
- In embodiments, data processor 104 converts a format of the sensor signal (e.g., analog sensor data) into a suitable format that can be understood by analysis module 108 (e.g., digital data). Analysis module 108 analyzes sensor data 106 to determine a gesture that allows an inference to be made about an emotional state of a user, including any change or trend. The gesture may be a result of an interaction between the user and a mobile device on which the user performs actions involving a touch display. Actions include tapping on the display to open a software application, swiping across the display to move between screens, and scrolling through screens. When the user is angry or frustrated, and the interactions with the device are performed differently from a “normal” or reference mode, a difference may be detected in the form of a variation in gesture patterns, speed, etc. For example, it may be detected that in some instances the user taps the screen more violently or performs swiping motions more rapidly in comparison to a reference behavior in the normal mode. In embodiments, sensor data 106 includes derivative data, such as the time elapsed between two or more taps or the relative length of a swipe action on a screen.
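- By way of a non-limiting illustration of such derivative data (this sketch and its event format are editorial assumptions, not part of the original disclosure), the elapsed time between taps and the length and speed of a swipe can be computed from time-stamped touch samples as follows:

```python
from math import hypot

def inter_tap_intervals(tap_times):
    """Return the elapsed time between consecutive taps (seconds)."""
    return [t2 - t1 for t1, t2 in zip(tap_times, tap_times[1:])]

def swipe_length_and_speed(samples):
    """samples: list of (t, x, y) touch points belonging to one swipe.

    Returns (path_length, average_speed), two simple derivative features
    of the kind sensor data 106 may include.
    """
    length = sum(hypot(x2 - x1, y2 - y1)
                 for (_, x1, y1), (_, x2, y2) in zip(samples, samples[1:]))
    duration = samples[-1][0] - samples[0][0]
    return length, (length / duration if duration > 0 else 0.0)

# Example: three taps and a short swipe
print(inter_tap_intervals([0.00, 0.31, 0.58]))          # approximately [0.31, 0.27]
print(swipe_length_and_speed([(0.0, 10, 10), (0.1, 40, 50), (0.2, 80, 90)]))
```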
- In embodiments, analysis module 108 compares sensor data 106 to user-specific reference sensor data in order to detect whether a difference exceeds a certain threshold, thereby indicating a change in the emotional state of the user, who caused the difference in sensor data 106 by acting or interacting with the device sufficiently differently to cause sensor 102 to sense the discrepancy. For example, an angry user will handle a phone differently by holding the phone more rigidly, thus exerting more pressure. The user may also tap in a more staccato-like fashion as compared to a calm person.
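- One simple way such a comparison could be realized (a hedged sketch; the patent does not prescribe this particular test) is a normalized deviation check against stored per-user reference statistics:

```python
def exceeds_reference(sample, reference_mean, reference_std, threshold=2.0):
    """Return True if a gesture feature deviates from the user's reference
    behavior by more than `threshold` standard deviations."""
    if reference_std == 0:
        return False
    z = abs(sample - reference_mean) / reference_std
    return z > threshold

# e.g., tap pressure well above this user's calm baseline
print(exceeds_reference(sample=0.92, reference_mean=0.55, reference_std=0.10))  # True
```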
- In embodiments, analysis module 108 associates the gesture with one or more predefined emotional states (angry, frustrated, etc.) and outputs the result as emotion data 110, for example, in the form of a predefined mood level, to response generator 112.
- In embodiments, response generator 112 generates one or more output signals 114 based on the emotion data 110 or the determined emotional state. Output signals 114 may be provided to an application (e.g., on a mobile device), to actuators, and to other external systems for further processing and for generating or executing an appropriate response, which may include one or more actions intended to effect a change in a user's mood (e.g., playing soothing music), reengage a user with a task, and/or adjust a difficulty level of a task. Other possible responses include generating a notification, e.g., via email, display of a dialog box, etc.
- In embodiments, analysis module 108 comprises software that is placed between an application and an operating system of the device on which the user performs actions. The software may be designed to capture user-device interactions and pass gesture information, such as tapping data or a swipe pattern representative of the user's emotional state, to a different software application used by the device.
- In embodiments, analysis module 108, automatically or based on user instructions, initiates a machine learning process to create or update a user model in order to increase the accuracy of subsequent analysis. In embodiments, the learning process comprises a training session that prompts the user to interact with the device in different user environments, such that sensor 102 can collect gesture information in different contexts. The gathered information is used to develop a model of patterns (e.g., tapping or swiping patterns) based on features, such as a typical amount of pressure a user exerts on a touch screen when tapping or swiping across the screen, a direction, a duration of the action, and the like.
- One of skill in the art will appreciate that system 100 comprises internal and/or external storage devices, e.g., to store calibration data. It is understood that data analysis may be performed during regular operation and offline. In embodiments, phases of a training session or data analysis (e.g., learning of a user's general usage pattern) are performed as a background operation.
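- The user model of tapping and swiping patterns described above could, in one possible realization (an editorial assumption, not the claimed implementation), be reduced to per-feature baseline statistics gathered during the training session:

```python
from statistics import mean, pstdev
from collections import defaultdict

def build_user_model(feature_rows):
    """feature_rows: list of dicts such as
    {"tap_pressure": 0.51, "swipe_speed": 310.0, "tap_duration": 0.08}
    collected while the user interacts with the device.

    Returns {feature_name: (mean, std)} as a simple per-user reference model."""
    columns = defaultdict(list)
    for row in feature_rows:
        for name, value in row.items():
            columns[name].append(value)
    return {name: (mean(vals), pstdev(vals)) for name, vals in columns.items()}

model = build_user_model([
    {"tap_pressure": 0.50, "swipe_speed": 300.0},
    {"tap_pressure": 0.55, "swipe_speed": 320.0},
    {"tap_pressure": 0.52, "swipe_speed": 290.0},
])
print(model["tap_pressure"])  # (mean, std) baseline for this user
```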
- FIG. 2 is a flowchart of an illustrative process for determining the emotional state of a user, according to various embodiments of the invention. In embodiments, the process 200 for determining the emotional state of a user begins at step 202 by monitoring an interaction between a user and a computing device to detect one or more gestures. Examples of possible gestures are provided with respect to FIG. 1 above.
- In embodiments, at step 204, it is determined whether the system is in learning mode. If so, a training procedure is initiated and the information gained may be used to create or update a user model. The decision whether to enter learning mode may be automatically controlled or manually set and reset. In embodiments, if it is decided to enter the learning mode, one or more reference gestures are captured, either in a training session, or by background processing during normal operation, or both. When in learning mode, information from the captured sensor data as well as contextual data may be continuously added to update the user model in step 210.
- Upon updating the user model, processing may proceed as usual at step 206, at which sensor data is compared to existing sensor data, e.g., statistically normalized data. Based on the results of the comparison, at step 208, it is determined whether the data indicate a change, for example, a statistically significant deviation that is sufficiently large to indicate an emotional state or a change in the emotional state of a user. If the result of this determination is in the affirmative, then, at step 220, data associated with the detected emotional state of the user is output to execute some action or response prior to process 200 resuming with receiving sensor data at step 202.
- If, however, at step 208 it is determined that the results do not indicate an emotional state or a change in the emotional state of a user, process 200 may directly proceed to step 202.
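- Read as pseudocode, process 200 amounts to a monitoring loop; the following sketch mirrors steps 202 through 220 under the assumption that gesture capture, model updating, and response generation are supplied as callables (none of these function names come from the patent):

```python
def run_emotion_monitor(read_gesture, in_learning_mode, update_model,
                        deviation_from_model, emit_emotional_state,
                        threshold=2.0, max_iterations=100):
    """Illustrative control loop for process 200 (steps 202-220)."""
    for _ in range(max_iterations):
        gesture = read_gesture()                  # step 202: monitor interaction
        if gesture is None:
            break
        if in_learning_mode():                    # step 204: learning mode?
            update_model(gesture)                 # step 210: refine the user model
            continue
        score = deviation_from_model(gesture)     # step 206: compare to reference data
        if score > threshold:                     # step 208: significant change?
            emit_emotional_state(gesture, score)  # step 220: output detected state
```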
- FIG. 3 is a flowchart of an illustrative process for generating and refining a user profile according to various embodiments of the invention. Process 300 begins at step 302 when input data comprising gesture and/or contextual data, such as location data, environmental data, the presence of others, and the like, is gathered for one or more users in preparation for a training session.
- In embodiments, a training session comprises the learning of use patterns that are characteristic for certain emotional states, such as frustration or anger, which are triggered under certain conditions. Emotional states may be triggered by provoking a predetermined user response during a specific training phase. As part of training, for example for purposes of calibration, a user may be prompted to think of or engage in activities that cause the user to experience a particular emotional state. In embodiments, the training session purposefully interjects non-responsive or repeatedly delayed or incorrect responses to user input by the device, application, or computing system, to observe user responses and interactions with the device that enable the detection of gestures that allow for an inference of an associated emotional state. A training or calibration phase may include the application of a test procedure that evaluates cognitive abilities or performance, and the generation of reference gesture data of one or more users.
- In embodiments, machine learning is used to build a model of the behavior of users that may be verified by experimental data. For example, a large number of users may undergo a process designed to elicit various types of emotions that are observed to determine a correlation between users' emotional states and usage of a computing device (e.g., being angry and pressing the device more abruptly). This information may be used in generating a general user model against which individual users' behavior may then be compared in order to confirm that a particular detected device-user interaction, in fact, corresponds to an emotion displayed by that individual.
- Training may be designed to allow a model to differentiate between a particular interaction performed under different conditions, e.g., different locations.
- In embodiments, first, a mood detection module operates in a data collection only mode to capture an initial set of gesture data from one or more users. After a certain amount of data has been collected, at step 304, the mood detection module identifies characteristics that are representative of a user's different emotional states. In embodiments, categories of user activity are evaluated based on different environments in which the activity takes place so as to increase the accuracy of gesture and emotional state recognition. In embodiments, contextual data is used to set varying thresholds to different parameters that are considered in determining a particular user mood. As a result, the particular emotional state (e.g., happy) detected under one set of circumstances may be different from the emotional state detected under another set of circumstances.
- For example, for a user in a work environment, the mood detection module may evaluate a gesture differently than the same gesture detected within the user's home. As another example, in scenarios where a user performs gestures while physically moving or being moved together with a device (e.g., inside a vehicle), the environment may be detected and recognized as providing less reliable data, such that data collected under these circumstances has to undergo additional filtering to suppress noise in the collected data. Alternatively, the less reliable data may be discarded altogether. In short, the evaluation of user activities may be augmented by environmental cues that aid in improving the user model and, ultimately, result in more accurate mood detection.
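- The context-dependent thresholds described above could be represented as a simple lookup keyed by the detected environment; the context names and values below are invented for illustration and are not part of the disclosure:

```python
# Hypothetical per-context thresholds (in standard deviations) and a noise flag.
CONTEXT_THRESHOLDS = {
    "home":    {"threshold": 2.0, "extra_filtering": False},
    "work":    {"threshold": 2.5, "extra_filtering": False},
    "vehicle": {"threshold": 3.5, "extra_filtering": True},   # less reliable data
}

def threshold_for(context):
    """Fall back to a conservative default for unknown environments."""
    return CONTEXT_THRESHOLDS.get(context, {"threshold": 3.0, "extra_filtering": True})

print(threshold_for("work"))
```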
- In embodiments, emotional states may serve as reference emotional states for a user reference model. Generally, reference emotional states may be generated based on collected single and/or multi-user gesture data. One skilled in the art will appreciate that a single user reference model or profile may be used as an initial starting point to train numerous user models or profiles. In embodiments, a factor or parameter derived from contextual data serves as a weighting factor for adjusting an emotional state parameter or a threshold value in the user reference model. Reference user models account for activities in different categories (e.g., pressure, speed, etc.) that may be used to determine an emotional state (e.g., anger).
- In embodiments, the training phase comprises data collected from two or more individual users. Multi-user data can be analyzed to extract common and differing features across users. This information serves to create one or more reference user models against which an individual user's behavior may be compared to detect any deviations and to determine an emotional state for the particular user.
- At step 306, input data comprising gesture and contextual data for a particular user is received, e.g., from a sensor or memory device. It will be appreciated that contextual data is not limited to locations and environmental variables that can be detected, such as room temperature, illumination, etc., but may also include additional factors that may be used to provide context, such as the presence or mood of other persons, weather, detection of a facial expression, etc., that may be included in the learning process to normalize parameters associated with the identified characteristics.
- At step 308, for one or more contexts, a mood detection module clusters the user-specific input data based on characteristics that are associated with particular user emotional states to generate one or more context-dependent user emotional state profiles. The information provided by contextual data allows a context-based distinction to be drawn between the user acting in one environment and the same user acting in a different environment, so as to permit proper recognition of the same mood despite changing ambience.
- In embodiments, each profile may consider, weigh, and correlate different parameters differently to establish a unique user profile that aids in evaluating and interpreting context-based mood parameters. As an example, the time of day may play a role in determining mood at a user's workplace or during travel, but the time of day would not have to be taken into account for determining a certain mood at the user's home, unless it contributes to altering the probability of the presence of the to-be-determined emotional state of the user.
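- The patent does not name a clustering algorithm; as one possible realization of step 308, k-means over per-context gesture feature vectors (scikit-learn is used here purely for illustration) yields context-dependent groupings that can later be labeled as emotional states:

```python
import numpy as np
from sklearn.cluster import KMeans

def build_context_profiles(samples_by_context, n_clusters=2):
    """samples_by_context: {"home": [[pressure, speed], ...], "work": [...]}

    Returns {context: fitted KMeans}, where each cluster is a candidate
    emotional-state grouping for that context (labels would be assigned
    later, e.g., from training-session ground truth)."""
    profiles = {}
    for context, rows in samples_by_context.items():
        X = np.asarray(rows, dtype=float)
        profiles[context] = KMeans(n_clusters=n_clusters, n_init=10,
                                   random_state=0).fit(X)
    return profiles

profiles = build_context_profiles({
    "home": [[0.5, 300], [0.52, 310], [0.9, 480], [0.88, 500]],
    "work": [[0.6, 280], [0.62, 290], [0.95, 520], [0.97, 540]],
})
print(profiles["home"].cluster_centers_)
```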
- It is noted that the initial data collection may result, for example, in the creation of only two clusters (e.g., happy and angry) instead of three clusters (e.g., happy, angry, and sad), due to the user not having been sad when using the device during the data collection period of the initial training phase. Once generated, clusters may be used as inputs for the user model while additional inputs (e.g., swipes and taps) that may aid in the determination of the user's mood are being gathered.
- Finally, at step 310, clusters and, thus, user emotional state profiles are refined as more data is collected. For example, while, at first, a hard tap may be categorized as indicating anger, over time, a mood detection program may learn that hard taps merely indicate impatience, whereas relatively harder taps, in fact, indicate anger. In this manner, the user model is trained to recognize that relatively hard taps are required to indicate anger. In embodiments, machine learning is used to continuously update and improve user emotional state profiles.
- It is noted that a user emotional state profile may be updated based on a profile that is assigned to another user. It is understood that user profiles may or may not be related to each other and need not necessarily involve common emotional states or parameters.
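- Refining a profile as more data arrives can be done with a running update of the reference statistics; the sketch below uses Welford's online algorithm (a standard technique chosen for illustration, not the claimed method) to maintain a tap-intensity baseline from which a revised anger cutoff could be derived:

```python
class RunningStat:
    """Welford's online mean/variance; used to refine a feature baseline
    (e.g., tap intensity) as additional gesture samples are collected."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def std(self):
        return (self.m2 / self.n) ** 0.5 if self.n > 1 else 0.0

tap_intensity = RunningStat()
for value in [0.50, 0.53, 0.49, 0.80, 0.51]:   # one hard tap at 0.80
    tap_intensity.update(value)
# A refined "anger" cutoff could then be derived as, e.g., mean + 2 * std.
print(tap_intensity.mean, tap_intensity.std)
```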
- FIG. 4 illustrates a process for generating a response based on detecting an emotional state of a user, according to various embodiments of the invention. Process 400 for generating a response begins at step 402 when input data comprising gesture data and/or contextual data is received, for example, from a sensor or memory device.
- At step 404, the input data is applied to a trained user emotional state profile to generate, select, or modify one or more emotional state parameters for a user from which an emotional state of the individual user can be determined based on, for example, a correlation between the emotional state parameters and the contextual data.
- At step 406, a detected emotional state of the user is output.
- Finally, at step 408, based on the detected emotional state, one or more of the actions previously described are taken.
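- Taken together, steps 402 through 408 can be sketched as a small inference routine; the state names, reference vectors, and distance-based scoring below are illustrative assumptions rather than the claimed implementation:

```python
from math import dist

# Hypothetical reference feature vectors per context and emotional state.
PROFILE = {
    "home": {"calm": [0.5, 300.0], "angry": [0.9, 500.0]},
    "work": {"calm": [0.6, 280.0], "angry": [0.95, 530.0]},
}

RESPONSES = {"angry": "play soothing music", "calm": "no action"}

def detect_and_respond(features, context):
    """Steps 404-408: apply the trained profile, output the detected
    emotional state, and select a response."""
    states = PROFILE.get(context, PROFILE["home"])
    state = min(states, key=lambda s: dist(features, states[s]))  # step 404
    return state, RESPONSES[state]                                # steps 406-408

print(detect_and_respond([0.88, 510.0], "home"))  # ('angry', 'play soothing music')
```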
- FIG. 5 depicts a simplified block diagram of an information handling system comprising a system for determining the emotional state of a user, according to various embodiments of the present invention. It will be understood that the functionalities shown for system 500 may operate to support various embodiments of an information handling system, although it shall be understood that an information handling system may be differently configured and include different components. As illustrated in FIG. 5, system 500 includes a central processing unit (CPU) 501 that provides computing resources and controls the computer. CPU 501 may be implemented with a microprocessor or the like, and may also include a graphics processor and/or a floating point coprocessor for mathematical computations. System 500 may also include a system memory 502, which may be in the form of random-access memory (RAM) and read-only memory (ROM).
- A number of controllers and peripheral devices may also be provided, as shown in FIG. 5. An input controller 503 represents an interface to various input device(s) 504, such as a keyboard, touch display, mouse, or stylus. There may also be a scanner controller 505, which communicates with a scanner 506. System 500 may also include a storage controller 507 for interfacing with one or more storage devices 508, each of which includes a storage medium such as magnetic tape or disk, or an optical medium that might be used to record programs of instructions for operating systems, utilities, and applications, which may include embodiments of programs that implement various aspects of the present invention. Storage device(s) 508 may also be used to store processed data or data to be processed in accordance with the invention. System 500 may also include a display controller 509 for providing an interface to a display device 511, which may be a cathode ray tube (CRT), a thin film transistor (TFT) display, or other type of display. The computing system 500 may also include a printer controller 512 for communicating with a printer 513. A communications controller 514 may interface with one or more communication devices 515, which enables system 500 to connect to remote devices through any of a variety of networks including the Internet, an Ethernet cloud, an FCoE/DCB cloud, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), or through any suitable electromagnetic carrier signals including infrared signals.
- In the illustrated system, all major system components may connect to a bus 516, which may represent more than one physical bus. However, various system components may or may not be in physical proximity to one another. For example, input data and/or output data may be remotely transmitted from one physical location to another. In addition, programs that implement various aspects of this invention may be accessed from a remote location (e.g., a server) over a network. Such data and/or programs may be conveyed through any of a variety of machine-readable media including, but not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, and ROM and RAM devices.
- Embodiments of the present invention may be encoded upon one or more non-transitory computer-readable media with instructions for one or more processors or processing units to cause steps to be performed. It shall be noted that the one or more non-transitory computer-readable media shall include volatile and non-volatile memory. It shall be noted that alternative implementations are possible, including a hardware implementation or a software/hardware implementation. Hardware-implemented functions may be realized using ASIC(s), programmable arrays, digital signal processing circuitry, or the like. Accordingly, the “means” terms in any claims are intended to cover both software and hardware implementations. Similarly, the term “computer-readable medium or media” as used herein includes software and/or hardware having a program of instructions embodied thereon, or a combination thereof. With these implementation alternatives in mind, it is to be understood that the figures and accompanying description provide the functional information one skilled in the art would require to write program code (i.e., software) and/or to fabricate circuits (i.e., hardware) to perform the processing required.
- It shall be noted that embodiments of the present invention may further relate to computer products with a non-transitory, tangible computer-readable medium that have computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind known or available to those having skill in the relevant arts. Examples of tangible computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher level code that are executed by a computer using an interpreter. Embodiments of the present invention may be implemented in whole or in part as machine-executable instructions that may be in program modules that are executed by a processing device. Examples of program modules include libraries, programs, routines, objects, components, and data structures. In distributed computing environments, program modules may be physically located in settings that are local, remote, or both.
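- As a purely illustrative sketch of such a program module in the context of this disclosure, the following routine compares gesture-derived measurements against a stored user profile to produce a coarse emotional-state label. The function name, feature names, profile keys, thresholds, and labels are assumptions chosen for the example and do not reproduce the claimed method.

```python
# Hypothetical program module for illustration only; the features, thresholds,
# and labels below are assumptions and do not reproduce the claimed method.
from typing import Dict


def infer_emotional_state(gesture_features: Dict[str, float],
                          baseline: Dict[str, float]) -> str:
    """Return a coarse emotional-state label from gesture features.

    gesture_features: measurements of the current gesture, e.g. {"pressure": 0.9, "speed": 1.8}.
    baseline: the same keys taken from a stored user profile of typical behavior.
    """
    # Compare observed behavior against the user's baseline profile.
    pressure_ratio = gesture_features.get("pressure", 0.0) / max(baseline.get("pressure", 1.0), 1e-6)
    speed_ratio = gesture_features.get("speed", 0.0) / max(baseline.get("speed", 1.0), 1e-6)

    if pressure_ratio > 1.5 and speed_ratio > 1.5:
        return "agitated"
    if pressure_ratio < 0.7 and speed_ratio < 0.7:
        return "calm"
    return "neutral"


if __name__ == "__main__":
    profile = {"pressure": 0.5, "speed": 1.0}        # stored user profile (assumed values)
    observed = {"pressure": 0.9, "speed": 1.8}       # current gesture measurements (assumed values)
    print(infer_emotional_state(observed, profile))  # prints "agitated" for these inputs
```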
- One skilled in the art will recognize that no computing system or programming language is critical to the practice of the present invention. One skilled in the art will also recognize that a number of the elements described above may be physically and/or functionally separated into sub-modules or combined.
- It will be appreciated by those skilled in the art that the preceding examples and embodiments are exemplary and do not limit the scope of the present invention. It is intended that all permutations, enhancements, equivalents, combinations, and improvements thereto that are apparent to those skilled in the art upon a reading of the specification and a study of the drawings are included within the true spirit and scope of the present invention.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/735,060 US10514766B2 (en) | 2015-06-09 | 2015-06-09 | Systems and methods for determining emotions based on user gestures |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/735,060 US10514766B2 (en) | 2015-06-09 | 2015-06-09 | Systems and methods for determining emotions based on user gestures |
Publications (2)
Publication Number | Publication Date |
---|---|
US20160364002A1 true US20160364002A1 (en) | 2016-12-15 |
US10514766B2 US10514766B2 (en) | 2019-12-24 |
Family
ID=57516615
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/735,060 Active 2037-06-01 US10514766B2 (en) | 2015-06-09 | 2015-06-09 | Systems and methods for determining emotions based on user gestures |
Country Status (1)
Country | Link |
---|---|
US (1) | US10514766B2 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11301760B2 (en) * | 2018-11-28 | 2022-04-12 | International Business Machines Corporation | Automated postulation thresholds in computer-based questioning |
US11902091B2 (en) * | 2020-04-29 | 2024-02-13 | Motorola Mobility Llc | Adapting a device to a user based on user emotional state |
US12205211B2 (en) * | 2021-05-05 | 2025-01-21 | Disney Enterprises, Inc. | Emotion-based sign language enhancement of content |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7539533B2 (en) | 2006-05-16 | 2009-05-26 | Bao Tran | Mesh network monitoring appliance |
US8677281B2 (en) | 2007-02-09 | 2014-03-18 | Intel-Ge Care Innovations Llc | System, apparatus and method for emotional experience time sampling via a mobile graphical user interface |
US7930676B1 (en) | 2007-04-27 | 2011-04-19 | Intuit Inc. | System and method for adapting software elements based on mood state profiling |
US20090002178A1 (en) | 2007-06-29 | 2009-01-01 | Microsoft Corporation | Dynamic mood sensing |
CA2719007A1 (en) | 2008-03-19 | 2009-09-24 | Appleseed Networks, Inc. | Method and apparatus for detecting patterns of behavior |
US10204625B2 (en) * | 2010-06-07 | 2019-02-12 | Affectiva, Inc. | Audio analysis learning using video data |
US9104231B2 (en) | 2012-09-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Mood-actuated device |
US10139937B2 (en) | 2012-10-12 | 2018-11-27 | Microsoft Technology Licensing, Llc | Multi-modal user expressions and user intensity as interactions with an application |
US20140107531A1 (en) | 2012-10-12 | 2014-04-17 | At&T Intellectual Property I, Lp | Inference of mental state using sensory data obtained from wearable sensors |
US9392463B2 (en) | 2012-12-20 | 2016-07-12 | Tarun Anand | System and method for detecting anomaly in a handheld device |
US20140191939A1 (en) | 2013-01-09 | 2014-07-10 | Microsoft Corporation | Using nonverbal communication in determining actions |
US20140278455A1 (en) * | 2013-03-14 | 2014-09-18 | Microsoft Corporation | Providing Feedback Pertaining to Communication Style |
US9799043B2 (en) | 2013-05-07 | 2017-10-24 | Yp Llc | Accredited advisor management system |
US10540348B2 (en) * | 2014-09-22 | 2020-01-21 | At&T Intellectual Property I, L.P. | Contextual inference of non-verbal expressions |
US9818126B1 (en) * | 2016-04-20 | 2017-11-14 | Deep Labs Inc. | Systems and methods for sensor data analysis through machine learning |
EP3488371A4 (en) * | 2016-07-21 | 2019-07-17 | Magic Leap, Inc. | Technique for controlling virtual image generation system using emotional states of user |
2015-06-09: US 14/735,060 patent/US10514766B2/en, status: Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5720619A (en) * | 1995-04-24 | 1998-02-24 | Fisslinger; Johannes | Interactive computer assisted multi-media biofeedback system |
US20030217123A1 (en) * | 1998-09-22 | 2003-11-20 | Anderson Robin L. | System and method for accessing and operating personal computers remotely |
US20020131331A1 (en) * | 2001-03-19 | 2002-09-19 | International Business Machines Corporation | Simplified method for setting time using a graphical representation of an analog clock face |
US20120011477A1 (en) * | 2010-07-12 | 2012-01-12 | Nokia Corporation | User interfaces |
US20130157719A1 (en) * | 2010-09-02 | 2013-06-20 | Yong Liu | Mobile terminal and transmission processing method thereof |
US20130216126A1 (en) * | 2012-02-21 | 2013-08-22 | Wistron Corporation | User emotion detection method and associated handwriting input electronic device |
US20150015509A1 (en) * | 2013-07-11 | 2015-01-15 | David H. Shanabrook | Method and system of obtaining affective state from touch screen display interactions |
US20160070440A1 (en) * | 2014-09-04 | 2016-03-10 | Microsoft Corporation | User interface with dynamic transition times |
Non-Patent Citations (4)
Title |
---|
Brownlee, "A Tour of Machine Learning Algorithms," 25 November 2013, https://machinelearningmastery.com/a-tour-of-machine-learning-algorithms/ * |
Castellano et al., "Recognising Human Emotions from Body Movement and Gesture Dynamics," ACII 2007, LNCS 4738, pp. 71–82, 2007 * |
Coutrix et al., "Identifying Emotions Expressed by Mobile Users through 2D Surface and 3D Motion Gestures," UbiComp 2012 * |
Joung et al., "Tactile Hand Gesture Recognition through Haptic Feedback for Affective Online Communication," HCI International 2011, Volume 6, LNCS 6766 * |
Cited By (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160173944A1 (en) * | 2014-12-15 | 2016-06-16 | Vessel Group, Inc. | Processing techniques in audio-visual streaming systems |
US11006176B2 (en) * | 2014-12-15 | 2021-05-11 | Verizon Digital Media Services Inc. | Processing techniques in audio-visual streaming systems |
US10070183B2 (en) * | 2014-12-15 | 2018-09-04 | Verizon Digital Media Services Inc. | Processing techniques in audio-visual streaming systems |
US20170177203A1 (en) * | 2015-12-18 | 2017-06-22 | Facebook, Inc. | Systems and methods for identifying dominant hands for users based on usage patterns |
US20170201592A1 (en) * | 2016-01-13 | 2017-07-13 | Sap Se | Contextual user experience |
US10554768B2 (en) * | 2016-01-13 | 2020-02-04 | Sap Se | Contextual user experience |
US10898797B2 (en) | 2016-10-11 | 2021-01-26 | Valve Corporation | Electronic controller with finger sensing and an adjustable hand retainer |
US11294485B2 (en) | 2016-10-11 | 2022-04-05 | Valve Corporation | Sensor fusion algorithms for a handheld controller that includes a force sensing resistor (FSR) |
US11992751B2 (en) * | 2016-10-11 | 2024-05-28 | Valve Corporation | Virtual reality hand gesture generation |
US11786809B2 (en) | 2016-10-11 | 2023-10-17 | Valve Corporation | Electronic controller with finger sensing and an adjustable hand retainer |
US11625898B2 (en) | 2016-10-11 | 2023-04-11 | Valve Corporation | Holding and releasing virtual objects |
US20190138107A1 (en) * | 2016-10-11 | 2019-05-09 | Valve Corporation | Virtual reality hand gesture generation |
US10987573B2 (en) * | 2016-10-11 | 2021-04-27 | Valve Corporation | Virtual reality hand gesture generation |
US11465041B2 (en) | 2016-10-11 | 2022-10-11 | Valve Corporation | Force sensing resistor (FSR) with polyimide substrate, systems, and methods thereof |
US20210228978A1 (en) * | 2016-10-11 | 2021-07-29 | Valve Corporation | Virtual reality hand gesture generation |
US12042718B2 (en) | 2016-10-11 | 2024-07-23 | Valve Corporation | Holding and releasing virtual objects |
US11185763B2 (en) | 2016-10-11 | 2021-11-30 | Valve Corporation | Holding and releasing virtual objects |
US10888773B2 (en) | 2016-10-11 | 2021-01-12 | Valve Corporation | Force sensing resistor (FSR) with polyimide substrate, systems, and methods thereof |
US11167213B2 (en) | 2016-10-11 | 2021-11-09 | Valve Corporation | Electronic controller with hand retainer and finger motion sensing |
US10898796B2 (en) | 2016-10-11 | 2021-01-26 | Valve Corporation | Electronic controller with finger sensing and an adjustable hand retainer |
US12186885B2 (en) | 2017-01-10 | 2025-01-07 | Intuition Robotics, Ltd | Device for performing emotional gestures to interact with a user |
WO2018132364A1 (en) * | 2017-01-10 | 2018-07-19 | Intuition Robotics, Ltd. | A method for performing emotional gestures by a device to interact with a user |
US20210141453A1 (en) * | 2017-02-23 | 2021-05-13 | Charles Robert Miller, III | Wearable user mental and contextual sensing device and system |
US20180247443A1 (en) * | 2017-02-28 | 2018-08-30 | International Business Machines Corporation | Emotional analysis and depiction in virtual reality |
US10353480B2 (en) * | 2017-04-17 | 2019-07-16 | Essential Products, Inc. | Connecting assistant device to devices |
US10355931B2 (en) | 2017-04-17 | 2019-07-16 | Essential Products, Inc. | Troubleshooting voice-enabled home setup |
US10212040B2 (en) | 2017-04-17 | 2019-02-19 | Essential Products, Inc. | Troubleshooting voice-enabled home setup |
US10176807B2 (en) | 2017-04-17 | 2019-01-08 | Essential Products, Inc. | Voice setup instructions |
US10874939B2 (en) | 2017-06-16 | 2020-12-29 | Valve Corporation | Electronic controller with finger motion sensing |
WO2019054846A1 (en) | 2017-09-18 | 2019-03-21 | Samsung Electronics Co., Ltd. | Method for dynamic interaction and electronic device thereof |
US11914787B2 (en) * | 2017-09-18 | 2024-02-27 | Samsung Electronics Co., Ltd. | Method for dynamic interaction and electronic device thereof |
US11209907B2 (en) * | 2017-09-18 | 2021-12-28 | Samsung Electronics Co., Ltd. | Method for dynamic interaction and electronic device thereof |
EP3681678A4 (en) * | 2017-09-18 | 2020-11-18 | Samsung Electronics Co., Ltd. | METHOD OF DYNAMIC INTERACTION AND ELECTRONIC DEVICE THEREFORE |
US20220147153A1 (en) * | 2017-09-18 | 2022-05-12 | Samsung Electronics Co., Ltd. | Method for dynamic interaction and electronic device thereof |
US20190094980A1 (en) * | 2017-09-18 | 2019-03-28 | Samsung Electronics Co., Ltd | Method for dynamic interaction and electronic device thereof |
CN112437909A (en) * | 2018-06-20 | 2021-03-02 | 威尔乌集团 | Virtual reality gesture generation |
KR102810896B1 (en) | 2018-06-20 | 2025-05-20 | 밸브 코포레이션 | Create virtual reality hand gestures |
US11216066B2 (en) * | 2018-11-09 | 2022-01-04 | Seiko Epson Corporation | Display device, learning device, and control method of display device |
CN109726655A (en) * | 2018-12-19 | 2019-05-07 | 平安普惠企业管理有限公司 | Customer service evaluation method, device, medium and equipment based on Emotion identification |
US11664032B2 (en) | 2019-04-16 | 2023-05-30 | At&T Intellectual Property I, L.P. | Multi-agent input coordination |
US11170783B2 (en) * | 2019-04-16 | 2021-11-09 | At&T Intellectual Property I, L.P. | Multi-agent input coordination |
CN112347774A (en) * | 2019-08-06 | 2021-02-09 | 北京搜狗科技发展有限公司 | A model determination method and device for user emotion recognition |
CN116075838A (en) * | 2019-10-15 | 2023-05-05 | 爱思唯尔股份有限公司 | System and method for predicting user emotion in SAAS application |
GB2590473A (en) * | 2019-12-19 | 2021-06-30 | Samsung Electronics Co Ltd | Method and apparatus for dynamic human-computer interaction |
GB2590473B (en) * | 2019-12-19 | 2022-07-27 | Samsung Electronics Co Ltd | Method and apparatus for dynamic human-computer interaction |
US12169591B2 (en) | 2019-12-19 | 2024-12-17 | Samsung Electronics Co., Ltd. | Method and apparatus for dynamic human-computer interaction |
US11593243B2 (en) * | 2020-12-29 | 2023-02-28 | Imperva, Inc. | Dynamic emotion detection based on user inputs |
US20220206918A1 (en) * | 2020-12-29 | 2022-06-30 | Imperva, Inc. | Dynamic emotion detection based on user inputs |
WO2022245134A1 (en) * | 2021-05-20 | 2022-11-24 | Samsung Electronics Co., Ltd. | A system and method for context resolution and purpose driven clustering in autonomous systems |
US20220407827A1 (en) * | 2021-06-17 | 2022-12-22 | Canon Kabushiki Kaisha | Information processing system, control method thereof, and non-transitory computer-readable storage medium |
US12101283B2 (en) * | 2021-06-17 | 2024-09-24 | Canon Kabushiki Kaisha | Information processing system, control method thereof, and non-transitory computer-readable storage medium |
WO2024001599A1 (en) * | 2022-06-30 | 2024-01-04 | 支付宝(杭州)信息技术有限公司 | Method, system and apparatus for predicting age range or gender of user, and medium |
CN115081334A (en) * | 2022-06-30 | 2022-09-20 | 支付宝(杭州)信息技术有限公司 | Method, system, apparatus and medium for predicting a user's age group or gender |
WO2024170162A1 (en) * | 2023-02-15 | 2024-08-22 | British Telecommunications Public Limited Company | Emotional state detection |
CN116360666A (en) * | 2023-05-31 | 2023-06-30 | Tcl通讯科技(成都)有限公司 | Page sliding method and device, electronic equipment and computer storage medium |
Also Published As
Publication number | Publication date |
---|---|
US10514766B2 (en) | 2019-12-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10514766B2 (en) | Systems and methods for determining emotions based on user gestures | |
US12118999B2 (en) | Reducing the need for manual start/end-pointing and trigger phrases | |
US11389084B2 (en) | Electronic device and method of controlling same | |
EP2730223B1 (en) | Apparatus and method for determining user's mental state | |
EP2778843B1 (en) | Automatic haptic effect adjustment system | |
Buriro et al. | Itsme: Multi-modal and unobtrusive behavioural user authentication for smartphones | |
US9299350B1 (en) | Systems and methods for identifying users of devices and customizing devices to users | |
US8793134B2 (en) | System and method for integrating gesture and sound for controlling device | |
KR102423298B1 (en) | Method for operating speech recognition service, electronic device and system supporting the same | |
EP3693958A1 (en) | Electronic apparatus and control method thereof | |
KR20170080672A (en) | Augmentation of key phrase user recognition | |
CN110737339B (en) | Visual-tactile interaction model construction method based on deep learning | |
Blanco-Gonzalo et al. | Automatic usability and stress analysis in mobile biometrics | |
US11594149B1 (en) | Speech fluency evaluation and feedback | |
Trong et al. | Recognizing hand gestures for controlling home appliances with mobile sensors | |
KR101567154B1 (en) | Method for processing dialogue based on multiple user and apparatus for performing the same | |
Krell et al. | Fusion of fragmentary classifier decisions for affective state recognition | |
CN105100875B (en) | Control method and device for recording multimedia information | |
Seipp et al. | BackPat: one-handed off-screen patting gestures | |
WO2016014597A2 (en) | Translating emotions into electronic representations | |
Putze et al. | Design and evaluation of a self-correcting gesture interface based on error potentials from EEG | |
US10008206B2 (en) | Verifying a user | |
KR20190101100A (en) | Voice input processing method and electronic device supportingthe same | |
Bakhtiyari et al. | Implementation of emotional-aware computer systems using typical input devices | |
KR101605848B1 (en) | Method and apparatus for analyzing speech recognition performance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: DELL PRODUCTS L.P., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GATES, CARRIE ELAINE;SILBERMAN, GABRIEL MAURICIO;SIGNING DATES FROM 20150603 TO 20150604;REEL/FRAME:035814/0466 |
|
AS | Assignment |
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT, TEXAS Free format text: SUPPLEMENT TO PATENT SECURITY AGREEMENT (NOTES);ASSIGNORS:DELL PRODUCTS L.P.;DELL SOFTWARE INC.;WYSE TECHNOLOGY L.L.C.;REEL/FRAME:036502/0291 Effective date: 20150825 Owner name: BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT, NORTH CAROLINA Free format text: SUPPLEMENT TO PATENT SECURITY AGREEMENT (ABL);ASSIGNORS:DELL PRODUCTS L.P.;DELL SOFTWARE INC.;WYSE TECHNOLOGY, L.L.C.;REEL/FRAME:036502/0206 Effective date: 20150825 Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: SUPPLEMENT TO PATENT SECURITY AGREEMENT (TERM LOAN);ASSIGNORS:DELL PRODUCTS L.P.;DELL SOFTWARE INC.;WYSE TECHNOLOGY L.L.C.;REEL/FRAME:036502/0237 Effective date: 20150825 Owner name: BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT, NO Free format text: SUPPLEMENT TO PATENT SECURITY AGREEMENT (ABL);ASSIGNORS:DELL PRODUCTS L.P.;DELL SOFTWARE INC.;WYSE TECHNOLOGY, L.L.C.;REEL/FRAME:036502/0206 Effective date: 20150825 Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., A Free format text: SUPPLEMENT TO PATENT SECURITY AGREEMENT (NOTES);ASSIGNORS:DELL PRODUCTS L.P.;DELL SOFTWARE INC.;WYSE TECHNOLOGY L.L.C.;REEL/FRAME:036502/0291 Effective date: 20150825 Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH Free format text: SUPPLEMENT TO PATENT SECURITY AGREEMENT (TERM LOAN);ASSIGNORS:DELL PRODUCTS L.P.;DELL SOFTWARE INC.;WYSE TECHNOLOGY L.L.C.;REEL/FRAME:036502/0237 Effective date: 20150825 |
|
AS | Assignment |
Owner name: WYSE TECHNOLOGY L.L.C., CALIFORNIA Free format text: RELEASE OF REEL 036502 FRAME 0206 (ABL);ASSIGNOR:BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:040017/0204 Effective date: 20160907 Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF REEL 036502 FRAME 0206 (ABL);ASSIGNOR:BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:040017/0204 Effective date: 20160907 Owner name: DELL SOFTWARE INC., CALIFORNIA Free format text: RELEASE OF REEL 036502 FRAME 0206 (ABL);ASSIGNOR:BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:040017/0204 Effective date: 20160907 |
|
AS | Assignment |
Owner name: DELL SOFTWARE INC., CALIFORNIA Free format text: RELEASE OF REEL 036502 FRAME 0291 (NOTE);ASSIGNOR:BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT;REEL/FRAME:040027/0637 Effective date: 20160907 Owner name: WYSE TECHNOLOGY L.L.C., CALIFORNIA Free format text: RELEASE OF REEL 036502 FRAME 0237 (TL);ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:040028/0088 Effective date: 20160907 Owner name: DELL SOFTWARE INC., CALIFORNIA Free format text: RELEASE OF REEL 036502 FRAME 0237 (TL);ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:040028/0088 Effective date: 20160907 Owner name: WYSE TECHNOLOGY L.L.C., CALIFORNIA Free format text: RELEASE OF REEL 036502 FRAME 0291 (NOTE);ASSIGNOR:BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT;REEL/FRAME:040027/0637 Effective date: 20160907 Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF REEL 036502 FRAME 0237 (TL);ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:040028/0088 Effective date: 20160907 Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF REEL 036502 FRAME 0291 (NOTE);ASSIGNOR:BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT;REEL/FRAME:040027/0637 Effective date: 20160907 |
|
AS | Assignment |
Owner name: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, AS COLLATERAL AGENT, NORTH CAROLINA Free format text: SECURITY AGREEMENT;ASSIGNORS:ASAP SOFTWARE EXPRESS, INC.;AVENTAIL LLC;CREDANT TECHNOLOGIES, INC.;AND OTHERS;REEL/FRAME:040134/0001 Effective date: 20160907 Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT, TEXAS Free format text: SECURITY AGREEMENT;ASSIGNORS:ASAP SOFTWARE EXPRESS, INC.;AVENTAIL LLC;CREDANT TECHNOLOGIES, INC.;AND OTHERS;REEL/FRAME:040136/0001 Effective date: 20160907 Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., A Free format text: SECURITY AGREEMENT;ASSIGNORS:ASAP SOFTWARE EXPRESS, INC.;AVENTAIL LLC;CREDANT TECHNOLOGIES, INC.;AND OTHERS;REEL/FRAME:040136/0001 Effective date: 20160907 Owner name: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, AS COLLAT Free format text: SECURITY AGREEMENT;ASSIGNORS:ASAP SOFTWARE EXPRESS, INC.;AVENTAIL LLC;CREDANT TECHNOLOGIES, INC.;AND OTHERS;REEL/FRAME:040134/0001 Effective date: 20160907 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., T Free format text: SECURITY AGREEMENT;ASSIGNORS:CREDANT TECHNOLOGIES, INC.;DELL INTERNATIONAL L.L.C.;DELL MARKETING L.P.;AND OTHERS;REEL/FRAME:049452/0223 Effective date: 20190320 Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., TEXAS Free format text: SECURITY AGREEMENT;ASSIGNORS:CREDANT TECHNOLOGIES, INC.;DELL INTERNATIONAL L.L.C.;DELL MARKETING L.P.;AND OTHERS;REEL/FRAME:049452/0223 Effective date: 20190320 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., TEXAS Free format text: SECURITY AGREEMENT;ASSIGNORS:CREDANT TECHNOLOGIES INC.;DELL INTERNATIONAL L.L.C.;DELL MARKETING L.P.;AND OTHERS;REEL/FRAME:053546/0001 Effective date: 20200409 |
|
AS | Assignment |
Owner name: WYSE TECHNOLOGY L.L.C., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: SCALEIO LLC, MASSACHUSETTS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: MOZY, INC., WASHINGTON Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: MAGINATICS LLC, CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: FORCE10 NETWORKS, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: EMC IP HOLDING COMPANY LLC, TEXAS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: EMC CORPORATION, MASSACHUSETTS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: DELL SYSTEMS CORPORATION, TEXAS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: DELL SOFTWARE INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: DELL MARKETING L.P., TEXAS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: DELL INTERNATIONAL, L.L.C., TEXAS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: DELL USA L.P., TEXAS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: CREDANT TECHNOLOGIES, INC., TEXAS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: AVENTAIL LLC, CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 Owner name: ASAP SOFTWARE EXPRESS, INC., ILLINOIS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058216/0001 Effective date: 20211101 |
|
AS | Assignment |
Owner name: SCALEIO LLC, MASSACHUSETTS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 Owner name: EMC IP HOLDING COMPANY LLC (ON BEHALF OF ITSELF AND AS SUCCESSOR-IN-INTEREST TO MOZY, INC.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 Owner name: EMC CORPORATION (ON BEHALF OF ITSELF AND AS SUCCESSOR-IN-INTEREST TO MAGINATICS LLC), MASSACHUSETTS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 Owner name: DELL MARKETING CORPORATION (SUCCESSOR-IN-INTEREST TO FORCE10 NETWORKS, INC. AND WYSE TECHNOLOGY L.L.C.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 Owner name: DELL INTERNATIONAL L.L.C., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 Owner name: DELL USA L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 Owner name: DELL MARKETING L.P. (ON BEHALF OF ITSELF AND AS SUCCESSOR-IN-INTEREST TO CREDANT TECHNOLOGIES, INC.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 Owner name: DELL MARKETING CORPORATION (SUCCESSOR-IN-INTEREST TO ASAP SOFTWARE EXPRESS, INC.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (040136/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061324/0001 Effective date: 20220329 |
|
AS | Assignment |
Owner name: SCALEIO LLC, MASSACHUSETTS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 Owner name: EMC IP HOLDING COMPANY LLC (ON BEHALF OF ITSELF AND AS SUCCESSOR-IN-INTEREST TO MOZY, INC.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 Owner name: EMC CORPORATION (ON BEHALF OF ITSELF AND AS SUCCESSOR-IN-INTEREST TO MAGINATICS LLC), MASSACHUSETTS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 Owner name: DELL MARKETING CORPORATION (SUCCESSOR-IN-INTEREST TO FORCE10 NETWORKS, INC. AND WYSE TECHNOLOGY L.L.C.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 Owner name: DELL INTERNATIONAL L.L.C., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 Owner name: DELL USA L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 Owner name: DELL MARKETING L.P. (ON BEHALF OF ITSELF AND AS SUCCESSOR-IN-INTEREST TO CREDANT TECHNOLOGIES, INC.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 Owner name: DELL MARKETING CORPORATION (SUCCESSOR-IN-INTEREST TO ASAP SOFTWARE EXPRESS, INC.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (045455/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:061753/0001 Effective date: 20220329 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |